WO2023213252A1 - Scan data processing method, device, equipment and medium - Google Patents

Scan data processing method, device, equipment and medium Download PDF

Info

Publication number
WO2023213252A1
WO2023213252A1 · PCT/CN2023/091806 · CN2023091806W
Authority
WO
WIPO (PCT)
Prior art keywords
dimensional coordinate
processed
points
target
coordinate points
Prior art date
Application number
PCT/CN2023/091806
Other languages
English (en)
French (fr)
Inventor
张远松
陈晓军
张健
江腾飞
赵晓波
Original Assignee
先临三维科技股份有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 先临三维科技股份有限公司
Publication of WO2023213252A1

Links

Classifications

    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61CDENTISTRY; APPARATUS OR METHODS FOR ORAL OR DENTAL HYGIENE
    • A61C19/00Dental auxiliary appliances
    • A61C19/04Measuring instruments specially adapted for dentistry

Definitions

  • the present disclosure relates to the technical field of intraoral scanning, and in particular, to a scanning data processing method, device, equipment and medium.
  • the implant position is determined by scanning with a scanning rod.
  • The technical problem to be solved by the present disclosure is that existing intraoral data scanning usually relies on multi-data splicing schemes, whose accumulated errors ultimately leave the overall accuracy of the model low.
  • To solve the above technical problem, embodiments of the present disclosure provide a scan data processing method, device, equipment and medium, including:
  • In a first aspect, a scan data processing method includes: acquiring multiple frames of images to be processed that include auxiliary feature points; processing based on the multiple frames of images to obtain all three-dimensional coordinate points in the same coordinate system; performing measurement processing on all the three-dimensional coordinate points to obtain target three-dimensional coordinate points; and determining a target position transformation relationship based on the target three-dimensional coordinate points and the distribution true values.
  • In a second aspect, a scan data processing device includes:
  • an image acquisition module, configured to acquire multiple frames of images to be processed that include auxiliary feature points, wherein the auxiliary feature points have corresponding distribution true values;
  • an image processing module, configured to process based on the multiple frames of images to be processed to obtain all three-dimensional coordinate points in the same coordinate system;
  • a measurement processing module, configured to perform measurement processing on all the three-dimensional coordinate points to obtain target three-dimensional coordinate points;
  • a determination module, configured to determine the target position transformation relationship based on the target three-dimensional coordinate points and the distribution true values.
  • In a third aspect, an electronic device includes: a processor; and a memory for storing instructions executable by the processor; the processor being configured to read the executable instructions from the memory and execute them to implement the scan data processing method provided by the embodiments of the present disclosure.
  • In a fourth aspect, a computer storage medium stores a computer program, and the computer program is used to execute the scan data processing method provided in the first aspect of the present disclosure.
  • The scan data processing scheme of the embodiments of the present disclosure acquires multiple frames of images to be processed that include auxiliary feature points, wherein the auxiliary feature points have corresponding distribution true values; processes based on the multiple frames of images to obtain all three-dimensional coordinate points in the same coordinate system; performs measurement processing on all the three-dimensional coordinate points to obtain target three-dimensional coordinate points; and determines the target position transformation relationship based on the target three-dimensional coordinate points and the distribution true values. With this technical solution, the target position transformation relationship is determined from the preset distribution true values of the auxiliary feature points and the three-dimensional coordinate points computed from the scan, so that the relative positions between the scanning rods can be located accurately, improving the efficiency and accuracy of data processing in intraoral scanning scenarios.
  • Figure 1 is an application scenario diagram of scanning data processing provided by an embodiment of the present disclosure
  • Figure 2 is a schematic flowchart of a scan data processing method provided by an embodiment of the present disclosure
  • FIG. 3 is a schematic flowchart of another scan data processing method provided by an embodiment of the present disclosure.
  • Figure 4 is a schematic flowchart of another scan data processing method provided by an embodiment of the present disclosure.
  • Figure 5 is a schematic structural diagram of a scan data processing device provided by an embodiment of the present disclosure.
  • FIG. 6 is a schematic structural diagram of an electronic device provided by an embodiment of the present disclosure.
  • FIG. 1 is an application scenario diagram of scanning data processing provided by an embodiment of the present disclosure.
  • The application environment includes multiple intraoral scanning rods installed in a target oral cavity.
  • An intraoral scanning rod includes a scanning rod component 11 and an auxiliary component 12 connected to the scanning rod component 11. Auxiliary feature points are provided on the scanning rod component 11 and/or the auxiliary component 12, where the shape features of the auxiliary component 12 itself may serve as auxiliary features. The scanning rod component 11 is adapted to an implant installed in the target oral cavity; by fitting the scanning rod component 11 onto the implant, the intraoral scanning rod is installed in the target oral cavity.
  • The auxiliary components 12 of any two of the multiple intraoral scanning rods are adapted to each other, so that when any two intraoral scanning rods 10 are installed adjacently in the oral cavity, the auxiliary feature points on the two auxiliary components are continuously distributed. The true-value coordinate points of the auxiliary feature points can be obtained in advance through, for example, an SLR photogrammetry system or a coordinate measuring machine.
  • In theory, the three-dimensional coordinate points corresponding to the scanned images correspond one-to-one to the pre-obtained true-value coordinate points of the auxiliary feature points.
  • The intraoral scanning rod includes a scanning rod component for connecting to the implant and an auxiliary component connected to the scanning rod.
  • The intraoral scanning rod is provided with target features; the target features are continuously distributed on the scanning rod and/or the auxiliary component, and are not distributed on only a single face of the scanning rod and/or the auxiliary component.
  • the intraoral scanner scans the target oral cavity, acquires multiple frames of images, and transmits them to the data processing module for data processing.
  • The data processing module performs the following method:
  • The initial three-dimensional data includes an initial point set of the target oral cavity and three-dimensional coordinate measurement values of the target features in the same coordinate system.
  • The preset model includes the true three-dimensional coordinate values of the target features in the same coordinate system and a true point set of the intraoral scanning rod (the true three-dimensional coordinate values of each point).
  • The initial point set of the target oral cavity and the true point set of the intraoral scanning rod are spliced.
  • The positioning information of the intraoral scanning rod is determined based on the spliced true point set of the intraoral scanning rod.
  • The positioning information of the intraoral scanning rod is the positioning information of the implant. Tooth design is carried out based on this positioning information, so that the designed tooth can be fitted and installed on the implant.
  • The auxiliary feature points have corresponding distribution true values.
  • Processing is performed based on the multiple frames of images to be processed, and all three-dimensional coordinate points in the same coordinate system are obtained.
  • The three-dimensional coordinate points are measured and processed to obtain the target three-dimensional coordinate points.
  • The target position transformation relationship is determined from the preset distribution true values of the auxiliary feature points and the three-dimensional coordinate points computed from the scan, so that the relative positions between the scanning rods can be located accurately, and the efficiency and accuracy of data processing in intraoral scanning scenarios can be improved.
  • FIG. 2 is a schematic flowchart of a scan data processing method provided by an embodiment of the present disclosure.
  • The method can be executed by a scan data processing device, which can be implemented in software and/or hardware and can generally be integrated in an electronic device. As shown in Figure 2, the method includes:
  • Step 101: Obtain multiple frames of images to be processed that include auxiliary feature points, wherein the auxiliary feature points have corresponding distribution true values.
  • The scanning rod includes a scanning rod main body and an auxiliary feature body. Auxiliary feature points may be provided on the scanning rod main body; or there may be no auxiliary feature points on the main body while the auxiliary feature body between scanning rod main bodies carries them; or the shape of the auxiliary feature body itself serves as the auxiliary feature points.
  • the target oral cavity refers to the oral cavity that requires dental implantation.
  • Intraoral scanning is required to locate the implantation points in the oral cavity.
  • The auxiliary components of any two of the multiple intraoral scanning rods are installed adjacently in the target oral cavity.
  • The auxiliary feature points on the two auxiliary components are continuously distributed, so that the target oral cavity can be scanned intraorally and multiple frames of images to be processed can be obtained.
  • The target oral cavity can be scanned with a handheld intraoral scanner (with a monocular or binocular camera); that is, the multiple frames of images to be processed are obtained by taking photographs, for example dozens of frames per second, collected cyclically.
  • The scanning rod is a feature object containing auxiliary feature points, where an auxiliary feature point can uniquely identify a feature; that is, the scanning rod is provided with auxiliary feature points, and each auxiliary feature point can uniquely identify the corresponding position feature on the scanning rod.
  • For example, target feature a can uniquely identify the position feature of position 1 on the scanning rod, and target feature b can uniquely identify the position feature of position 2 on the scanning rod.
  • The distribution true values of the auxiliary feature points, i.e. their coordinate values, can be obtained through higher-precision equipment, such as an SLR photogrammetry system or a coordinate measuring machine, or taken from the computer-aided design of the auxiliary feature points.
  • Each image to be processed acquired by scanning includes at least a preset number of auxiliary feature points to represent the continuity of the distribution of auxiliary feature points, thereby ensuring subsequent calculation accuracy; the preset number can be set according to the application scenario.
  • For example, if the preset number is 3 and a scanned image to be processed includes 2 auxiliary feature points, the distribution of auxiliary feature points is discontinuous.
  • If a scanned image to be processed includes 3 or more auxiliary feature points, the auxiliary feature points are continuously distributed.
  • There are many ways to obtain the multiple frames of images to be processed.
  • In some implementations, the multiple frames of images to be processed are obtained by controlling a monocular camera to rotate in a certain direction while scanning the target oral cavity at a certain frequency.
  • In other implementations, the multiple frames of images to be processed are obtained by controlling a binocular camera to scan the target oral cavity cyclically.
  • The above two methods are only examples of obtaining multiple frames of images to be processed, and the embodiments of the present disclosure do not limit the specific methods of obtaining them.
  • After the scanning rods are attached, the target oral cavity including the scanning rods is scanned to obtain the multiple frames of images to be processed.
  • Step 102: Process based on the multiple frames of images to be processed to obtain all three-dimensional coordinate points in the same coordinate system.
  • All three-dimensional coordinate points refer to the three-dimensional coordinate points corresponding to all auxiliary feature points in the target oral cavity.
  • In some implementations, the two-dimensional coordinate points of the auxiliary feature points in each frame of the images to be processed are obtained; three-dimensional reconstruction is performed based on a preset intrinsic parameter matrix and the two-dimensional coordinate points to obtain the three-dimensional coordinate points corresponding to the auxiliary feature points of each frame; and the three-dimensional coordinate points corresponding to the auxiliary feature points of the multiple frames are spliced to obtain the three-dimensional coordinate points of the auxiliary feature points of the multiple frames in the same coordinate system.
  • In other implementations, the two-dimensional coordinate points of the auxiliary feature points in each frame are obtained, and three-dimensional reconstruction is performed based on the relative position between the two cameras and the two-dimensional coordinate points to obtain the three-dimensional coordinate points corresponding to the auxiliary feature points of each frame.
  • The corresponding three-dimensional coordinate points are spliced to obtain the three-dimensional coordinate points of the auxiliary feature points of the multiple frames in the same coordinate system.
  • Processing can thus be performed based on the multiple frames of images to be processed to obtain the three-dimensional coordinate points in the same coordinate system.
  • Step 103: Perform measurement processing on the three-dimensional coordinate points to obtain the target three-dimensional coordinate points.
  • The target three-dimensional coordinate points are the three-dimensional coordinate points after measurement processing, which reflect the three-dimensional coordinate points of the auxiliary feature points more accurately.
  • In some implementations, each three-dimensional coordinate point is projected onto the image coordinate system to obtain a two-dimensional pixel coordinate point; when the Euclidean distance between the two-dimensional pixel coordinate point and the two-dimensional coordinate point corresponding to the three-dimensional coordinate point is minimized, the three-dimensional coordinate point is taken as a target three-dimensional coordinate point.
  • In other implementations, the N-th frame image in which a three-dimensional coordinate point appears is obtained, the two-dimensional pixel coordinate points of the N-th frame image are obtained, and the target three-dimensional coordinate point is determined based on the distance between the two-dimensional coordinate point projected from the three-dimensional coordinate point and the two-dimensional pixel coordinate point.
  • All three-dimensional coordinate points can thus be measured and processed to obtain the target three-dimensional coordinate points.
  • Step 104: Determine the target position transformation relationship based on the target three-dimensional coordinate points and the distribution true values.
  • The target position transformation relationship refers to the transformation matrix that converts the scanned target three-dimensional coordinate points to the corresponding designed distribution true values.
  • In some implementations, an initial position transformation relationship is calculated based on a preset scaling factor, the target three-dimensional coordinate points, and the distribution true values; an optimization value is obtained by optimizing the scaling factor, the target three-dimensional coordinate points, the distribution true values, and the initial position transformation relationship with a preset optimization formula; the scaling factor and the initial position transformation relationship are then adjusted, and the initial position transformation relationship corresponding to an optimization value smaller than a preset threshold is taken as the target position transformation relationship.
  • In other implementations, the position transformation relationship between each target three-dimensional coordinate point and the distribution true value is calculated, and the target position transformation relationship is obtained based on the multiple position transformation relationships.
  • The above two methods are only examples of determining the target position transformation relationship based on the target three-dimensional coordinate points and the distribution true values.
  • The embodiments of the present disclosure do not limit the specific methods of determining the target position transformation relationship based on the target three-dimensional coordinate points and the distribution true values.
  • The scan data processing scheme obtains multiple frames of images to be processed, where each frame includes auxiliary feature points and the auxiliary feature points have corresponding distribution true values; processes the multiple frames to obtain all three-dimensional coordinate points; performs measurement processing on all the three-dimensional coordinate points to obtain the target three-dimensional coordinate points; and determines the target position transformation relationship based on the target three-dimensional coordinate points and the distribution true values. With this technical solution, the target position transformation relationship is determined from the preset distribution true values of the auxiliary feature points and the three-dimensional coordinate points computed from the scan, so that the relative positions between the scanning rods can be located accurately, improving the efficiency and accuracy of data processing in intraoral scanning scenarios.
  • Scanning can be performed by a monocular camera or a binocular camera.
  • The monocular and binocular cases are described in detail below in conjunction with FIG. 3 and FIG. 4.
  • FIG. 3 is a schematic flowchart of another scan data processing method provided by an embodiment of the present disclosure. On the basis of the above embodiments, this embodiment further optimizes the above scan data processing method. As shown in Figure 3, the method includes:
  • Step 201: Control the scanning device to rotate in a preset direction while scanning the target oral cavity including the scanning rod at a preset frequency to obtain multiple frames of images to be processed, wherein the auxiliary feature points have corresponding distribution true values.
  • Step 202: Obtain the two-dimensional coordinate points of the auxiliary feature points in each frame of the images to be processed, calculate based on a preset intrinsic parameter matrix and the two-dimensional coordinate points to obtain the three-dimensional coordinate points of each frame, and splice the three-dimensional coordinate points to obtain all three-dimensional coordinate points in the same coordinate system.
  • Multiple frames of images to be processed are acquired by scanning (for example, triggered by hardware, dozens of frames are collected per second, cyclically). Based on the images to be processed, the two-dimensional coordinate points of the auxiliary feature points in each frame are obtained (that is, image recognition extracts the pixel coordinates of the auxiliary feature points in each frame). Then, based on the camera intrinsic parameters (such as focal length, principal point, skew factor, and lens distortion) and the texture-tracking pose (for example, splicing adjacent frames by acquiring the geometric and texture information of the scanning rod surface), the reconstruction of the auxiliary feature points of the current frame is completed. The reconstructed three-dimensional coordinate points are based on the camera coordinate system of the first frame of the scan (the first-frame camera coordinate system is determined by the first frame image, and each subsequent frame is spliced with the first frame, so the coordinate system remains the first-frame camera coordinate system; image recognition yields two-dimensional coordinate points, which must be converted into the camera coordinate system to obtain three-dimensional coordinate points).
  • The intrinsic parameter matrix refers to the matrix composed of the camera intrinsic parameters, so that the two-dimensional coordinate points are back-projected into the camera coordinate system based on the intrinsic parameter matrix to obtain the three-dimensional coordinate points.
  • The three-dimensional coordinate points of each frame of the images to be processed are spliced to obtain all the three-dimensional coordinate points.
  • The different coordinate systems of two frames can be aligned through their common parts and thereby spliced together to obtain all the three-dimensional coordinate points.
  • That is, all three-dimensional coordinate points reconstructed from single frames are fused into the overall framework points, i.e. all three-dimensional coordinate points, according to distance constraints. All three-dimensional coordinate points are based on the overall tooth model coordinate system (the world coordinate system).
  • Step 203: Project each three-dimensional coordinate point onto the image coordinate system to obtain a two-dimensional pixel coordinate point.
  • When the Euclidean distance between the two-dimensional pixel coordinate point and the two-dimensional coordinate point corresponding to the three-dimensional coordinate point is minimized, the three-dimensional coordinate point is taken as a target three-dimensional coordinate point.
  • Suppose the K-th auxiliary feature point p3D_k appears in the i-th frame image and corresponds to the j-th two-dimensional coordinate point p2D_j on the texture map of the i-th frame (a black-and-white image and a texture map with color information can be triggered and captured at the same moment).
  • Photogrammetric optimization is based on bundle adjustment theory and least squares theory.
  • Optimization objective: all actually scanned auxiliary feature points are projected (the projection function is defined as Π), according to the image frames in which they appear and the camera poses T_i, onto the image captured in each frame.
  • The Euclidean distance between the resulting two-dimensional pixel coordinates and the corresponding two-dimensional coordinate points of the auxiliary feature points is minimized, as shown in formula (1):

    min Σ_{i=1..n} Σ_{k=1..m} ‖Π(T_i · p3D_k) − p2D_j‖²    (1)

  • where m is the number of three-dimensional coordinate points and n is the number of images to be processed.
  • Step 204: Calculate based on the preset scaling factor, the target three-dimensional coordinate points, and the distribution true values to obtain an initial position transformation relationship, and perform optimization calculation on the scaling factor, the target three-dimensional coordinate points, the distribution true values, and the initial position transformation relationship based on a preset optimization formula to obtain an optimization value.
  • Step 205: Adjust the scaling factor and the initial position transformation relationship, and take the initial position transformation relationship corresponding to an optimization value smaller than the preset threshold as the target position transformation relationship.
  • The distribution true values of the auxiliary feature points on the scanning rod, or on the auxiliary feature body connected to the scanning rod, are obtained, for example, with an SLR photogrammetry system or a two-dimensional image measuring instrument.
  • The distribution true values of the auxiliary feature points of each small region are spliced with scaling (the splicing is a rigid-body operation and does not change scale; when the scale changes, the scaling factor S_i is introduced into the calculation), thereby determining the target position transformation relationships of the different scanning rods.
  • The scan data processing scheme obtains multiple frames of images to be processed that include auxiliary feature points, wherein the auxiliary feature points have corresponding distribution true values; obtains the two-dimensional coordinate points of the auxiliary feature points in each frame; calculates based on the preset intrinsic parameter matrix and the two-dimensional coordinate points to obtain the three-dimensional coordinate points of each frame; and splices the three-dimensional coordinate points of each frame to obtain all three-dimensional coordinate points in the same coordinate system. Each three-dimensional coordinate point is projected onto the image coordinate system to obtain a two-dimensional pixel coordinate point.
  • When the Euclidean distance between the two-dimensional pixel coordinate point and the corresponding two-dimensional coordinate point is minimized, the three-dimensional coordinate point is taken as a target three-dimensional coordinate point.
  • An initial position transformation relationship is calculated based on the preset scaling factor, the target three-dimensional coordinate points, and the distribution true values.
  • Based on the preset optimization formula, the scaling factor, the target three-dimensional coordinate points, the distribution true values, and the initial position transformation relationship are optimized to obtain an optimization value; the scaling factor and the initial position transformation relationship are adjusted, and the initial position transformation relationship corresponding to an optimization value smaller than the preset threshold is taken as the target position transformation relationship.
  • The target position transformation relationship is thus determined from the preset distribution true values of the auxiliary feature points and the three-dimensional coordinate points computed from the scan, so that the relative positions between the scanning rods can be located accurately, and the efficiency and accuracy of data processing in intraoral scanning scenarios can be improved.
  • FIG. 4 is a schematic flowchart of yet another scan data processing method provided by an embodiment of the present disclosure. On the basis of the above embodiments, this embodiment further optimizes the above scan data processing method. As shown in Figure 4, the method includes:
  • Step 301: Control the scanning device to rotate in a preset direction while scanning the target oral cavity including the scanning rod at a preset frequency to obtain multiple frames of images to be processed, wherein the auxiliary feature points have corresponding distribution true values.
  • Step 302: Obtain the two-dimensional coordinate points of the auxiliary feature points in each frame of the images to be processed, calculate based on the relative position between the two cameras and the two-dimensional coordinate points to obtain the three-dimensional coordinate points of each frame, and splice the three-dimensional coordinate points of each frame to obtain all three-dimensional coordinate points in the same coordinate system.
  • The images to be processed are acquired by scanning. Based on the images to be processed, the two-dimensional coordinate points of the auxiliary feature points in each frame are obtained; based on the relative position between the two cameras, the three-dimensional coordinate points of the auxiliary feature points of a single frame are reconstructed. The reconstructed three-dimensional coordinate points are based on the left camera coordinate system of the current frame (the reconstructed depth information is usually based on the left camera coordinate system).
  • According to the three-dimensional coordinate points of the auxiliary feature points reconstructed in each frame, and based on the distance distribution information of the auxiliary feature points of adjacent frames (distances can simply be computed between pairs of auxiliary feature point coordinates), the same auxiliary feature point captured in adjacent frames is searched for, the auxiliary feature point splicing is completed, and all three-dimensional coordinate points are obtained. That is, all three-dimensional coordinate points reconstructed from single frames are fused into the overall framework points, i.e. all three-dimensional coordinate points, according to distance constraints. All three-dimensional coordinate points are based on the overall tooth model coordinate system (the world coordinate system).
  • Step 303: Project each three-dimensional coordinate point onto the image coordinate system to obtain a two-dimensional pixel coordinate point.
  • When the Euclidean distance between the two-dimensional pixel coordinate point and the two-dimensional coordinate point corresponding to the three-dimensional coordinate point is minimized, the three-dimensional coordinate point is taken as a target three-dimensional coordinate point.
  • Suppose the K-th auxiliary feature point p3D_k appears in the i-th frame, the texture maps of the i-th frame correspond to the j-th two-dimensional points p2DL_j and p2DR_j, and the rigid-body transformation between the two cameras is T_RL.
  • Photogrammetric optimization is based on bundle adjustment theory. Optimization objective: all actually scanned auxiliary feature points are projected (the projection function is defined as Π), according to the image frames in which they appear and the camera poses T_i, onto the image captured in each frame.
  • The Euclidean distance between the two-dimensional pixel coordinates and the corresponding two-dimensional coordinate points of the auxiliary feature points is minimized, as shown in formula (3):

    min Σ_{i=1..n} Σ_{k=1..m} ( ‖Π(T_i · p3D_k) − p2DL_j‖² + ‖Π(T_RL · T_i · p3D_k) − p2DR_j‖² )    (3)

  • where m is the number of three-dimensional coordinate points and n is the number of images to be processed.
  • Step 304: Calculate based on the preset scaling factor, the target three-dimensional coordinate points, and the distribution true values to obtain an initial position transformation relationship, and perform optimization calculation on the scaling factor, the target three-dimensional coordinate points, the distribution true values, and the initial position transformation relationship based on a preset optimization formula to obtain an optimization value.
  • Step 305: Adjust the scaling factor and the initial position transformation relationship, and take the initial position transformation relationship corresponding to an optimization value smaller than the preset threshold as the target position transformation relationship.
  • The distribution true values of the auxiliary feature points on the scanning rod, or on the auxiliary feature body connected to the scanning rod, are obtained, for example, with an SLR photogrammetry system or a two-dimensional image measuring instrument.
  • The photogrammetrically optimized target three-dimensional coordinate points P are spliced, with scaling, against the distribution true values of the auxiliary feature points of each region (each scanning rod can be regarded as a small region); the splicing is a rigid-body operation and does not change scale.
  • When the scale changes, the scaling factor S_i is introduced into the calculation, thereby determining the target position transformation relationships of the different scanning rods.
  • The scan data processing scheme obtains multiple frames of images to be processed that include auxiliary feature points, wherein the auxiliary feature points have corresponding distribution true values; obtains the two-dimensional coordinate points of the auxiliary feature points in each frame; and calculates based on the relative position between the two cameras and the two-dimensional coordinate points to obtain the three-dimensional coordinate points of each frame.
  • The three-dimensional coordinate points of each frame are spliced to obtain all three-dimensional coordinate points in the same coordinate system, and each three-dimensional coordinate point is projected onto the image coordinate system to obtain a two-dimensional pixel coordinate point.
  • When the Euclidean distance is minimized, the three-dimensional coordinate point is taken as a target three-dimensional coordinate point; an initial position transformation relationship is calculated based on the preset scaling factor, the target three-dimensional coordinate points, and the distribution true values; based on the preset optimization formula, the scaling factor, the target three-dimensional coordinate points, the distribution true values, and the initial position transformation relationship are optimized to obtain an optimization value; the scaling factor and the initial position transformation relationship are adjusted, and the initial position transformation relationship corresponding to an optimization value smaller than the preset threshold is taken as the target position transformation relationship.
  • The target position transformation relationship is thus determined from the preset distribution true values of the auxiliary feature points and the three-dimensional coordinate points computed from the scan, so that the relative positions between the scanning rods can be located accurately, and the efficiency and accuracy of data processing in intraoral scanning scenarios can be improved.
  • FIG. 5 is a schematic structural diagram of a scan data processing device provided by an embodiment of the present disclosure.
  • The device can be implemented in software and/or hardware and can generally be integrated in an electronic device. As shown in Figure 5, the device includes:
  • the image acquisition module 401 is used to acquire multiple frames of images to be processed including auxiliary feature points; wherein the auxiliary feature points have corresponding distribution true values;
  • the image processing module 402 is used to process based on the multiple frames of images to be processed to obtain all three-dimensional coordinate points in the same coordinate system;
  • the measurement processing module 403 is used to perform measurement processing on all three-dimensional coordinate points to obtain the target three-dimensional coordinate point;
  • the determination module 404 is used to determine the target position transformation relationship based on the target three-dimensional coordinate points and the true distribution value.
  • In some implementations, the image processing module 402 is specifically configured to: obtain the two-dimensional coordinate points of the auxiliary feature points in each frame of the images to be processed; calculate based on a preset intrinsic parameter matrix and the two-dimensional coordinate points to obtain the three-dimensional coordinate points of each frame; and splice the three-dimensional coordinate points of each frame to obtain all the three-dimensional coordinate points in the same coordinate system.
  • In other implementations, the image processing module 402 is specifically configured to: obtain the two-dimensional coordinate points of the auxiliary feature points in each frame of the images to be processed; calculate based on the relative position between the two cameras and the two-dimensional coordinate points to obtain the three-dimensional coordinate points of each frame; and splice the three-dimensional coordinate points of each frame to obtain all the three-dimensional coordinate points in the same coordinate system.
  • The measurement processing module 403 is specifically configured to: project each three-dimensional coordinate point onto the image coordinate system to obtain a two-dimensional pixel coordinate point, and take the three-dimensional coordinate point as the target three-dimensional coordinate point when the Euclidean distance between the two-dimensional pixel coordinate point and the corresponding two-dimensional coordinate point is minimized.
  • The determination module 404 is specifically configured to: calculate an initial position transformation relationship based on the preset scaling factor, the target three-dimensional coordinate points, and the distribution true values; optimize the scaling factor, the target three-dimensional coordinate points, the distribution true values, and the initial position transformation relationship based on the preset optimization formula to obtain an optimization value; and adjust the scaling factor and the initial position transformation relationship, taking the initial position transformation relationship corresponding to an optimization value smaller than a preset threshold as the target position transformation relationship.
  • The image acquisition module 401 is specifically configured to: control the scanning device to rotate in a preset direction while scanning the target oral cavity including the scanning rod at a preset frequency to obtain the multiple frames of images to be processed.
  • the scan data processing device provided by the embodiments of the present disclosure can execute the scan data processing method provided by any embodiment of the present disclosure, and has functional modules and beneficial effects corresponding to the execution method.
  • Embodiments of the present disclosure also provide a computer program product, including a computer program/instructions which, when executed by a processor, implement the scan data processing method provided by any embodiment of the present disclosure.
  • FIG. 6 is a schematic structural diagram of an electronic device provided by an embodiment of the present disclosure.
  • a schematic structural diagram of an electronic device 500 suitable for implementing an embodiment of the present disclosure is shown.
  • The electronic device 500 in the embodiments of the present disclosure may include, but is not limited to, mobile terminals such as mobile phones, notebook computers, digital broadcast receivers, PDAs (personal digital assistants), PADs (tablet computers), PMPs (portable multimedia players), and vehicle-mounted terminals (such as car navigation terminals), as well as fixed terminals such as digital TVs and desktop computers.
  • the electronic device shown in FIG. 6 is only an example and should not impose any limitations on the functions and scope of use of the embodiments of the present disclosure.
  • The electronic device 500 may include a processing device (e.g., a central processing unit, a graphics processor, etc.) 501 that may perform various appropriate actions and processes according to a program stored in a read-only memory (ROM) 502 or a program loaded from a storage device 508 into a random access memory (RAM) 503.
  • The RAM 503 also stores various programs and data required for the operation of the electronic device 500.
  • The processing device 501, the ROM 502, and the RAM 503 are connected to each other via a bus 504.
  • An input/output (I/O) interface 505 is also connected to the bus 504.
  • The following devices may be connected to the I/O interface 505: input devices 506 including, for example, a touch screen, touch pad, keyboard, mouse, camera, microphone, accelerometer, and gyroscope; output devices 507 including, for example, a liquid crystal display (LCD), speakers, and vibrators; storage devices 508 including, for example, magnetic tape and hard disks; and a communication device 509.
  • The communication device 509 may allow the electronic device 500 to communicate wirelessly or by wire with other devices to exchange data.
  • Although FIG. 6 illustrates the electronic device 500 with various devices, it should be understood that implementing or having all of the illustrated devices is not required; more or fewer devices may alternatively be implemented or provided.
  • embodiments of the present disclosure include a computer program product including a computer program carried on a non-transitory computer-readable medium, the computer program containing program code for performing the method illustrated in the flowchart.
  • the computer program may be downloaded and installed from the network via communication device 509, or from storage device 508, or from ROM 502.
  • When the computer program is executed by the processing device 501, the above-mentioned functions defined in the scan data processing method of the embodiments of the present disclosure are performed.
  • The computer-readable medium mentioned above in the present disclosure may be a computer-readable signal medium or a computer-readable storage medium, or any combination of the two.
  • A computer-readable storage medium may be, for example, but not limited to, an electrical, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the above. More specific examples of computer-readable storage media may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, random access memory (RAM), read-only memory (ROM), erasable programmable read-only memory (EPROM or flash memory), optical fiber, portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the above.
  • A computer-readable storage medium may be any tangible medium that contains or stores a program for use by or in connection with an instruction execution system, apparatus, or device.
  • A computer-readable signal medium may include a data signal propagated in baseband or as part of a carrier wave, carrying computer-readable program code therein. Such propagated data signals may take many forms, including but not limited to electromagnetic signals, optical signals, or any suitable combination of the above.
  • A computer-readable signal medium may also be any computer-readable medium other than a computer-readable storage medium that can send, propagate, or transmit a program for use by or in connection with an instruction execution system, apparatus, or device.
  • Program code embodied on a computer-readable medium may be transmitted using any suitable medium, including but not limited to: wire, optical cable, RF (radio frequency), etc., or any suitable combination of the above.
  • The client and server can communicate using any currently known or future-developed network protocol, such as HTTP (Hypertext Transfer Protocol), and can be interconnected with digital data communication in any form or medium (e.g., a communications network).
  • Examples of communications networks include local area networks ("LAN"), wide area networks ("WAN"), internetworks (e.g., the Internet), and peer-to-peer networks (e.g., ad hoc peer-to-peer networks), as well as any currently known or future-developed networks.
  • The above computer-readable medium may be included in the above electronic device, or it may exist independently without being assembled into the electronic device.
  • The computer-readable medium carries one or more programs; when the one or more programs are executed by the electronic device, the electronic device performs the scan data processing method provided by the embodiments of the present disclosure.
  • Computer program code for performing the operations of the present disclosure may be written in one or more programming languages or combinations thereof, including but not limited to object-oriented programming languages such as Java, Smalltalk, and C++, as well as conventional procedural programming languages such as "C" or similar programming languages.
  • The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on a remote computer or server.
  • The remote computer can be connected to the user's computer through any kind of network, including a local area network (LAN) or a wide area network (WAN), or it can be connected to an external computer (for example, through the Internet using an Internet service provider).
  • Each block in the flowchart or block diagrams may represent a module, program segment, or portion of code that contains one or more executable instructions for implementing the specified logical functions.
  • In some alternative implementations, the functions noted in the blocks may occur out of the order noted in the figures. For example, two blocks shown in succession may actually execute substantially in parallel, or they may sometimes execute in the reverse order, depending on the functionality involved.
  • Each block of the block diagrams and/or flowchart illustrations, and combinations of blocks in the block diagrams and/or flowchart illustrations, can be implemented by special-purpose hardware-based systems that perform the specified functions or operations, or by a combination of special-purpose hardware and computer instructions.
  • The units involved in the embodiments of the present disclosure can be implemented in software or hardware, and in some cases the name of a unit does not constitute a limitation on the unit itself.
  • For example, and without limitation, exemplary types of hardware logic components that may be used include: Field Programmable Gate Arrays (FPGAs), Application Specific Integrated Circuits (ASICs), Application Specific Standard Products (ASSPs), Systems on Chip (SOCs), Complex Programmable Logic Devices (CPLDs), and so on.
  • In the context of this disclosure, a machine-readable medium may be a tangible medium that may contain or store a program for use by or in connection with an instruction execution system, apparatus, or device.
  • The machine-readable medium may be a machine-readable signal medium or a machine-readable storage medium.
  • Machine-readable media may include, but are not limited to, electronic, magnetic, optical, electromagnetic, infrared, or semiconductor systems, apparatuses, or devices, or any suitable combination of the foregoing.
  • More specific examples of machine-readable storage media would include an electrical connection based on one or more wires, a portable computer diskette, a hard disk, random access memory (RAM), read-only memory (ROM), erasable programmable read-only memory (EPROM or flash memory), optical fiber, portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the above.
  • According to one or more embodiments of the present disclosure, the present disclosure provides an electronic device, including: a processor; and a memory for storing instructions executable by the processor;
  • the processor being configured to read the executable instructions from the memory and execute the instructions to implement any of the scan data processing methods provided by the present disclosure.
  • According to one or more embodiments of the present disclosure, the present disclosure provides a computer-readable storage medium; the storage medium stores a computer program, and the computer program is used to perform any of the scan data processing methods provided by the present disclosure.
  • The scan data processing scheme disclosed herein determines the target position transformation relationship based on the preset distribution true values of the auxiliary feature points and the three-dimensional coordinate points computed from the scan, so that the relative positions between the scanning rods can be located accurately; it improves the efficiency and accuracy of data processing in intraoral scanning scenarios and has strong industrial practicality.

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Biomedical Technology (AREA)
  • Biophysics (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Dentistry (AREA)
  • Epidemiology (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Image Processing (AREA)
  • Dental Tools And Instruments Or Auxiliary Dental Instruments (AREA)

Abstract

Embodiments of the present disclosure relate to a scan data processing method, device, equipment and medium, wherein the method includes: acquiring multiple frames of images to be processed that include auxiliary feature points, wherein the auxiliary feature points have corresponding distribution true values; processing based on the multiple frames of images to be processed and, in combination with the current camera pose, obtaining all three-dimensional coordinate points in the same coordinate system; performing measurement processing on all the three-dimensional coordinate points to obtain target three-dimensional coordinate points; and determining a target position transformation relationship based on the target three-dimensional coordinate points and the distribution true values. With this technical solution, the target position transformation relationship is determined from the preset distribution true values of the auxiliary feature points and the three-dimensional coordinate points computed from the scan, so that the relative positions between the scanning rods can be located accurately, improving the efficiency and accuracy of data processing in intraoral scanning scenarios.

Description

Scan data processing method, device, equipment and medium
The present disclosure claims priority to the Chinese patent application No. 202210494057.6, filed with the Chinese Patent Office on May 2, 2022 and entitled "Scan data processing method, device, equipment and medium", the entire contents of which are incorporated herein by reference.
Technical Field
The present disclosure relates to the technical field of intraoral scanning, and in particular to a scan data processing method, device, equipment and medium.
Background
Typically, in scenarios where missing teeth in the oral cavity are restored, the implant position is determined by scanning with a scanning rod.
In the related art, because of the limited scanning range of intraoral scanning, multi-data splicing schemes are usually used when scanning intraoral data; owing to accumulated errors, the overall accuracy of the resulting model ends up low.
Summary
(1) Technical problem to be solved
The technical problem to be solved by the present disclosure is that existing intraoral data scanning usually relies on multi-data splicing schemes, whose accumulated errors ultimately leave the overall accuracy of the model low.
(2) Technical solution
To solve the above technical problem, embodiments of the present disclosure provide a scan data processing method, device, equipment and medium, including:
In a first aspect, a scan data processing method is provided, the method including:
acquiring multiple frames of images to be processed that include auxiliary feature points, wherein the auxiliary feature points have corresponding distribution true values;
processing based on the multiple frames of images to be processed to obtain all three-dimensional coordinate points in the same coordinate system;
performing measurement processing on all the three-dimensional coordinate points to obtain target three-dimensional coordinate points;
determining a target position transformation relationship based on the target three-dimensional coordinate points and the distribution true values.
In a second aspect, a scan data processing device is further provided, the device including:
an image acquisition module, configured to acquire multiple frames of images to be processed that include auxiliary feature points, wherein the auxiliary feature points have corresponding distribution true values;
an image processing module, configured to process based on the multiple frames of images to be processed to obtain all three-dimensional coordinate points in the same coordinate system;
a measurement processing module, configured to perform measurement processing on all the three-dimensional coordinate points to obtain target three-dimensional coordinate points;
a determination module, configured to determine the target position transformation relationship based on the target three-dimensional coordinate points and the distribution true values.
In a third aspect, an electronic device is further provided, the electronic device including: a processor; and a memory for storing instructions executable by the processor; the processor being configured to read the executable instructions from the memory and execute the instructions to implement the scan data processing method provided by the embodiments of the present disclosure.
In a fourth aspect, a computer storage medium is further provided, wherein the storage medium stores a computer program, and the computer program is used to execute the scan data processing method provided in the first aspect of the present disclosure.
(3) Beneficial effects
Compared with the prior art, the above technical solutions provided by the embodiments of the present disclosure have the following advantages:
The scan data processing scheme provided by the embodiments of the present disclosure acquires multiple frames of images to be processed that include auxiliary feature points, wherein the auxiliary feature points have corresponding distribution true values; processes based on the multiple frames of images to be processed to obtain all three-dimensional coordinate points in the same coordinate system; performs measurement processing on all the three-dimensional coordinate points to obtain target three-dimensional coordinate points; and determines the target position transformation relationship based on the target three-dimensional coordinate points and the distribution true values. With this technical solution, the target position transformation relationship is determined from the preset distribution true values of the auxiliary feature points and the three-dimensional coordinate points computed from the scan, so that the relative positions between the scanning rods can be located accurately, improving the efficiency and accuracy of data processing in intraoral scanning scenarios.
It should be understood that the above general description and the following detailed description are exemplary and explanatory only and do not limit the present disclosure.
Brief Description of the Drawings
The accompanying drawings herein are incorporated into and constitute a part of this specification; they illustrate embodiments consistent with the present disclosure and, together with the specification, serve to explain the principles of the present disclosure.
To describe the technical solutions in the embodiments of the present disclosure or the prior art more clearly, the accompanying drawings required for describing the embodiments or the prior art are briefly introduced below. Obviously, a person of ordinary skill in the art may derive other drawings from these drawings without creative effort.
Figure 1 is an application scenario diagram of scan data processing provided by an embodiment of the present disclosure;
Figure 2 is a schematic flowchart of a scan data processing method provided by an embodiment of the present disclosure;
Figure 3 is a schematic flowchart of another scan data processing method provided by an embodiment of the present disclosure;
Figure 4 is a schematic flowchart of yet another scan data processing method provided by an embodiment of the present disclosure;
Figure 5 is a schematic structural diagram of a scan data processing device provided by an embodiment of the present disclosure;
Figure 6 is a schematic structural diagram of an electronic device provided by an embodiment of the present disclosure.
Detailed Description
To make the objectives, technical solutions, and advantages of the embodiments of the present disclosure clearer, the technical solutions in the embodiments of the present disclosure are described clearly and completely below. Obviously, the described embodiments are only some rather than all of the embodiments of the present disclosure. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments of the present disclosure without creative effort fall within the protection scope of the present disclosure.
In practical applications, in scenarios where missing teeth in the oral cavity are restored, the limited scanning range of intraoral scanning means that multi-data splicing schemes are usually used when scanning intraoral data; owing to accumulated errors, the overall accuracy of the resulting model ends up low.
In view of the above problem, the present disclosure proposes a scan data processing method, which can be applied in the application environment shown in Figure 1. Figure 1 is an application scenario diagram of scan data processing provided by an embodiment of the present disclosure. The application environment includes: multiple intraoral scanning rods installed in a target oral cavity. An intraoral scanning rod includes a scanning rod component 11 and an auxiliary component 12 connected to the scanning rod component 11; auxiliary feature points are provided on the scanning rod component 11 and/or the auxiliary component 12, where the shape features of the auxiliary component 12 itself may serve as auxiliary features. The scanning rod component 11 is adapted to an implant installed in the target oral cavity; by fitting the scanning rod component 11 onto the implant, the intraoral scanning rod is installed in the target oral cavity.
Here, the auxiliary components 12 of any two of the multiple intraoral scanning rods are adapted to each other, so that when any two intraoral scanning rods 10 are installed adjacently in the oral cavity, the auxiliary feature points on the two auxiliary components are continuously distributed. The true-value coordinate points of the auxiliary feature points can be obtained in advance through, for example, an SLR photogrammetry system or a coordinate measuring machine; in theory, the three-dimensional coordinate points corresponding to the scanned images correspond one-to-one to the pre-obtained true-value coordinate points of the auxiliary feature points.
As an example scenario, multiple intraoral scanning rods are installed in the target oral cavity. An intraoral scanning rod includes a scanning rod component for connecting to an implant and an auxiliary component connected to the scanning rod. The intraoral scanning rod is provided with target features; the target features are continuously distributed on the scanning rod and/or the auxiliary component, and are not distributed on only a single face of the scanning rod and/or the auxiliary component.
Specifically, the intraoral scanner scans the target oral cavity, acquires multiple frames of images, and transmits them to a data processing module for data processing. The data processing module performs the following method:
acquiring the multiple frames of images, and obtaining initial three-dimensional data of the target oral cavity based on the multiple frames of images, the initial three-dimensional data including an initial point set of the target oral cavity and three-dimensional coordinate measurement values of the target features in the same coordinate system;
acquiring a preset model of the intraoral scanning rods, the preset model including the true three-dimensional coordinate values of the target features in the same coordinate system and a true point set of the intraoral scanning rods (the true three-dimensional coordinate values of each point);
splicing the initial point set of the target oral cavity with the true point set of the intraoral scanning rods based on the correspondence between the measured and true three-dimensional coordinate values of the target features;
determining positioning information of the intraoral scanning rods based on the spliced true point set of the intraoral scanning rods; the positioning information of the intraoral scanning rods is the positioning information of the implants, and tooth design is carried out based on this positioning information so that the designed and manufactured tooth can be fitted and installed on the implant.
Specifically, by acquiring multiple frames of images to be processed that include auxiliary feature points, wherein the auxiliary feature points have corresponding distribution true values; processing based on the multiple frames of images to be processed to obtain all three-dimensional coordinate points in the same coordinate system; performing measurement processing on all the three-dimensional coordinate points to obtain target three-dimensional coordinate points; and determining the target position transformation relationship based on the target three-dimensional coordinate points and the distribution true values, the target position transformation relationship is determined from the preset distribution true values of the auxiliary feature points and the three-dimensional coordinate points computed from the scan, so that the relative positions between the scanning rods can be located accurately, improving the efficiency and accuracy of data processing in intraoral scanning scenarios.
Specifically, Figure 2 is a schematic flowchart of a scan data processing method provided by an embodiment of the present disclosure. The method may be executed by a scan data processing device, which may be implemented in software and/or hardware and may generally be integrated in an electronic device. As shown in Figure 2, the method includes:
Step 101: Acquire multiple frames of images to be processed that include auxiliary feature points, wherein the auxiliary feature points have corresponding distribution true values.
Here, the scanning rod includes a scanning rod main body and an auxiliary feature body. Auxiliary feature points may be provided on the scanning rod main body; or there may be no auxiliary feature points on the scanning rod main body while the auxiliary feature body between scanning rod main bodies is provided with auxiliary feature points; or the shape of the auxiliary feature body itself serves as the auxiliary feature points.
The target oral cavity refers to an oral cavity requiring dental implantation; intraoral scanning is needed to locate the implantation points in the oral cavity. The auxiliary components of any two of the multiple intraoral scanning rods are installed adjacently in the target oral cavity in advance, and the auxiliary feature points on the two auxiliary components are continuously distributed, so that the target oral cavity can be scanned intraorally to obtain multiple frames of images to be processed.
The target oral cavity may be scanned with a handheld intraoral scanner (with a monocular or binocular camera); that is, the multiple frames of images to be processed are obtained by taking photographs, for example dozens of frames per second, collected cyclically.
In the embodiments of the present disclosure, the scanning rod is a feature object containing auxiliary feature points, where an auxiliary feature point can uniquely identify a feature; that is, the scanning rod is provided with auxiliary feature points, and each auxiliary feature point can uniquely identify the corresponding position feature on the scanning rod. For example, if target feature a and target feature b are provided at position 1 and position 2 on the scanning rod respectively, target feature a can uniquely identify the position feature of position 1 on the scanning rod, and target feature b can uniquely identify the position feature of position 2 on the scanning rod.
It can be understood that anything on the scanning rod that uniquely identifies a corresponding position feature, such as different shapes, colors, or two-dimensional codes, can serve as an auxiliary feature point.
Specifically, the distribution true values of the auxiliary feature points, i.e. the coordinate values of the auxiliary feature points, can be obtained through higher-precision equipment such as an SLR photogrammetry system or a coordinate measuring machine, or taken from the computer-aided design of the auxiliary feature points.
In the embodiments of the present disclosure, each image to be processed acquired by scanning includes at least a preset number of auxiliary feature points to represent the continuity of the distribution of auxiliary feature points, thereby ensuring the accuracy of subsequent calculations, where the preset number can be set according to the application scenario.
For example, if the preset number is 3 and a scanned image to be processed includes 2 auxiliary feature points, the distribution of auxiliary feature points is discontinuous; if a scanned image to be processed includes 3 or more auxiliary feature points, the auxiliary feature points are continuously distributed, as illustrated in the sketch below.
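The continuity check in this example reduces to a simple threshold test; the following Python sketch is a literal reading of it, with the example's preset number of 3 as the default.

```python
def is_continuous(num_feature_points: int, preset_number: int = 3) -> bool:
    """A frame counts as having continuously distributed auxiliary feature
    points when at least the preset number of them appear in the image."""
    return num_feature_points >= preset_number

assert not is_continuous(2)  # 2 detected points: distribution is discontinuous
assert is_continuous(3)      # 3 or more points: distribution is continuous
```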
In the embodiments of the present disclosure, there are many ways to acquire the multiple frames of images to be processed. In some implementations, a monocular camera is controlled to rotate in a certain direction while scanning the target oral cavity at a certain frequency to obtain the multiple frames of images to be processed.
In other implementations, a binocular camera is controlled to scan the target oral cavity cyclically to obtain the multiple frames of images to be processed. The above two approaches are merely examples of acquiring multiple frames of images to be processed; the embodiments of the present disclosure do not limit the specific manner of acquiring them.
Specifically, after the scanning rods are attached in the target oral cavity, the target oral cavity including the scanning rods is scanned to acquire the multiple frames of images to be processed.
Step 102: Process based on the multiple frames of images to be processed to obtain all three-dimensional coordinate points in the same coordinate system.
Here, all three-dimensional coordinate points refer to the three-dimensional coordinate points corresponding to all auxiliary feature points in the target oral cavity.
In the embodiments of the present disclosure, there are many ways to process the multiple frames of images to be processed to obtain all three-dimensional coordinate points in the same coordinate system. In some implementations, the two-dimensional coordinate points of the auxiliary feature points in each frame of the images to be processed are obtained; three-dimensional reconstruction is performed based on a preset intrinsic parameter matrix and the two-dimensional coordinate points to obtain the three-dimensional coordinate points corresponding to the auxiliary feature points of each frame; and the three-dimensional coordinate points corresponding to the auxiliary feature points of the multiple frames are spliced to obtain the three-dimensional coordinate points of the auxiliary feature points of the multiple frames in the same coordinate system.
In other implementations, the two-dimensional coordinate points of the auxiliary feature points in each frame are obtained; three-dimensional reconstruction is performed based on the relative position between the two cameras and the two-dimensional coordinate points to obtain the three-dimensional coordinate points corresponding to the auxiliary feature points of each frame; and the three-dimensional coordinate points corresponding to the auxiliary feature points of the multiple frames are spliced to obtain the three-dimensional coordinate points of the auxiliary feature points of the multiple frames in the same coordinate system. The above two approaches are merely examples of processing the multiple frames of images to obtain three-dimensional coordinate points in the same coordinate system; the embodiments of the present disclosure do not limit the specific manner.
In the embodiments of the present disclosure, after the multiple frames of images to be processed are acquired, they can be processed to obtain the three-dimensional coordinate points in the same coordinate system.
Step 103: Perform measurement processing on the three-dimensional coordinate points to obtain the target three-dimensional coordinate points.
Here, the target three-dimensional coordinate points are the three-dimensional coordinate points obtained after measurement processing, which reflect the three-dimensional coordinate points of the auxiliary feature points more accurately.
In the embodiments of the present disclosure, there are many ways to perform measurement processing on all three-dimensional coordinate points to obtain the target three-dimensional coordinate points. In some implementations, each three-dimensional coordinate point is projected onto the image coordinate system to obtain a two-dimensional pixel coordinate point; when the Euclidean distance between the two-dimensional pixel coordinate point and the two-dimensional coordinate point corresponding to the three-dimensional coordinate point is minimized, the three-dimensional coordinate point is taken as a target three-dimensional coordinate point.
In other implementations, the N-th frame image in which a three-dimensional coordinate point appears is obtained, the two-dimensional pixel coordinate points of the N-th frame image are obtained, and the target three-dimensional coordinate point is determined based on the distance between the two-dimensional coordinate point projected from the three-dimensional coordinate point and the two-dimensional pixel coordinate point, where N is a positive integer. The above two approaches are merely examples of performing measurement processing on all three-dimensional coordinate points to obtain the target three-dimensional coordinate points; the embodiments of the present disclosure do not limit the specific manner.
Specifically, after all three-dimensional coordinate points are obtained, measurement processing can be performed on them to obtain the target three-dimensional coordinate points.
Step 104: Determine the target position transformation relationship based on the target three-dimensional coordinate points and the distribution true values.
Here, the target position transformation relationship refers to the transformation matrix that converts the scanned target three-dimensional coordinate points to the corresponding designed distribution true values.
In the embodiments of the present disclosure, there are many ways to determine the target position transformation relationship based on the target three-dimensional coordinate points and the distribution true values. In some implementations, an initial position transformation relationship is calculated based on a preset scaling factor, the target three-dimensional coordinate points, and the distribution true values; an optimization value is obtained by optimizing the scaling factor, the target three-dimensional coordinate points, the distribution true values, and the initial position transformation relationship with a preset optimization formula; the scaling factor and the initial position transformation relationship are adjusted; and the initial position transformation relationship corresponding to an optimization value smaller than a preset threshold is taken as the target position transformation relationship.
In other implementations, the position transformation relationship from each target three-dimensional coordinate point to the distribution true value is calculated, and the target position transformation relationship is obtained based on the multiple position transformation relationships. The above two approaches are merely examples of determining the target position transformation relationship based on the target three-dimensional coordinate points and the distribution true values; the embodiments of the present disclosure do not limit the specific manner.
The scan data processing scheme provided by the embodiments of the present disclosure acquires multiple frames of images to be processed, where each frame includes auxiliary feature points and the auxiliary feature points have corresponding distribution true values; processes the multiple frames to obtain all three-dimensional coordinate points; performs measurement processing on all the three-dimensional coordinate points to obtain the target three-dimensional coordinate points; and determines the target position transformation relationship based on the target three-dimensional coordinate points and the distribution true values. With this technical solution, the target position transformation relationship is determined from the preset distribution true values of the auxiliary feature points and the three-dimensional coordinate points computed from the scan, so that the relative positions between the scanning rods can be located accurately, improving the efficiency and accuracy of data processing in intraoral scanning scenarios.
Based on the description of the above embodiments, scanning can be performed with a monocular camera or a binocular camera. The monocular and binocular cases are described in detail below with reference to Figure 3 and Figure 4, respectively.
Specifically, Figure 3 is a schematic flowchart of another scan data processing method provided by an embodiment of the present disclosure. On the basis of the above embodiments, this embodiment further optimizes the above scan data processing method. As shown in Figure 3, the method includes:
Step 201: Control the scanning device to rotate in a preset direction while scanning the target oral cavity including the scanning rods at a preset frequency to obtain multiple frames of images to be processed, wherein the auxiliary feature points have corresponding distribution true values.
Specifically, a monocular camera is controlled to rotate in a certain direction while scanning the target oral cavity at a certain frequency to obtain the multiple frames of images to be processed, or a binocular camera is controlled to scan the target oral cavity cyclically to obtain the multiple frames of images to be processed.
Step 202: Obtain the two-dimensional coordinate points of the auxiliary feature points in each frame of the images to be processed, calculate based on a preset intrinsic parameter matrix and the two-dimensional coordinate points to obtain the three-dimensional coordinate points of each frame, and splice the three-dimensional coordinate points of each frame to obtain all three-dimensional coordinate points in the same coordinate system.
Specifically, multiple frames of images to be processed are acquired by scanning (for example, triggered by hardware to take photographs, dozens of frames per second, collected cyclically). Based on the images to be processed, the two-dimensional coordinate points of the auxiliary feature points in each frame are obtained (that is, image recognition is used to extract the pixel coordinates of the auxiliary feature points in each frame). Then, based on the camera intrinsic parameters (such as focal length, principal point, skew factor, and lens distortion) and the texture-tracking pose (for example, splicing adjacent frames by acquiring the geometric and texture information of the scanning rod surface), the reconstruction of the auxiliary feature points of the current frame is completed. The reconstructed three-dimensional coordinate points are based on the camera coordinate system of the first frame of the scan (the first-frame camera coordinate system is determined by the first frame image, and each subsequent frame is spliced with the first frame, so the coordinate system is always the first-frame camera coordinate system; image recognition yields two-dimensional coordinate points, which need to be converted into the camera coordinate system to obtain three-dimensional coordinate points).
Here, the intrinsic parameter matrix refers to the matrix composed of the camera intrinsic parameters, so that the two-dimensional coordinate points are back-projected into the camera coordinate system based on the intrinsic parameter matrix to obtain the three-dimensional coordinate points, as sketched below.
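As an illustration of this back-projection, the following Python/NumPy sketch maps a detected 2D pixel into the camera coordinate system using the intrinsic parameter matrix. The matrix values are hypothetical, and the availability of a per-point depth estimate (e.g., from the texture-tracking reconstruction) is an assumption.

```python
import numpy as np

# Hypothetical intrinsic matrix: fx, fy are focal lengths in pixels and
# (cx, cy) is the principal point; a skew term could be placed at K[0, 1].
K = np.array([[800.0,   0.0, 320.0],
              [  0.0, 800.0, 240.0],
              [  0.0,   0.0,   1.0]])

def back_project(pixel_uv, depth, K):
    """Back-project a 2D pixel into the camera coordinate system:
    returns depth * K^-1 @ [u, v, 1], a 3D point in the camera frame."""
    uv1 = np.array([pixel_uv[0], pixel_uv[1], 1.0])
    return depth * (np.linalg.inv(K) @ uv1)

# Example: an auxiliary feature point detected at pixel (400, 260),
# with an assumed depth of 12.5 (in the scanner's length unit).
p_cam = back_project((400.0, 260.0), 12.5, K)
```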
Specifically, the three-dimensional coordinate points of each frame are spliced to obtain all three-dimensional coordinate points; based on a preset splicing algorithm, the different coordinate systems of two frames can be aligned through their common parts and thus spliced together to obtain all three-dimensional coordinate points. That is, all three-dimensional coordinate points reconstructed from single frames are fused into the overall framework points, i.e. all three-dimensional coordinate points, according to distance constraints. All three-dimensional coordinate points are based on the overall tooth model coordinate system (the world coordinate system).
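One conventional way to realize this alignment through common parts is a rigid (Kabsch/SVD) fit over the feature points seen in both frames; the sketch below assumes the common points have already been put into one-to-one correspondence (e.g., by matching inter-point distance patterns), and is an illustration rather than the patent's exact splicing algorithm.

```python
import numpy as np

def rigid_align(src, dst):
    """Kabsch/SVD alignment: find R, t such that dst ≈ R @ src + t.
    src, dst: (n, 3) arrays of corresponding common feature points."""
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)      # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T  # proper rotation, det(R) = +1
    t = dst_c - R @ src_c
    return R, t

# Splicing: once R, t are estimated from the shared points, every
# reconstructed point of the new frame is mapped into the common system.
# R, t = rigid_align(common_pts_new_frame, common_pts_reference)
# spliced = (R @ new_frame_pts.T).T + t
```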
Step 203: Project each three-dimensional coordinate point onto the image coordinate system to obtain a two-dimensional pixel coordinate point; when the Euclidean distance between the two-dimensional pixel coordinate point and the two-dimensional coordinate point corresponding to the three-dimensional coordinate point is minimized, take the three-dimensional coordinate point as a target three-dimensional coordinate point.
Specifically, suppose the K-th auxiliary feature point p3D_k appears in the i-th frame image and corresponds to the j-th two-dimensional coordinate point p2D_j on the texture map of the i-th frame (a black-and-white image and a texture map with color information can be triggered and captured at the same moment). Photogrammetric optimization is based on bundle adjustment theory and least squares theory. The optimization objective: all actually scanned auxiliary feature points are projected (the projection function is defined as Π), according to the image frames in which they appear and the camera poses T_i, onto the image captured in each frame, and the Euclidean distance between the resulting two-dimensional pixel coordinates and the corresponding two-dimensional coordinate points of the auxiliary feature points is minimized, as shown in formula (1):

min Σ_{i=1..n} Σ_{k=1..m} ‖Π(T_i · p3D_k) − p2D_j‖²    (1)

where m denotes the number of three-dimensional coordinate points and n denotes the number of images to be processed.
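A minimal sketch of this objective, using scipy.optimize.least_squares as a stand-in solver. Packing each pose as a raw 3×4 matrix is a simplification for brevity (a real bundle adjustment would parameterize rotations on SO(3)), and the observation layout is likewise an assumption for illustration.

```python
import numpy as np
from scipy.optimize import least_squares

def project(K, T, p3d):
    """Projection Π: map a 3D point through pose T = [R|t] (3x4) and
    intrinsics K to pixel coordinates."""
    pc = T[:, :3] @ p3d + T[:, 3]
    uv = K @ (pc / pc[2])
    return uv[:2]

def reprojection_residuals(x, K, observations, n_frames, m_points):
    """Stacked residuals of formula (1): Π(T_i · p3D_k) − p2D_j for every
    observation (i, k, p2d). x packs n poses (3x4 each) then m 3D points."""
    poses = x[:n_frames * 12].reshape(n_frames, 3, 4)
    pts = x[n_frames * 12:].reshape(m_points, 3)
    res = []
    for i, k, p2d in observations:
        res.extend(project(K, poses[i], pts[k]) - p2d)
    return np.asarray(res)

# least_squares(reprojection_residuals, x0, args=(K, obs, n, m)) then
# minimizes the summed squared Euclidean distances of formula (1).
```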
Step 204: Calculate based on the preset scaling factor, the target three-dimensional coordinate points, and the distribution true values to obtain an initial position transformation relationship, and perform optimization calculation on the scaling factor, the target three-dimensional coordinate points, the distribution true values, and the initial position transformation relationship based on a preset optimization formula to obtain an optimization value.
Step 205: Adjust the scaling factor and the initial position transformation relationship, and take the initial position transformation relationship corresponding to an optimization value smaller than the preset threshold as the target position transformation relationship.
Specifically, according to the distribution true values of the auxiliary feature points on the scanning rod or on the auxiliary feature body connected to the scanning rod (for example, distribution true values obtained with an SLR photogrammetry system or a two-dimensional image measuring instrument), denoted q_i, the photogrammetrically optimized target three-dimensional coordinate points P are spliced, with scaling, against the distribution true values of the auxiliary feature points of each region (each scanning rod can be regarded as a small region). The splicing is a rigid-body operation and does not change scale; when the scale changes, the scaling factor S_i is introduced into the calculation, thereby determining the target position transformation relationships of the different scanning rods. min f() refers to a nonlinear least squares algorithm, as shown in formula (2):

min f(s_i, R_i, t_i) = s_i · R_i · p + t_i − q_i    (2)

Specifically, assume S_i = 1 (the true value may lie between 0.9 and 1.1), then compute an initial (R_i, t_i) from P and q_i, and finally optimize the assumed S_i together with the initial (R_i, t_i). The objective of the optimization is that the points P, after being scaled by S_i and transformed by the initial (R_i, t_i), splice exactly onto q_i. The whole optimization is a mathematical iterative process: S_i and (R_i, t_i) are adjusted continuously so that the spliced P and q_i become arbitrarily close.
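For reference, the least-squares cost behind formula (2) also admits a closed-form solution (the Umeyama method) instead of the iterative adjustment described above; the sketch below is that closed-form variant under the same cost, offered as an illustration rather than the patent's own procedure.

```python
import numpy as np

def umeyama_similarity(P, Q):
    """Closed-form s, R, t minimizing sum ||s·R·p + t − q||² over
    corresponding rows of P (scanned target points) and Q (true values q_i)."""
    mu_p, mu_q = P.mean(axis=0), Q.mean(axis=0)
    Pc, Qc = P - mu_p, Q - mu_q
    H = Qc.T @ Pc / len(P)                # cross-covariance matrix
    U, S, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(U @ Vt))
    D = np.diag([1.0, 1.0, d])            # reflection guard, det(R) = +1
    R = U @ D @ Vt
    var_p = (Pc ** 2).sum() / len(P)      # variance of the source points
    s = np.trace(np.diag(S) @ D) / var_p  # optimal scale (around 0.9-1.1 here)
    t = mu_q - s * R @ mu_p
    return s, R, t
```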
In this way, by scanning the entire oral cavity, the relative positions between the scanning rods can be located accurately.
The scan data processing scheme provided by the embodiments of the present disclosure acquires multiple frames of images to be processed that include auxiliary feature points, wherein the auxiliary feature points have corresponding distribution true values; obtains the two-dimensional coordinate points of the auxiliary feature points in each frame; calculates based on the preset intrinsic parameter matrix and the two-dimensional coordinate points to obtain the three-dimensional coordinate points of each frame; splices the three-dimensional coordinate points of each frame to obtain all three-dimensional coordinate points in the same coordinate system; projects each three-dimensional coordinate point onto the image coordinate system to obtain a two-dimensional pixel coordinate point; takes the three-dimensional coordinate point as a target three-dimensional coordinate point when the Euclidean distance between the two-dimensional pixel coordinate point and the corresponding two-dimensional coordinate point is minimized; calculates based on the preset scaling factor, the target three-dimensional coordinate points, and the distribution true values to obtain an initial position transformation relationship; optimizes the scaling factor, the target three-dimensional coordinate points, the distribution true values, and the initial position transformation relationship based on the preset optimization formula to obtain an optimization value; and adjusts the scaling factor and the initial position transformation relationship, taking the initial position transformation relationship corresponding to an optimization value smaller than the preset threshold as the target position transformation relationship. In this way, the target position transformation relationship is determined from the preset distribution true values of the auxiliary feature points and the three-dimensional coordinate points computed from the scan, so that the relative positions between the scanning rods can be located accurately, improving the efficiency and accuracy of data processing in intraoral scanning scenarios.
具体地,图4为本公开实施例提供的又一种扫描数据处理方法的流程示意图,本实施例在上述实施例的基础上,进一步优化了上述扫描数据处理方法。如图4所示,该方法包括:
步骤301、控制扫描设备按照预设方向旋转的同时按照预设的频率对包括扫描杆的目标口腔进行扫描,得到多帧待处理图像;其中,辅助特征点具有对应的分布真值。
具体地,通过控制单目相机按照一定方向旋转的同时按照一定频率对目标口腔进行扫描,得到多帧待处理图像,或通过控制双目相机按照循环对目标口腔进行扫描,得到多帧待处理图像。
步骤302,获取每帧待处理图像中辅助特征点的二维坐标点,基于双相机之间的相对位置和二维坐标点进行计算,得到每帧待处理图像的三维坐标点,对每帧待处理图像的三维坐标点进行拼接,得到同一坐标系下的所有三维坐标点。
具体地,通过扫描获取待处理图像,基于待处理图像,获取每帧待处理图像中辅助特征点的二维坐标点,基于双相机之间的相对位置,重建出的单帧的辅助特征点的三维坐标点,重建出的三维坐标点基于当前帧左相机坐标系下(通常重建出来的深度信息基于左相机坐标系下)。
具体地,根据每一帧重建出的辅助特征点的三维坐标点,基于相邻帧的辅助特征点的距离分布信息(两两辅助特征点点坐标点进行距离计算即可),搜索相邻帧拍摄到的同一个辅助特征点,完成辅助特征点拼接,得到所有三维坐标点。即,所有单帧重建出的三维坐标点按照距离约束融合成整体框架点即所有三维坐标点,该所有三维坐标点基于牙齿整体模型坐标系(世界坐标系)。
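Searching adjacent frames for the same auxiliary feature point via the pairwise-distance distribution can be sketched as follows; the signature comparison and the tolerance are illustrative assumptions. The idea rests on the feature points of a rigid scanning rod preserving their mutual distances from frame to frame.

```python
import numpy as np

def match_by_distance_signature(pts_a, pts_b, tol=0.2):
    """Match the same auxiliary feature point across two adjacent frames
    by comparing sorted pairwise-distance signatures. `tol` is an
    illustrative tolerance on the distance differences."""
    def signature(pts, i):
        # sorted distances from point i to all other points in its frame
        d = np.linalg.norm(pts - pts[i], axis=1)
        return np.sort(d[d > 0])

    matches = []
    for i in range(len(pts_a)):
        sig_a = signature(pts_a, i)
        for j in range(len(pts_b)):
            sig_b = signature(pts_b, j)
            n = min(len(sig_a), len(sig_b))
            if n and np.all(np.abs(sig_a[:n] - sig_b[:n]) < tol):
                matches.append((i, j))
                break
    return matches
```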
Step 303: project each three-dimensional coordinate point onto the image coordinate system to obtain a two-dimensional pixel coordinate point, and when the Euclidean distance between the two-dimensional pixel coordinate point and the two-dimensional coordinate point corresponding to the three-dimensional coordinate point is minimized, take the three-dimensional coordinate point as the target three-dimensional coordinate point.
Specifically, suppose the k-th auxiliary feature point p3D_k appears in the i-th frame, corresponding to the j-th two-dimensional points p2DL_j and p2DR_j on the texture images of the i-th frame, and the rigid-body transformation between the two cameras is T_RL. The photogrammetric optimization is based on bundle adjustment theory. The optimization objective is that, for all actually scanned auxiliary feature points, the Euclidean distance between the two-dimensional pixel coordinates obtained by projecting each point (the projection function is defined as Π) onto every image in which it appears, according to the image frames in which it appears and the camera pose T_i, and the corresponding two-dimensional coordinate points of the auxiliary feature point is minimized, as calculated in formula (3):
min Σ_{i=1}^{n} Σ_{k=1}^{m} ( ‖Π(T_i · p3D_k) − p2DL_j‖² + ‖Π(T_RL · T_i · p3D_k) − p2DR_j‖² )  (3)
where m denotes the number of three-dimensional coordinate points and n denotes the number of images to be processed.
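Formula (3) differs from formula (1) only in that each point contributes both a left-image and a right-image residual, with the right camera reached through the fixed stereo extrinsic T_RL. A sketch, with illustrative names:

```python
import numpy as np
from scipy.spatial.transform import Rotation

def stereo_residuals(poses, points3d, obs_lr, K_left, K_right, R_rl, t_rl):
    """Reprojection residuals for formula (3): each point observed in
    both images yields one term for the left pose T_i and one for the
    composed right pose T_RL · T_i."""
    res = []
    for i, k, uv_left, uv_right in obs_lr:
        rvec, t = poses[i]
        R = Rotation.from_rotvec(rvec).as_matrix()
        x_cam = R @ points3d[k] + t                # left-camera frame
        xl = K_left @ x_cam
        xr = K_right @ (R_rl @ x_cam + t_rl)       # right camera via T_RL
        res.append(xl[:2] / xl[2] - uv_left)
        res.append(xr[:2] / xr[2] - uv_right)
    return np.concatenate(res)
```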
Step 304: perform a calculation based on the preset scaling factor, the target three-dimensional coordinate points and the distribution true values to obtain the initial position transformation relationship, and perform an optimization calculation on the scaling factor, the target three-dimensional coordinate points, the distribution true values and the initial position transformation relationship based on the preset optimization formula to obtain the optimization value.
Step 305: adjust the scaling factor and the initial position transformation relationship, and take the initial position transformation relationship corresponding to the case where the optimization value is smaller than the preset threshold as the target position transformation relationship.
Specifically, according to the distribution true values of the auxiliary feature points on the scanning rods or on the auxiliary feature bodies connected to the scanning rods (for example, the distribution true values of the auxiliary feature points are acquired by a single-lens-reflex photogrammetry system, a two-dimensional image measuring instrument, or the like), denoted q_i, the target three-dimensional coordinate points P obtained by the above photogrammetric optimization are stitched, with scaling, to the distribution true values of the auxiliary feature points of each region (each scanning rod can be regarded as a small region). Stitching is a rigid-body operation and does not change the scale; when the scale changes, a scaling factor S_i is introduced into the calculation, thereby determining the target position transformation relationships of the different scanning rods. min f( ) refers to the nonlinear least squares algorithm, as shown in formula (2):
min f(s_i, R_i, t_i) = ‖s_i · R_i · P + t_i − q_i‖²  (2)
Specifically, S_i is assumed to be 1 (the true value may lie between 0.9 and 1.1), then an initial (R_i, t_i) is computed from P and q_i, and finally the assumed S_i and the initial (R_i, t_i) are optimized jointly. The goal of the optimization is that, after the point P is scaled by S_i and then transformed by the initial (R_i, t_i), it can be stitched exactly onto q_i. The whole optimization is a mathematical iterative process that keeps adjusting S_i and (R_i, t_i) so that the stitched P and q_i become arbitrarily close.
Thus, by scanning the whole oral cavity, the relative positions between the scanning rods can be accurately located.
The scan data processing solution provided by the embodiments of the present disclosure acquires multiple frames of images to be processed including auxiliary feature points, where the auxiliary feature points have corresponding distribution true values; acquires the two-dimensional coordinate points of the auxiliary feature points in each frame of the image to be processed; performs a calculation based on the relative position between the two cameras and the two-dimensional coordinate points to obtain the three-dimensional coordinate points of each frame of the image to be processed; stitches the three-dimensional coordinate points of each frame of the image to be processed to obtain all the three-dimensional coordinate points in the same coordinate system; projects each three-dimensional coordinate point onto the image coordinate system to obtain a two-dimensional pixel coordinate point, and when the Euclidean distance between the two-dimensional pixel coordinate point and the two-dimensional coordinate point corresponding to the three-dimensional coordinate point is minimized, takes the three-dimensional coordinate point as the target three-dimensional coordinate point; performs a calculation based on the preset scaling factor, the target three-dimensional coordinate points and the distribution true values to obtain the initial position transformation relationship; performs an optimization calculation on the scaling factor, the target three-dimensional coordinate points, the distribution true values and the initial position transformation relationship based on the preset optimization formula to obtain the optimization value; and adjusts the scaling factor and the initial position transformation relationship, taking the initial position transformation relationship corresponding to the case where the optimization value is smaller than the preset threshold as the target position transformation relationship. Thus, the target position transformation relationship is determined based on the preset distribution true values of the auxiliary feature points and the three-dimensional coordinate points calculated from the scan, so that the relative positions between the scanning rods can be accurately located, improving the data processing efficiency and precision in the intraoral scanning scenario.
FIG. 5 is a schematic structural diagram of a scan data processing apparatus provided by an embodiment of the present disclosure. The apparatus may be implemented by software and/or hardware and may generally be integrated in an electronic device. As shown in FIG. 5, the apparatus includes:
an image acquisition module 401, configured to acquire multiple frames of images to be processed including auxiliary feature points, where the auxiliary feature points have corresponding distribution true values;
an image processing module 402, configured to perform processing based on the multiple frames of images to be processed to obtain all the three-dimensional coordinate points in the same coordinate system;
a measurement processing module 403, configured to perform measurement processing on all the three-dimensional coordinate points to obtain the target three-dimensional coordinate points;
a determination module 404, configured to determine the target position transformation relationship based on the target three-dimensional coordinate points and the distribution true values.
Optionally, the image processing module 402 is specifically configured to:
acquire the two-dimensional coordinate points of the auxiliary feature points in each frame of the images to be processed;
perform a calculation based on a preset intrinsic parameter matrix and the two-dimensional coordinate points to obtain the three-dimensional coordinate points of each frame of the images to be processed;
stitch the three-dimensional coordinate points of each frame of the images to be processed to obtain all the three-dimensional coordinate points in the same coordinate system.
Optionally, the image processing module 402 is specifically configured to:
acquire the two-dimensional coordinate points of the auxiliary feature points in each frame of the images to be processed;
perform a calculation based on the relative position between the two cameras and the two-dimensional coordinate points to obtain the three-dimensional coordinate points of each frame of the images to be processed;
stitch the three-dimensional coordinate points of each frame of the images to be processed to obtain all the three-dimensional coordinate points in the same coordinate system.
Optionally, the measurement processing module 403 is specifically configured to:
project each of the three-dimensional coordinate points onto an image coordinate system to obtain a two-dimensional pixel coordinate point;
when the Euclidean distance between the two-dimensional pixel coordinate point and the two-dimensional coordinate point corresponding to the three-dimensional coordinate point is minimized, take the three-dimensional coordinate point as the target three-dimensional coordinate point.
Optionally, the determination module 404 is specifically configured to:
perform a calculation based on a preset scaling factor, the target three-dimensional coordinate points and the distribution true values to obtain an initial position transformation relationship;
perform an optimization calculation on the scaling factor, the target three-dimensional coordinate points, the distribution true values and the initial position transformation relationship based on a preset optimization formula to obtain an optimization value;
adjust the scaling factor and the initial position transformation relationship, and take the initial position transformation relationship corresponding to the case where the optimization value is smaller than a preset threshold as the target position transformation relationship.
Optionally, the image acquisition module 401 is specifically configured to:
control a scanning device to rotate in a preset direction while scanning a target oral cavity including scanning rods at a preset frequency to obtain the multiple frames of images to be processed.
The scan data processing apparatus provided by the embodiments of the present disclosure can execute the scan data processing method provided by any embodiment of the present disclosure, and has functional modules and beneficial effects corresponding to the executed method.
The embodiments of the present disclosure also provide a computer program product, including a computer program/instructions which, when executed by a processor, implements the scan data processing method provided by any embodiment of the present disclosure.
FIG. 6 is a schematic structural diagram of an electronic device provided by an embodiment of the present disclosure. Referring specifically to FIG. 6 below, it shows a schematic structural diagram of an electronic device 500 suitable for implementing an embodiment of the present disclosure. The electronic device 500 in the embodiments of the present disclosure may include, but is not limited to, mobile terminals such as mobile phones, notebook computers, digital broadcast receivers, PDAs (personal digital assistants), PADs (tablet computers), PMPs (portable multimedia players) and vehicle-mounted terminals (for example, vehicle-mounted navigation terminals), and fixed terminals such as digital TVs and desktop computers. The electronic device shown in FIG. 6 is merely an example and should not impose any limitation on the functions and scope of use of the embodiments of the present disclosure.
As shown in FIG. 6, the electronic device 500 may include a processing apparatus (for example, a central processing unit, a graphics processing unit, etc.) 501, which can perform various appropriate actions and processing according to a program stored in a read-only memory (ROM) 502 or a program loaded from a storage apparatus 508 into a random access memory (RAM) 503. The RAM 503 also stores various programs and data required for the operation of the electronic device 500. The processing apparatus 501, the ROM 502 and the RAM 503 are connected to one another via a bus 504. An input/output (I/O) interface 505 is also connected to the bus 504.
Generally, the following apparatuses may be connected to the I/O interface 505: an input apparatus 506 including, for example, a touch screen, a touch pad, a keyboard, a mouse, a camera, a microphone, an accelerometer and a gyroscope; an output apparatus 507 including, for example, a liquid crystal display (LCD), a speaker and a vibrator; a storage apparatus 508 including, for example, a magnetic tape and a hard disk; and a communication apparatus 509. The communication apparatus 509 may allow the electronic device 500 to communicate wirelessly or by wire with other devices to exchange data. Although FIG. 6 shows the electronic device 500 with various apparatuses, it should be understood that it is not required to implement or provide all the apparatuses shown; more or fewer apparatuses may alternatively be implemented or provided.
In particular, according to the embodiments of the present disclosure, the process described above with reference to the flowchart may be implemented as a computer software program. For example, an embodiment of the present disclosure includes a computer program product, which includes a computer program carried on a non-transitory computer-readable medium, the computer program containing program code for executing the method shown in the flowchart. In such an embodiment, the computer program may be downloaded and installed from a network via the communication apparatus 509, or installed from the storage apparatus 508, or installed from the ROM 502. When the computer program is executed by the processing apparatus 501, the above functions defined in the scan data processing method of the embodiments of the present disclosure are executed.
It should be noted that the computer-readable medium described above in the present disclosure may be a computer-readable signal medium or a computer-readable storage medium, or any combination of the two. The computer-readable storage medium may be, for example, but is not limited to, an electric, magnetic, optical, electromagnetic, infrared or semiconductor system, apparatus or device, or any combination of the above. More specific examples of the computer-readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the above. In the present disclosure, the computer-readable storage medium may be any tangible medium containing or storing a program that can be used by or in combination with an instruction execution system, apparatus or device. In the present disclosure, the computer-readable signal medium may include a data signal propagated in baseband or as part of a carrier wave, in which computer-readable program code is carried. Such a propagated data signal may take many forms, including but not limited to an electromagnetic signal, an optical signal or any suitable combination of the above. The computer-readable signal medium may also be any computer-readable medium other than a computer-readable storage medium; the computer-readable signal medium may send, propagate or transmit a program for use by or in combination with an instruction execution system, apparatus or device. The program code contained on the computer-readable medium may be transmitted by any appropriate medium, including but not limited to: an electric wire, an optical cable, RF (radio frequency) and the like, or any suitable combination of the above.
In some embodiments, the client and the server may communicate using any currently known or future-developed network protocol such as HTTP (Hyper Text Transfer Protocol), and may be interconnected with digital data communication in any form or medium (for example, a communication network). Examples of communication networks include a local area network ("LAN"), a wide area network ("WAN"), an internetwork (for example, the Internet), a peer-to-peer network (for example, an ad hoc peer-to-peer network), and any currently known or future-developed network.
The above computer-readable medium may be included in the above electronic device, or may exist separately without being assembled into the electronic device.
The above computer-readable medium carries one or more programs which, when executed by the electronic device, cause the electronic device to: acquire multiple frames of images to be processed including auxiliary feature points, where the auxiliary feature points have corresponding distribution true values; perform processing based on the multiple frames of images to be processed to obtain all the three-dimensional coordinate points in the same coordinate system; perform measurement processing on all the three-dimensional coordinate points to obtain the target three-dimensional coordinate points; and determine the target position transformation relationship based on the target three-dimensional coordinate points and the distribution true values.
Computer program code for carrying out the operations of the present disclosure may be written in one or more programming languages or a combination thereof, the above programming languages including but not limited to object-oriented programming languages such as Java, Smalltalk and C++, as well as conventional procedural programming languages such as the "C" language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on a remote computer or server. In the case involving a remote computer, the remote computer may be connected to the user's computer through any kind of network, including a local area network (LAN) or a wide area network (WAN), or may be connected to an external computer (for example, through the Internet using an Internet service provider).
The flowcharts and block diagrams in the accompanying drawings illustrate the possible architectures, functions and operations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowcharts or block diagrams may represent a module, a program segment or a part of code, which contains one or more executable instructions for implementing the specified logical functions. It should also be noted that, in some alternative implementations, the functions noted in the blocks may occur in an order different from that noted in the drawings. For example, two blocks shown in succession may in fact be executed substantially in parallel, and they may sometimes be executed in the reverse order, depending on the functions involved. It should also be noted that each block of the block diagrams and/or flowcharts, and combinations of blocks in the block diagrams and/or flowcharts, may be implemented by a dedicated hardware-based system that performs the specified functions or operations, or by a combination of dedicated hardware and computer instructions.
The units involved and described in the embodiments of the present disclosure may be implemented by software or by hardware, and the name of a unit does not in some cases constitute a limitation on the unit itself.
The functions described herein above may be executed at least in part by one or more hardware logic components. For example, without limitation, exemplary types of hardware logic components that can be used include: field programmable gate arrays (FPGAs), application-specific integrated circuits (ASICs), application-specific standard products (ASSPs), systems on chip (SOCs), complex programmable logic devices (CPLDs), and so on.
In the context of the present disclosure, a machine-readable medium may be a tangible medium that may contain or store a program for use by or in combination with an instruction execution system, apparatus or device. The machine-readable medium may be a machine-readable signal medium or a machine-readable storage medium. The machine-readable medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared or semiconductor system, apparatus or device, or any suitable combination of the above. More specific examples of the machine-readable storage medium would include an electrical connection based on one or more wires, a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the above.
According to one or more embodiments of the present disclosure, the present disclosure provides an electronic device, including:
a processor;
a memory for storing instructions executable by the processor;
the processor being configured to read the executable instructions from the memory and execute the instructions to implement any one of the scan data processing methods provided by the present disclosure.
According to one or more embodiments of the present disclosure, the present disclosure provides a computer-readable storage medium storing a computer program, the computer program being used to execute any one of the scan data processing methods provided by the present disclosure.
It should be noted that, herein, relational terms such as "first" and "second" are used merely to distinguish one entity or operation from another entity or operation, and do not necessarily require or imply any such actual relationship or order between these entities or operations. Moreover, the terms "comprise", "include" or any other variation thereof are intended to cover a non-exclusive inclusion, so that a process, method, article or device including a series of elements includes not only those elements but also other elements not explicitly listed, or elements inherent to such a process, method, article or device. Without further limitation, an element defined by the statement "including a ..." does not exclude the existence of additional identical elements in the process, method, article or device including the element.
The above are merely specific embodiments of the present disclosure, enabling those skilled in the art to understand or implement the present disclosure. Various modifications to these embodiments will be readily apparent to those skilled in the art, and the general principles defined herein may be implemented in other embodiments without departing from the spirit or scope of the present disclosure. Therefore, the present disclosure will not be limited to the embodiments shown herein, but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.
Industrial Applicability
The scan data processing solution disclosed in the present disclosure determines the target position transformation relationship based on the preset distribution true values of the auxiliary feature points and the three-dimensional coordinate points calculated from the scan, so that the relative positions between the scanning rods can be accurately located and the data processing efficiency and precision in the intraoral scanning scenario are improved; it therefore has strong industrial applicability.

Claims (10)

  1. A scan data processing method, comprising:
    acquiring multiple frames of images to be processed including auxiliary feature points, wherein the auxiliary feature points have corresponding distribution true values;
    performing processing based on the multiple frames of images to be processed to obtain three-dimensional coordinate points of the auxiliary feature points in the same coordinate system;
    performing measurement processing on the three-dimensional coordinate points of the auxiliary feature points to obtain target three-dimensional coordinate points;
    determining a target position transformation relationship based on the target three-dimensional coordinate points and the distribution true values.
  2. The scan data processing method according to claim 1, wherein the performing processing based on the multiple frames of images to be processed to obtain all three-dimensional coordinate points in the same coordinate system comprises:
    acquiring two-dimensional coordinate points of the auxiliary feature points in each frame of the images to be processed;
    performing a calculation based on a preset intrinsic parameter matrix and the two-dimensional coordinate points to obtain three-dimensional coordinate points of each frame of the images to be processed;
    stitching the three-dimensional coordinate points of each frame of the images to be processed to obtain all the three-dimensional coordinate points in the same coordinate system.
  3. The scan data processing method according to claim 1, wherein the performing processing based on the multiple frames of images to be processed to obtain all three-dimensional coordinate points in the same coordinate system comprises:
    acquiring two-dimensional coordinate points of the auxiliary feature points in each frame of the images to be processed;
    performing a calculation based on the relative position between the two cameras and the two-dimensional coordinate points to obtain three-dimensional coordinate points of each frame of the images to be processed;
    stitching the three-dimensional coordinate points of each frame of the images to be processed to obtain all the three-dimensional coordinate points in the same coordinate system.
  4. The scan data processing method according to claim 1, wherein the performing measurement processing on all the three-dimensional coordinate points to obtain the target three-dimensional coordinate points comprises:
    projecting each of the three-dimensional coordinate points onto an image coordinate system to obtain a two-dimensional pixel coordinate point;
    when the Euclidean distance between the two-dimensional pixel coordinate point and the two-dimensional coordinate point corresponding to the three-dimensional coordinate point is minimized, taking the three-dimensional coordinate point as the target three-dimensional coordinate point.
  5. The scan data processing method according to claim 1, wherein the determining the target position transformation relationship based on the target three-dimensional coordinate points and the distribution true values comprises:
    performing a calculation based on a preset scaling factor, the target three-dimensional coordinate points and the distribution true values to obtain an initial position transformation relationship;
    performing an optimization calculation on the scaling factor, the target three-dimensional coordinate points, the distribution true values and the initial position transformation relationship based on a preset optimization formula to obtain an optimization value;
    adjusting the scaling factor and the initial position transformation relationship, and taking the initial position transformation relationship corresponding to the case where the optimization value is smaller than a preset threshold as the target position transformation relationship.
  6. The scan data processing method according to claim 1, wherein the acquiring multiple frames of images to be processed including auxiliary feature points comprises:
    controlling a scanning device to rotate in a preset direction while scanning a target oral cavity including scanning rods at a preset frequency to obtain the multiple frames of images to be processed.
  7. A scan data processing apparatus, comprising:
    an image acquisition module, configured to acquire multiple frames of images to be processed including auxiliary feature points, wherein the auxiliary feature points have corresponding distribution true values;
    an image processing module, configured to perform processing based on the multiple frames of images to be processed to obtain all three-dimensional coordinate points in the same coordinate system;
    a measurement processing module, configured to perform measurement processing on all the three-dimensional coordinate points to obtain target three-dimensional coordinate points;
    a determination module, configured to determine a target position transformation relationship based on the target three-dimensional coordinate points and the distribution true values.
  8. The scan data processing apparatus according to claim 7, wherein the image processing module is specifically configured to:
    acquire two-dimensional coordinate points of the auxiliary feature points in each frame of the images to be processed;
    perform a calculation based on a preset intrinsic parameter matrix, the camera pose of the current frame and the two-dimensional coordinate points to obtain three-dimensional coordinate points of each frame of the images to be processed;
    stitch the three-dimensional coordinate points of each frame of the images to be processed to obtain all the three-dimensional coordinate points.
  9. An electronic device, wherein the electronic device comprises:
    a processor;
    a memory for storing instructions executable by the processor;
    the processor being configured to read the executable instructions from the memory and execute the instructions to implement the scan data processing method according to any one of claims 1-6.
  10. A computer-readable storage medium, wherein the storage medium stores a computer program, and the computer program is used to execute the scan data processing method according to any one of claims 1-6.
PCT/CN2023/091806 2022-05-02 2023-04-28 Scan data processing method, apparatus, device and medium WO2023213252A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202210494057.6A CN114869528A (zh) 2022-05-02 2022-05-02 Scan data processing method, apparatus, device and medium
CN202210494057.6 2022-05-02

Publications (1)

Publication Number Publication Date
WO2023213252A1 true WO2023213252A1 (zh) 2023-11-09

Family

ID=82674066

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2023/091806 WO2023213252A1 (zh) 2022-05-02 2023-04-28 扫描数据处理方法、装置、设备及介质

Country Status (2)

Country Link
CN (1) CN114869528A (zh)
WO (1) WO2023213252A1 (zh)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114869528A (zh) * 2022-05-02 2022-08-09 先临三维科技股份有限公司 扫描数据处理方法、装置、设备及介质

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109272454A (zh) * 2018-07-27 2019-01-25 阿里巴巴集团控股有限公司 Coordinate system calibration method and apparatus for an augmented reality device
CN109523578A (zh) * 2018-08-27 2019-03-26 中铁上海工程局集团有限公司 Method for matching a BIM model with point cloud data
DE102019100221A1 (de) * 2019-01-07 2020-07-09 Ralf Schätzle Image recording method and reference body for performing an intraoral recording of an oral situation
DE102019100220A1 (de) * 2019-01-07 2020-07-09 Ralf Schätzle Image recording method and reference body for performing an intraoral recording of an oral situation
CN113483695A (zh) * 2021-07-01 2021-10-08 先临三维科技股份有限公司 Three-dimensional scanning system, auxiliary member, processing method, apparatus, device and medium
CN114869528A (zh) * 2022-05-02 2022-08-09 先临三维科技股份有限公司 Scan data processing method, apparatus, device and medium

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7027642B2 (en) * 2000-04-28 2006-04-11 Orametrix, Inc. Methods for registration of three-dimensional frames to create three-dimensional virtual models of objects
CN208552101U (zh) * 2017-10-13 2019-03-01 深圳市康泰健牙科器材有限公司 Intraoral scanning and positioning device for an implant bridge
EP3629336A1 (en) * 2018-09-28 2020-04-01 Ivoclar Vivadent AG Dental design transfer
CN113491533B (zh) * 2021-07-07 2022-05-03 北京大学口腔医学院 Auxiliary device, auxiliary device assembly and method for acquiring intraoral three-dimensional images
CN215384788U (zh) * 2021-07-16 2022-01-04 朴永浩 Scanning rod for dental implant restoration
CN114343895B (zh) * 2021-12-09 2024-02-13 深圳市菲森科技有限公司 Workflow method for improving the scanning efficiency of implant rods


Also Published As

Publication number Publication date
CN114869528A (zh) 2022-08-09

Similar Documents

Publication Publication Date Title
CN107194962B (zh) 点云与平面图像融合方法及装置
US11504055B2 (en) Systems and methods for analyzing cutaneous conditions
US11557083B2 (en) Photography-based 3D modeling system and method, and automatic 3D modeling apparatus and method
CN110660098B (zh) 基于单目视觉的定位方法和装置
WO2023213254A1 (zh) 一种口内扫描处理方法、系统、电子设备及介质
CN111292420B (zh) 用于构建地图的方法和装置
WO2023213253A1 (zh) 一种扫描数据处理方法、装置、电子设备及介质
WO2023213252A1 (zh) 扫描数据处理方法、装置、设备及介质
WO2024001959A1 (zh) 一种扫描处理方法、装置、电子设备及存储介质
CN114494388B (zh) 一种大视场环境下图像三维重建方法、装置、设备及介质
WO2024104248A1 (zh) 虚拟全景图的渲染方法、装置、设备及存储介质
WO2023241704A1 (zh) 牙齿模型的获取方法、装置、设备及介质
WO2022166868A1 (zh) 漫游视图的生成方法、装置、设备和存储介质
CN114140771A (zh) 一种图像深度数据集自动标注方法及系统
WO2023213255A1 (zh) 扫描装置及其连接方法、装置、电子设备及介质
WO2023193613A1 (zh) 高光渲染方法、装置、介质及电子设备
CN113706692B (zh) 三维图像重构方法、装置、电子设备以及存储介质
JP6625654B2 (ja) 投影装置、投影方法、および、プログラム
CN110349109B (zh) 基于鱼眼畸变校正方法及其系统、电子设备
CN116188583B (zh) 相机位姿信息生成方法、装置、设备和计算机可读介质
WO2024109796A1 (zh) 一种扫描头位姿检测方法、装置、设备及介质
CN114125411B (zh) 投影设备校正方法、装置、存储介质以及投影设备
CN112308809A (zh) 一种图像合成方法、装置、计算机设备及存储介质
CA3102860C (en) Photography-based 3d modeling system and method, and automatic 3d modeling apparatus and method
WO2024051783A1 (zh) 一种三维重建方法、装置、设备和存储介质

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23799238

Country of ref document: EP

Kind code of ref document: A1