WO2024032668A1 - Three-dimensional reconstruction method, device, and system - Google Patents

Three-dimensional reconstruction method, device, and system

Info

Publication number
WO2024032668A1
WO2024032668A1 (PCT/CN2023/112047)
Authority
WO
WIPO (PCT)
Prior art keywords
image
target
target pixel
symbol
pixel point
Prior art date
Application number
PCT/CN2023/112047
Other languages
English (en)
French (fr)
Inventor
陈瀚
赵晓波
张健
黄磊杰
马超
Original Assignee
先临三维科技股份有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 先临三维科技股份有限公司
Publication of WO2024032668A1

Links

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T17/00 Three dimensional [3D] modelling, e.g. data description of 3D objects
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/74 Image or video pattern matching; Proximity measures in feature spaces
    • G06V10/75 Organisation of the matching processes, e.g. simultaneous or sequential comparisons of image or video features; Coarse-fine approaches, e.g. multi-scale approaches; using context analysis; Selection of dictionaries
    • G06V10/751 Comparing pixel values or logical combinations thereof, or feature values having positional relevance, e.g. template matching

Definitions

  • the present application relates to the field of three-dimensional reconstruction, specifically, to a three-dimensional reconstruction method, device, and system.
  • Structured-light 3D reconstruction projects an optical coding pattern onto the surface of the measured object and recovers 3D data of the surface from the captured, deformed pattern. It is efficient and robust to interference, and is widely used in many 3D reconstruction scenarios.
  • The core problem in structured-light 3D reconstruction is the matching of corresponding (homologous) pixels. Different matching strategies rely on different coding methods. Depending on how the projected pattern is encoded, structured-light techniques can be divided into temporal coding and spatial coding. Temporal coding requires projecting a sequence of pattern frames into the measurement scene and usually requires the measured object and the projector to remain relatively stationary, so high-frame-rate scanning cannot be achieved; the applicable scenes are therefore limited, and it is mostly used for static scanning.
  • Spatial coding usually requires projecting only a single pattern into the measured scene to complete 3D reconstruction. In the related art, one approach encodes with circular symbols and decodes using the relative displacement of neighborhood symbols; another obtains 3D data by matching image blocks. In the former, the coding capacity of the symbols is small, only sparse symbol points can be reconstructed, a single frame yields little data, and scanning efficiency is low; in the latter, matching with image blocks has low accuracy.
  • Embodiments of the present application provide a three-dimensional reconstruction method, device, and system to at least solve the technical problem of low accuracy of three-dimensional reconstruction results due to incomplete matching during the symbol matching process.
  • According to one aspect, a three-dimensional reconstruction method is provided, including: acquiring a first image and a second image, where the first image and the second image are obtained by different image acquisition devices capturing a symbol image projected onto the surface of the measured object, the symbol image contains multiple target symbols randomly distributed along a preset direction, and the target symbols are line-segment stripes; acquiring the first target pixel point of each symbol in the first image, and determining, one by one in the second image, the second target pixel points that match the first target pixel points, so as to complete the matching; when a first target pixel point in the first image has not been matched, performing a secondary matching on the unmatched first target pixel point; and determining the three-dimensional coordinates of the first target pixel point at least from the predetermined pixel coordinates of the first target pixel point in the first image and the pixel coordinates of the second target pixel point in the second image, so as to complete the three-dimensional reconstruction.
  • In some embodiments, performing a secondary matching on an unmatched first target pixel point includes: determining the matched first target pixel points within a first preset-area neighborhood of the unmatched first target pixel point; determining the light planes corresponding to the first target pixel points of all symbols within a second preset-area neighborhood of those matched first target pixel points as a first light plane sequence, where the first target pixel points include the midpoint of the line-segment stripe and the feature points of the line-segment stripe; determining the three-dimensional coordinates of the unmatched first target pixel point in every light plane of the first light plane sequence as candidate three-dimensional coordinates; determining the pixel points corresponding to all candidate three-dimensional coordinates in the second image as candidate pixel points; and determining, from the candidate pixel points, the second target pixel point that matches the unmatched first target pixel point.
  • In some embodiments, determining from the candidate pixel points the second target pixel point that matches the unmatched first target pixel point includes: for each candidate pixel point, determining its nearest second target pixel point in the second image, and determining the distance between the candidate pixel point and that nearest second target pixel point as a first distance; determining the candidate pixel point with the smallest first distance as a third target pixel point; and determining the second target pixel point nearest to the third target pixel point as the second target pixel point that matches the unmatched first target pixel point.
  • In some embodiments, the method further includes: determining the three-dimensional coordinates of a matched first target pixel point; substituting those three-dimensional coordinates into the light plane equations corresponding to all symbols in the first image to obtain residual parameters; and determining the light plane whose equation yields the smallest residual parameter as the light plane corresponding to the matched first target pixel point.
  • In some embodiments, the length of a target symbol in the first image is determined as follows: when the multiple target symbols are randomly distributed along the horizontal axis of the symbol image, a first magnification and a second magnification are determined; the ratio of the second magnification to the first magnification multiplied by the minimum target symbol length gives the minimum length of the target symbol in the first image; and a preset fraction of the width of the first image gives the maximum length of the target symbol in the first image.
  • In some embodiments, the predetermined symbol image contains multiple target symbols randomly distributed along a preset direction, and the target symbols are line-segment stripes. Generating the symbol image includes: determining the target area occupied by a target symbol from the length of the target symbol, the width of the target symbol, the spacing between target symbols, and the pixel coordinates of the center pixel of the target symbol, where the pixel coordinates of the center pixel are randomly generated within the area of the symbol image; traversing all pixels in the target area and, if no target symbol exists in the target area, generating a target symbol there, where a target symbol includes at least a line segment of a preset length and the two endpoints corresponding to that line segment; and generating target symbols in all target areas within the symbol image.
  • In some embodiments, the method further includes: determining a first neighborhood symbol set of any symbol in the first image and multiple second neighborhood symbol sets of multiple candidate symbols in the second image; counting, for each second neighborhood symbol set, the number of its neighborhood symbols that match neighborhood symbols in the first neighborhood symbol set; determining the second neighborhood symbol set with the largest match count as the target second neighborhood symbol set; and determining the candidate symbol corresponding to the target second neighborhood symbol set as the symbol that matches that symbol.
  • In some embodiments, the method further includes: determining the target area in which a target symbol is located from the coordinates of its center pixel, its length, and its width, where only one target symbol exists in each target area.
  • According to another aspect, a three-dimensional reconstruction device is provided, including: an acquisition module configured to acquire a first image and a second image, where the first image and the second image are obtained by different image acquisition devices capturing a symbol image projected onto the surface of the measured object; a matching module configured to acquire the first target pixel point of each symbol in the first image and determine, one by one in the second image, the second target pixel points that match the first target pixel points so as to complete the matching, and, when a first target pixel point in the first image has not been matched, to perform a secondary matching on the unmatched first target pixel point; and a reconstruction module configured to determine the three-dimensional coordinates of the first target pixel point at least from the predetermined pixel coordinates of the first target pixel point in the first image and the pixel coordinates of the second target pixel point in the second image, so as to complete the three-dimensional reconstruction.
  • According to yet another aspect, a three-dimensional reconstruction system is provided, applied to the above three-dimensional reconstruction method and including at least two image acquisition devices, a projection device, and a first processor. The projection device is configured to project a predetermined symbol image onto the surface of the measured object; the at least two image acquisition devices are configured to capture the predetermined symbol image from the surface of the measured object to obtain a first image and a second image; and the first processor is configured to acquire the first target pixel point of each symbol in the first image, determine one by one in the second image the second target pixel points that match the first target pixel points so as to complete the matching, perform a secondary matching on any unmatched first target pixel point, and determine the three-dimensional coordinates of the first target pixel point at least from the predetermined pixel coordinates of the first target pixel point in the first image and the pixel coordinates of the second target pixel point in the second image, so as to complete the three-dimensional reconstruction.
  • a non-volatile storage medium includes a stored program, wherein when the program is running, the device where the non-volatile storage medium is located is controlled to execute the above 3D reconstruction method.
  • an electronic device including: a memory and a processor; the processor is configured to run a program, wherein the above three-dimensional reconstruction method is executed when the program is running.
  • In the embodiments of the present application, a first image and a second image are acquired, the two images being obtained by different image acquisition devices capturing a symbol image projected onto the surface of the measured object; the symbol image contains multiple target symbols randomly distributed along a preset direction, and the target symbols are line-segment stripes. The first target pixel point of each symbol in the first image is acquired, and the matching second target pixel points are determined one by one in the second image to complete the matching; when a first target pixel point in the first image has not been matched, a secondary matching is performed on it; and the three-dimensional coordinates of the first target pixel point are determined at least from the predetermined pixel coordinates of the first target pixel point in the first image and the pixel coordinates of the second target pixel point in the second image, completing the three-dimensional reconstruction. By performing a secondary matching on unmatched first target pixel points, all first target pixel points can be matched, achieving complete matching of the first target pixel points in the first image and thereby solving the technical problem of low accuracy of three-dimensional reconstruction results caused by incomplete matching during symbol matching.
  • Figure 1 is a hardware structure block diagram of a computer terminal (or mobile device) used for a three-dimensional reconstruction method according to an embodiment of the present application;
  • Figure 2 is a schematic diagram of a three-dimensional reconstruction method according to the present application.
  • Figure 3a is a schematic diagram of an optional symbol image according to an embodiment of the present application.
  • Figure 3b is a schematic diagram of another optional symbol image according to an embodiment of the present application.
  • Figure 4 is a schematic diagram of five optional symbol shapes according to an embodiment of the present application.
  • Figure 5 is a schematic diagram of an optional line segment stripe according to an embodiment of the present application.
  • Figure 6 is an optional three-dimensional reconstruction system according to an embodiment of the present application.
  • Figure 7 is an optional three-dimensional reconstruction device according to an embodiment of the present application.
  • an embodiment of a three-dimensional reconstruction method is also provided. It should be noted that the steps shown in the flow chart of the accompanying drawings can be executed in a computer system such as a set of computer-executable instructions, and, Although a logical sequence is shown in the flowcharts, in some cases the steps shown or described may be performed in a sequence different from that herein.
  • Figure 1 shows a block diagram of the hardware structure of a computer terminal (or mobile device) used to implement the three-dimensional reconstruction method.
  • The computer terminal 10 may include one or more processors 102 (shown as 102a, 102b, ..., 102n in the figure; the processor 102 may include, but is not limited to, a processing device such as a microprocessor (MCU) or a programmable logic device (FPGA)), a memory 104 for storing data, and a transmission module 106 for communication functions.
  • the computer terminal 10 may also include: a display, an input/output interface (I/O interface), a universal serial bus (USB) port (which may be included as one of the ports of the I/O interface), a network interface, a power supply and/or camera.
  • FIG. 1 is only illustrative, and it does not limit the structure of the above-mentioned electronic device.
  • the computer terminal 10 may also include more or fewer components than shown in FIG. 1 , or have a different configuration than shown in FIG. 1 .
  • the one or more processors 102 and/or other data processing circuitry described above may generally be referred to herein as "data processing circuitry.”
  • the data processing circuit may be embodied in whole or in part as software, hardware, firmware or any other combination.
  • the data processing circuit may be a single independent processing module, or may be fully or partially integrated into any of the other components in the computer terminal 10 (or mobile device).
  • As referred to in the embodiments of the present application, the data processing circuit acts as a kind of processor control (for example, selection of a variable-resistance termination path connected to the interface).
  • The memory 104 can be used to store software programs and modules of application software, such as the program instructions/data storage devices corresponding to the three-dimensional reconstruction method in the embodiments of the present application.
  • The processor 102 executes the software programs and modules stored in the memory 104 to perform various functional applications and data processing, that is, to implement the three-dimensional reconstruction method of the above application.
  • Memory 104 may include high-speed random access memory, and may also include non-volatile memory, such as one or more magnetic storage devices, flash memory, or other non-volatile solid-state memory.
  • the memory 104 may further include memory located remotely relative to the processor 102, and these remote memories may be connected to the computer terminal 10 through a network. Examples of the above-mentioned networks include but are not limited to the Internet, intranets, local area networks, mobile communication networks and combinations thereof.
  • the transmission module 106 is used to receive or send data via a network.
  • Specific examples of the above-mentioned network may include a wireless network provided by a communication provider of the computer terminal 10 .
  • the transmission module 106 includes a network adapter (Network Interface Controller, NIC), which can be connected to other network devices through a base station to communicate with the Internet.
  • the transmission module 106 may be a radio frequency (Radio Frequency, RF) module, which is used to communicate with the Internet wirelessly.
  • the display may be, for example, a touch-screen liquid crystal display (LCD), which may enable a user to interact with the user interface of the computer terminal 10 (or mobile device).
  • an embodiment of a three-dimensional reconstruction method is provided. It should be noted that the steps shown in the flow chart of the accompanying drawings can be executed in a computer system such as a set of computer-executable instructions, and although A logical order is shown in the flowcharts, but in some cases, the steps shown or described may be performed in a different order than herein.
  • Figure 2 is a flow chart of a three-dimensional reconstruction method according to an embodiment of the present application. As shown in Figure 2, the method includes the following steps:
  • Step S202: acquire a first image and a second image, where the first image and the second image are obtained by different image acquisition devices capturing a symbol image projected onto the surface of the measured object, the symbol image contains multiple target symbols randomly distributed along a preset direction, and the target symbols are line-segment stripes.
  • Step S204: acquire the first target pixel point of each symbol in the first image, and determine, one by one in the second image, the second target pixel points that match the first target pixel points, so as to complete the matching; when a first target pixel point in the first image has not been matched, perform a secondary matching on the unmatched first target pixel point.
  • Step S206: determine the three-dimensional coordinates of the first target pixel point at least from the predetermined pixel coordinates of the first target pixel point in the first image and the pixel coordinates of the second target pixel point in the second image, so as to complete the three-dimensional reconstruction.
  • It should be noted that three-dimensional reconstruction means building a mathematical model of a three-dimensional object suitable for computer representation and processing; it is the basis for processing and operating on the object and analyzing its properties in a computer environment, and it is also a virtual-reality technique for expressing objective events in a computer. Three-dimensional reconstruction usually uses structured-light temporal coding and spatial coding. The difficulty of spatially coded structured light lies in using the spatial grayscale information of pixels to encode and decode every pixel stably and reliably.
  • One class of related methods encodes specific pixels with symbol information: using large and small circular symbols, each symbol is encoded and decoded with epipolar constraints and the relative displacement between neighborhood symbols and the symbols in the image, so that each symbol point can be reconstructed.
  • The drawbacks of this class of methods are that the symbol coding capacity is limited, only sparse symbol points can be reconstructed, a single frame yields little data, scanning efficiency is low, and the result is strongly affected by the surface texture of the object.
  • Another class of methods matches corresponding points through the correlation between pixel blocks, using random speckle patterns.
  • In contrast, the method proposed in this application uses line-segment stripe symbols for coding. The line-segment stripes contain multiple feature points and the symbols are randomly distributed, which increases the coding capacity and the amount of data in a single frame, thereby improving scanning efficiency.
  • Because the proposed method distributes line-segment stripes densely, the density of symbols is high, which improves the accuracy of the reconstructed data.
  • The method proposed in this application can be applied to intraoral scanners, facial scanners, industrial scanners, professional scanners, and other scanners, and can realize three-dimensional reconstruction of objects or scenes such as teeth, faces, human bodies, industrial products, industrial equipment, cultural relics, artworks, prostheses, medical instruments, and buildings.
  • the first image is an image collected by the first image acquisition device
  • the second image is an image collected by the second image acquisition device
  • The multiple target symbols in the first image share the same orientation, and the spacing between symbols is random; for example, the spacing between symbol 1 and symbol 2 is 100 μm, and the spacing between symbol 2 and symbol 3 is 80 μm. The symbols fluctuate randomly up and down along the image-height direction. Each symbol contains at least two extractable grayscale feature-point patterns, and the feature points are distributed vertically at a preset spacing. The stripe positions may be randomly distributed in both the image-width and image-height directions, or only in one of them. Figure 3a shows a symbol image whose stripes are randomly distributed in both the image-width and image-height directions, with a fixed stripe length. Figure 3b shows a symbol image whose stripe positions are randomly distributed in the image-height direction, with the stripe length in Figure 3b set randomly.
  • In step S204, the first target pixel point of each symbol in the first image is acquired, and the matching second target pixel points are determined one by one in the second image to complete the matching. Because the first image and the second image are captured by different image acquisition devices from the same pattern projected onto the surface of the measured object, the symbol structure in the first image is the same as that in the second image. For example, when the first target pixel point is the midpoint of a line-segment stripe in the first image, the matching pixel point in the second image is the midpoint of the corresponding line-segment stripe in the second image. There are many reasons why a symbol may fail to match, such as image noise or image blur. The three-dimensional reconstruction method provided in the embodiments of the present application is suitable for secondary matching of unmatched symbol feature points in a symbol-feature-point matching method, and is also suitable for secondary matching of unmatched target pixel points in a target-pixel-point matching method.
  • In step S206, the target symbols in the first image correspond one-to-one to the symbols in the second image. The first image's symbol layout is predetermined, that is, the projection parameters of the symbols in the first image are preset. Based on the one-to-one correspondence between the target symbols and the symbols in the second image, combined with triangulation, the three-dimensional coordinates of the symbols in the second image are determined.
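The step S206 computation is a standard two-view triangulation. Below is a minimal sketch assuming calibrated pinhole cameras with known 3x4 projection matrices `P1` and `P2`; the names and the linear (DLT) formulation are illustrative assumptions, not a quotation of the patent's own procedure.

```python
import numpy as np

def triangulate_point(P1, P2, uv1, uv2):
    """Linear (DLT) triangulation of one matched pixel pair.

    P1, P2 : (3, 4) projection matrices of the two image acquisition devices.
    uv1, uv2 : (u, v) pixel coordinates of the matched first/second target pixel point.
    Returns the 3D point as a length-3 array.
    """
    u1, v1 = uv1
    u2, v2 = uv2
    # Each view contributes two linear constraints on the homogeneous point X.
    A = np.array([
        u1 * P1[2] - P1[0],
        v1 * P1[2] - P1[1],
        u2 * P2[2] - P2[0],
        v2 * P2[2] - P2[1],
    ])
    # The solution is the right singular vector with the smallest singular value.
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]

# Hypothetical usage with two arbitrary calibrated cameras.
P1 = np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = np.hstack([np.eye(3), np.array([[-0.1], [0.0], [0.0]])])
print(triangulate_point(P1, P2, (0.2, 0.1), (0.1, 0.1)))  # -> [0.2, 0.1, 1.0]
```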
  • A secondary matching is performed on the unmatched first target pixel points. One option exploits the fact that, over a small region, the depth of the object surface is approximately continuous. Specifically, for a first target pixel point P in the first image for which no correct match was found, the depths {D_i} of the already-reconstructed first target pixel points in its neighborhood are interpolated to obtain an estimated depth D_s = f(D_i) for P, where f is an interpolation function of the D_i. From the estimated depth D_s, the pixel position Q of the estimated matching point of P in the second image can then be computed, and finally the target pixel point closest to Q within the neighborhood pixels of Q in the second image is determined as the matching pixel point of P.
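As a rough illustration of this depth-continuity completion, the sketch below interpolates an estimated depth for an unmatched point P from its already-reconstructed neighbours and then looks for the nearest stripe point around the predicted position Q in the second image. The inverse-distance weighting, the `project_to_second_image` helper, and the search radius are assumptions made for the example; the text only specifies that a depth D_s = f(D_i) is interpolated and that the point nearest to Q is taken as the match.

```python
import numpy as np

def estimate_depth(p_xy, neighbor_xy, neighbor_depths):
    """Interpolate D_s = f(D_i) from matched neighbours (inverse-distance weights)."""
    d = np.linalg.norm(neighbor_xy - np.asarray(p_xy, dtype=float), axis=1)
    w = 1.0 / np.maximum(d, 1e-6)
    return float(np.sum(w * neighbor_depths) / np.sum(w))

def secondary_match_by_depth(p_xy, neighbor_xy, neighbor_depths,
                             project_to_second_image, second_points, radius=5.0):
    """Return the second-image point matched to the unmatched point P, or None.

    project_to_second_image : callable mapping (pixel in first image, depth) to the
                              predicted pixel Q in the second image (assumed given
                              by the calibrated system geometry).
    second_points           : (M, 2) detected target pixel points in the second image.
    """
    ds = estimate_depth(p_xy, neighbor_xy, neighbor_depths)
    q_xy = project_to_second_image(p_xy, ds)            # predicted pixel position Q
    dists = np.linalg.norm(second_points - q_xy, axis=1)
    best = int(np.argmin(dists))
    return second_points[best] if dists[best] <= radius else None
```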
  • Secondary matching can also use the principle that the order of light planes is consistent. Specifically, the matched first target pixel points within a first preset-area neighborhood of the unmatched first target pixel point are determined; the light planes corresponding to the first target pixel points of all symbols within a second preset-area neighborhood of those matched first target pixel points are determined as the first light plane sequence, where the first target pixel points include the midpoint of the line-segment stripe and the feature points of the line-segment stripe; the three-dimensional coordinates of the unmatched first target pixel point in every light plane of the first light plane sequence are determined as candidate three-dimensional coordinates; the pixel points corresponding to all candidate three-dimensional coordinates in the second image are determined as candidate pixel points; and the second target pixel point matching the unmatched first target pixel point is determined from the candidate pixel points.
  • Specifically, for any unmatched first target pixel point U in the first image, the successfully matched first target pixel points in its neighborhood (the first preset-area neighborhood) are searched, and the light plane sequence {N_k}, k ∈ V, of their neighborhood (the second preset-area neighborhood) is obtained. The current neighborhood light plane sequence is then completed according to the calibrated light plane sequence information, giving the candidate light plane sequence {N_k}, k ∈ M (the first light plane sequence), where M is the sequence range of V after completion. The candidate light planes are traversed, the three-dimensional coordinates {(x, y, z)ᵀ_k}, k ∈ M, of U in each light plane are computed, the corresponding candidate pixel points in the second image are determined from each set of coordinates, and the second target pixel point matching U is selected from them.
  • It should be noted that projecting the stripe structured-light pattern onto the surface of the measured object through the projection module can be viewed as each stripe in the pattern producing a light plane in space; each light plane intersects the object surface and forms a deformed stripe on it. Because the light plane information can be calibrated in advance, each stripe can be represented by a light plane index and light plane parameters. Moreover, the light plane index of each stripe in the pattern together with the light plane indices of its neighboring stripes forms a known sequence {N_i}, i ∈ V, where V denotes the light planes of the neighboring stripes and N_i denotes a light plane index.
  • For a first target pixel point whose three-dimensional coordinates have already been obtained, substituting its coordinates into any light plane equation f(x, y, z) = ax + by + cz + d yields a residual parameter. This residual represents the distance from the point to the light plane: the smaller the residual, the more likely the point belongs to that light plane. In other words, when the three-dimensional coordinates of the first target pixel points are substituted into a target light plane equation, the first target pixel point with the smallest residual parameter is determined to belong to the target light plane.
  • Before the first light plane sequence is determined, the three-dimensional coordinates of each matched first target pixel point are determined and substituted into the light plane equations corresponding to all symbols in the first image to obtain residual parameters; the light plane whose equation yields the smallest residual parameter is determined as the light plane corresponding to that matched first target pixel point. Specifically, all light planes are traversed to find the light plane with the smallest residual; that light plane is the one corresponding to the current first target pixel point, and its index is recorded. The light plane indices of all reconstructed stripe center points are marked in turn in this way.
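A minimal sketch of this light-plane labelling step follows, assuming each calibrated light plane is stored as coefficients (a, b, c, d) of f(x, y, z) = ax + by + cz + d; normalising by the plane normal, so that the residual is a true point-to-plane distance, is an assumption consistent with the text's description of the residual as a distance.

```python
import numpy as np

def assign_light_plane(point_xyz, planes):
    """Return (best_plane_index, residual) for a reconstructed stripe point.

    planes : (N, 4) array, row k holding (a, b, c, d) of light plane k.
    The residual |ax + by + cz + d| / ||(a, b, c)|| is the point-to-plane distance;
    the plane with the smallest residual is taken as the plane the point belongs to.
    """
    p = np.asarray(point_xyz, dtype=float)
    normals = planes[:, :3]
    residuals = np.abs(normals @ p + planes[:, 3]) / np.linalg.norm(normals, axis=1)
    k = int(np.argmin(residuals))
    return k, float(residuals[k])

# Hypothetical usage: two planes, the point lies (almost) on plane 1.
planes = np.array([[1.0, 0.0, 0.0, -10.0],
                   [0.0, 1.0, 0.0,  -2.0]])
print(assign_light_plane([5.0, 2.01, 30.0], planes))   # -> (1, ~0.01)
```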
  • In some embodiments, a method is provided for determining, from the candidate pixel points, the second target pixel point that matches an unmatched first target pixel point: for each candidate pixel point, determine its nearest second target pixel point in the second image and take the distance between them as the first distance; determine the candidate pixel point with the smallest first distance as the third target pixel point; and determine the second target pixel point nearest to the third target pixel point as the second target pixel point that matches the unmatched first target pixel point.
  • Specifically, the second target pixel point nearest to each candidate pixel point is searched in the second image. Taking the second target pixel points to be the midpoints of the line-segment stripes in the second image as an example, the nearest stripe midpoint to each candidate pixel point is found, the first distance between each candidate pixel point and its nearest stripe midpoint is computed, and the second target pixel point associated with the smallest first distance is determined as the matching pixel point of the unmatched first target pixel point.
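The candidate-selection rule can be sketched as below; it assumes the candidate pixel positions (one per candidate light plane) and the detected stripe midpoints of the second image are already available as 2D arrays, and it uses a brute-force nearest-neighbour search purely for illustration.

```python
import numpy as np

def select_match_from_candidates(candidate_px, second_targets):
    """Pick the second target pixel matched to an unmatched first target pixel.

    candidate_px   : (K, 2) candidate pixel positions in the second image, one per
                     candidate light plane.
    second_targets : (M, 2) second target pixel points (e.g. stripe midpoints).
    For each candidate, find its nearest second target point (the first distance);
    the candidate with the smallest first distance is the 'third target pixel point',
    and its nearest second target point is returned as the match.
    """
    diff = candidate_px[:, None, :] - second_targets[None, :, :]
    dist = np.linalg.norm(diff, axis=2)                # (K, M) pairwise distances
    nearest_idx = dist.argmin(axis=1)                  # nearest target per candidate
    first_dist = dist[np.arange(len(candidate_px)), nearest_idx]
    best_candidate = int(first_dist.argmin())
    return second_targets[nearest_idx[best_candidate]]
```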
  • After the matching second target pixel point of a first target pixel point has been determined, the first target pixel points can also be used as feature points to determine the first neighborhood symbol set of any symbol in the first image and multiple second neighborhood symbol sets of multiple candidate symbols in the second image; the number of neighborhood symbols in each second neighborhood symbol set that match the neighborhood symbols in the first neighborhood symbol set is counted, the second neighborhood symbol set with the largest match count is determined as the target second neighborhood symbol set, and the candidate symbol corresponding to the target second neighborhood symbol set is determined as the symbol matching that symbol.
  • Specifically, the first neighborhood symbol set is the neighborhood symbol set of a symbol; for example, if the symbol p is a line segment of a set length together with the two endpoints of that segment, the neighborhood symbol set of symbol p is {p1, p2, p3, p4}. A second neighborhood symbol set is the neighborhood symbol set of a candidate symbol of that symbol; for example, if a candidate symbol of symbol p is symbol q, the neighborhood symbol set of symbol q is {q1, q2, q3, q4}. When symbol p has three candidate symbols, for example q_i, i = 1, 2, 3, there are three second neighborhood symbol sets; the case in which there are multiple symbols p is similar and is not repeated here.
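A compact sketch of this neighbourhood vote follows; representing a neighbourhood symbol set as a plain Python set of symbol labels is an assumption made only to keep the example short.

```python
def match_by_neighborhood_vote(first_neighbors, candidate_neighbor_sets):
    """Pick the candidate symbol whose neighbourhood agrees best with the query symbol.

    first_neighbors         : set of neighbourhood symbol labels of the query symbol,
                              e.g. {"p1", "p2", "p3", "p4"}.
    candidate_neighbor_sets : dict mapping each candidate symbol to its neighbourhood
                              symbol set; labels that match the query's neighbours are
                              assumed to be shared between the two sets.
    Returns the candidate with the largest number of matching neighbourhood symbols.
    """
    scores = {cand: len(first_neighbors & neighbors)
              for cand, neighbors in candidate_neighbor_sets.items()}
    return max(scores, key=scores.get)

# Hypothetical usage: candidate "q_a" shares 3 neighbours, "q_b" shares 1.
print(match_by_neighborhood_vote(
    {"n1", "n2", "n3", "n4"},
    {"q_a": {"n1", "n2", "n3", "x"}, "q_b": {"n4", "y", "z"}}))   # -> "q_a"
```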
  • In some embodiments, the length of the line-segment stripe corresponding to a target symbol in the first image is determined as follows: when the multiple target symbols are randomly distributed along the horizontal axis of the symbol image, a first magnification and a second magnification are determined; the ratio of the second magnification to the first magnification multiplied by the minimum target symbol length gives the minimum length of the target symbol in the first image; and a preset fraction of the width of the first image gives the maximum length of the target symbol in the first image.
  • Specifically, once the projection module and the image acquisition module of the reconstruction system are fixed, the magnification of the projection device and the magnification of the image acquisition module are intrinsic system parameters. The unit pixel length l_p of the symbol image and the unit pixel length l_c of the first image are related through these magnifications. Since the minimum unit-pixel length of the symbol image that the projection device can project is l_min, the minimum stripe length L_min in the first image can be determined from this relation. To preserve the randomness of a projected pattern of width W and height H, the maximum stripe length L_max in the pattern does not exceed H/2, so each stripe length satisfies L_i ∈ [L_min, L_max].
  • The stripe length L in the projection pattern can be a fixed value or a random value. If it is random, the length of each stripe can be determined by a pseudo-random sequence {L_i}, with values in the range L_i ∈ [L_min, L_max].
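Written out with M_1 and M_2 denoting the first and second magnifications and l_min the minimum projectable unit-pixel length (the symbols are this editor's notation; the relation itself follows the rule stated above), the length bounds read:

```latex
L_{\min} = \frac{M_2}{M_1}\, l_{\min}, \qquad
L_{\max} \le \frac{H}{2}, \qquad
L_i \in [L_{\min},\, L_{\max}]
```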
  • In some embodiments, the symbol image is generated as follows: the target area occupied by a target symbol is determined from the length of the target symbol, the width of the target symbol, the spacing between target symbols, and the pixel coordinates of the center pixel of the target symbol, where the center pixel coordinates are randomly generated within the area of the symbol image; all pixels in the target area are traversed and, if no target symbol exists in the target area, a target symbol is generated there, where a target symbol includes at least a line segment of a preset length and the two endpoints corresponding to that line segment; target symbols are generated in all target areas within the symbol image. Figure 4 shows five example target symbols.
  • As shown in Figure 4, the symbol numbered 1 consists of a line segment and two circular endpoints corresponding to the segment; the symbol numbered 2 consists of a line segment of a first preset length and two line segments of a second preset length serving as endpoints, the first preset length being greater than the second preset length; the symbol numbered 3 consists of one line segment of the first preset length and three line segments of the second preset length serving as endpoints; the symbol numbered 4 consists of a line segment and its own two endpoints; and the symbol numbered 5 consists of one line segment intersecting another line segment.
  • the target area where the target symbol is located is determined based on the coordinates of the center pixel of the target symbol, the length of the target symbol, and the width of the target symbol. There is only one target symbol in the target area.
  • Figure 5 is a schematic diagram of the distribution of a first line-segment stripe 201 within an image speckle region: the stripe length is L, the stripe width is S, and the stripe spacing is G/2, so the area occupied by the first line-segment stripe 201 is (S + G) × (L + G). An image speckle region may contain exactly one stripe or no stripe at all.
  • During generation, a coordinate position (u, v) is randomly generated as the center of a candidate image speckle region, and each pixel of the image to be filled that falls in that region is traversed to check whether the candidate region already contains a stripe. If it contains no stripe, a stripe is generated at the candidate position; otherwise no stripe is generated. The next random coordinate generation, region check, and stripe generation then follow. This process is repeated until no more stripes can be placed in the image to be filled.
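The placement loop can be sketched as follows; the grid-based occupancy check and the fixed number of rejection attempts used as a stopping rule are implementation assumptions (the text only says the process repeats until no more stripes can be placed).

```python
import random
import numpy as np

def generate_stripe_pattern(width, height, L, S, G, max_failures=2000, seed=0):
    """Randomly place stripe regions so that each (S+G) x (L+G) candidate region
    contains at most one stripe. Returns the stripe centers and an occupancy map."""
    rng = random.Random(seed)
    occupied = np.zeros((height, width), dtype=bool)   # pixels claimed by a region
    centers, failures = [], 0
    half_w, half_h = (S + G) // 2, (L + G) // 2
    while failures < max_failures:
        u = rng.randrange(half_w, width - half_w)      # random candidate center
        v = rng.randrange(half_h, height - half_h)
        region = occupied[v - half_h:v + half_h, u - half_w:u + half_w]
        if region.any():                               # region already holds a stripe
            failures += 1
            continue
        region[:] = True                               # claim the region, keep the stripe
        centers.append((u, v))
        failures = 0
    return centers, occupied

centers, _ = generate_stripe_pattern(640, 480, L=40, S=3, G=8)
print(len(centers), "stripes placed")
```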
  • In this way, the application achieves fast and accurate reconstruction of the three-dimensional data of the measured object's surface.
  • The stripe-centerline reconstruction used in this application yields accurate three-dimensional data, and the random-stripe scheme increases the number of coding points (every pixel on a stripe centerline is a coding point), improves data redundancy, and thereby improves scanning efficiency.
  • An embodiment of the present application also provides a three-dimensional reconstruction system, as shown in Figure 6, including: at least two image acquisition devices 603, a projection device 604, and a first processor 602. The projection device 604 is used to project a predetermined symbol image onto the surface of the measured object 601; the at least two image acquisition devices 603 are used to capture the predetermined symbol image from the surface of the measured object 601 to obtain a first image and a second image; and the first processor 602 is used to acquire the first target pixel point of each symbol in the first image, determine one by one in the second image the second target pixel points that match the first target pixel points so as to complete the matching, perform a secondary matching on any unmatched first target pixel point, and determine the three-dimensional coordinates of the first target pixel points based on predetermined image parameters of the first image and the second image, so as to complete the three-dimensional reconstruction.
  • The image acquisition device 603 includes, but is not limited to, a grayscale camera and a color camera. The projection method of the projection device 604 includes, but is not limited to, DLP (Digital Light Processing), MASK (mask projection), DOE (diffractive projection), and other projection methods capable of projecting structured-light patterns.
  • An embodiment of the present application also provides a three-dimensional reconstruction device, as shown in Figure 7, including: an acquisition module 70, configured to acquire a first image and a second image, where the first image and the second image are obtained by different image acquisition devices capturing a symbol image projected onto the surface of the measured object, the symbol image contains multiple target symbols randomly distributed along a preset direction, and the target symbols are line-segment stripes; a matching module 72, configured to acquire the first target pixel point of each symbol in the first image, determine one by one in the second image the second target pixel points that match the first target pixel points so as to complete the matching, and, when a first target pixel point in the first image has not been matched, perform a secondary matching on the unmatched first target pixel point; and a reconstruction module, configured to determine the three-dimensional coordinates of the first target pixel points based on predetermined image parameters of the first image and the second image, so as to complete the three-dimensional reconstruction.
  • In some embodiments, the matching module 72 includes a first determination sub-module configured to: determine the matched first target pixel points within a first preset-area neighborhood of an unmatched first target pixel point; determine the light planes corresponding to the first target pixel points of all symbols within a second preset-area neighborhood of those matched first target pixel points as the first light plane sequence, where the first target pixel points include the midpoint of the line-segment stripe and the feature points of the line-segment stripe; determine the three-dimensional coordinates of the unmatched first target pixel point in every light plane of the first light plane sequence as candidate three-dimensional coordinates; and determine the pixel points corresponding to all candidate three-dimensional coordinates in the second image as candidate pixel points, from which the second target pixel point matching the unmatched first target pixel point is determined.
  • In some embodiments, the first determination sub-module includes a first determination unit and a second determination unit. The first determination unit is configured to determine, for each candidate pixel point, its nearest second target pixel point in the second image and the distance between them as a first distance; to determine the candidate pixel point with the smallest first distance as the third target pixel point; and to determine the second target pixel point nearest to the third target pixel point as the second target pixel point matching the unmatched first target pixel point. The second determination unit is configured to determine the three-dimensional coordinates of a matched first target pixel point, substitute them into the light plane equations corresponding to all symbols in the first image to obtain residual parameters, and determine the light plane whose equation yields the smallest residual parameter as the light plane corresponding to the matched first target pixel point.
  • In some embodiments, the acquisition module 70 includes a second determination sub-module configured to: when multiple target symbols are randomly distributed along the horizontal axis of the symbol image, determine a first magnification and a second magnification; determine the ratio of the second magnification to the first magnification multiplied by the minimum target symbol length as the minimum length of the target symbol in the first image; and determine a preset fraction of the width of the first image as the maximum length of the target symbol in the first image. The second determination sub-module includes a third determination unit configured to determine the target area in which a target symbol is located from the coordinates of its center pixel, its length, and its width, where only one target symbol exists in each target area.
  • a non-volatile storage medium including a stored program, wherein when the program is running, the device where the non-volatile storage medium is located is controlled to perform the above three-dimensional reconstruction method.
  • a processor is also provided, and the processor is configured to run a program, wherein the above three-dimensional reconstruction method is executed when the program is running.
  • The above processor is configured to run a program that performs the following functions: acquiring a first image and a second image, where the first image and the second image are obtained by different image acquisition devices capturing a symbol image projected onto the surface of the measured object, the symbol image contains multiple target symbols randomly distributed along a preset direction, and the target symbols are line-segment stripes; acquiring the first target pixel point of each symbol in the first image, and determining, one by one in the second image, the second target pixel points that match the first target pixel points so as to complete the matching; when a first target pixel point in the first image has not been matched, performing a secondary matching on the unmatched first target pixel point; and determining the three-dimensional coordinates of the first target pixel points according to predetermined image parameters of the first image and the second image, so as to complete the three-dimensional reconstruction.
  • By executing the above three-dimensional reconstruction method, the processor matches all first target pixel points through secondary matching of the unmatched ones, achieving complete matching of the first target pixel points in the first image and thereby solving the technical problem of low accuracy of three-dimensional reconstruction results caused by incomplete matching during symbol matching.
  • the disclosed technical content can be implemented in other ways.
  • the device embodiments described above are only illustrative.
  • The division into units may be a division by logical function; in actual implementation there may be other ways of dividing, for example, multiple units or components may be combined or integrated into another system, or some features may be ignored or not implemented.
  • The couplings, direct couplings, or communication connections shown or discussed may be implemented through certain interfaces, and the indirect couplings or communication connections between units or modules may be electrical or take other forms.
  • Units described as separate components may or may not be physically separate, and components shown as units may or may not be physical units; they may be located in one place or distributed over multiple units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
  • each functional unit in each embodiment of the present application can be integrated into one processing unit, each unit can exist physically alone, or two or more units can be integrated into one unit.
  • the above integrated units can be implemented in the form of hardware or software functional units.
  • If the integrated units are implemented in the form of software functional units and sold or used as independent products, they may be stored in a computer-readable storage medium.
  • Based on this understanding, the technical solution of the present application, in essence or in the part that contributes to the prior art, or all or part of the technical solution, may be embodied in the form of a software product. The computer software product is stored in a storage medium and includes several instructions that cause a computer device (which may be a personal computer, a server, a network device, or the like) to execute all or part of the steps of the methods of the various embodiments of the present application.
  • The aforementioned storage media include media that can store program code, such as a USB flash drive, read-only memory (ROM), random access memory (RAM), a removable hard disk, a magnetic disk, or an optical disc.
  • The technical solutions provided by the embodiments of the present application can be applied to the field of three-dimensional reconstruction. A first image and a second image are acquired, obtained by different image acquisition devices capturing a symbol image projected onto the surface of the measured object; the symbol image contains multiple target symbols randomly distributed along a preset direction, and the target symbols are line-segment stripes. The first target pixel point of each symbol in the first image is acquired, and the matching second target pixel points are determined one by one in the second image to complete the matching; when a first target pixel point in the first image has not been matched, a secondary matching is performed on it; and the three-dimensional coordinates of the first target pixel point are determined at least from the predetermined pixel coordinates of the first target pixel point in the first image and the pixel coordinates of the second target pixel point in the second image, so as to complete the three-dimensional reconstruction.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Software Systems (AREA)
  • General Physics & Mathematics (AREA)
  • Health & Medical Sciences (AREA)
  • Geometry (AREA)
  • Computer Graphics (AREA)
  • Artificial Intelligence (AREA)
  • Computing Systems (AREA)
  • Databases & Information Systems (AREA)
  • Evolutionary Computation (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Multimedia (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

This application discloses a three-dimensional reconstruction method, device, and system. The method includes: acquiring a first image and a second image, the first image and the second image being obtained by different image acquisition devices capturing a symbol image projected onto the surface of the measured object, where the symbol image contains multiple target symbols randomly distributed along a preset direction and the target symbols are line-segment stripes; acquiring the first target pixel point of each symbol in the first image, and determining, one by one in the second image, the second target pixel points that match the first target pixel points, so as to complete the matching; when a first target pixel point in the first image has not been matched, performing a secondary matching on the unmatched first target pixel point; and determining the three-dimensional coordinates of the first target pixel point at least from the predetermined pixel coordinates of the first target pixel point in the first image and the pixel coordinates of the second target pixel point in the second image, so as to complete the three-dimensional reconstruction.

Description

Three-dimensional reconstruction method, device, and system
Cross-Reference
This application claims priority to the Chinese patent application filed with the China National Intellectual Property Administration on August 10, 2022, with application number 202210956344.4 and the title "三维重建方法及装置、系统" (Three-dimensional reconstruction method, device, and system), the entire contents of which are incorporated herein by reference.
Technical Field
This application relates to the field of three-dimensional reconstruction, and in particular to a three-dimensional reconstruction method, device, and system.
Background
Structured-light 3D reconstruction is a 3D reconstruction technique that projects an optical coding pattern onto the surface of a measured object and recovers 3D data of the object surface from the captured deformed pattern. It is efficient and robust to interference, and is widely used in many 3D reconstruction scenarios. The core problem in structured-light 3D reconstruction is the matching of corresponding pixels. Different matching strategies rely on different coding methods. Depending on how the projected pattern is encoded, structured-light techniques can be divided into temporal coding and spatial coding. Temporal coding requires projecting a sequence of pattern frames into the measurement scene and usually requires the measured object and the projector to remain relatively stationary, so high-frame-rate scanning cannot be achieved; the applicable scenes are therefore limited, and it is mostly used for static scanning. Spatial coding usually requires projecting only a single pattern into the measured scene to complete 3D reconstruction. In the related art, one approach encodes with circular symbols and decodes using the relative displacement of neighborhood symbols; another obtains 3D data by matching image blocks. In the former, the coding capacity of the symbols is small, only sparse symbol points can be reconstructed, a single frame yields little data, and scanning efficiency is low; in the latter, matching with image blocks has low accuracy.
No effective solution to the above problems has yet been proposed.
Summary
Embodiments of the present application provide a three-dimensional reconstruction method, device, and system, so as to at least solve the technical problem of low accuracy of three-dimensional reconstruction results caused by incomplete matching during symbol matching.
According to one aspect of the embodiments of the present application, a three-dimensional reconstruction method is provided, including: acquiring a first image and a second image, the first image and the second image being obtained by different image acquisition devices capturing a symbol image projected onto the surface of the measured object, where the symbol image contains multiple target symbols randomly distributed along a preset direction and the target symbols are line-segment stripes; acquiring the first target pixel point of each symbol in the first image, and determining, one by one in the second image, the second target pixel points that match the first target pixel points, so as to complete the matching; when a first target pixel point in the first image has not been matched, performing a secondary matching on the unmatched first target pixel point; and determining the three-dimensional coordinates of the first target pixel point at least from the predetermined pixel coordinates of the first target pixel point in the first image and the pixel coordinates of the second target pixel point in the second image, so as to complete the three-dimensional reconstruction.
In some embodiments of the present application, performing a secondary matching on an unmatched first target pixel point includes: determining the matched first target pixel points within a first preset-area neighborhood of the unmatched first target pixel point; determining the light planes corresponding to the first target pixel points of all symbols within a second preset-area neighborhood of those matched first target pixel points as a first light plane sequence, where the first target pixel points include the midpoint of the line-segment stripe and the feature points of the line-segment stripe; determining the three-dimensional coordinates of the unmatched first target pixel point in every light plane of the first light plane sequence as candidate three-dimensional coordinates; and determining the pixel points corresponding to all candidate three-dimensional coordinates in the second image as candidate pixel points, and determining, from the candidate pixel points, the second target pixel point that matches the unmatched first target pixel point.
In some embodiments of the present application, determining from the candidate pixel points the second target pixel point that matches the unmatched first target pixel point includes: determining, for each candidate pixel point, its nearest second target pixel point in the second image and taking the distance between them as a first distance; determining the candidate pixel point with the smallest first distance as a third target pixel point; and determining the second target pixel point nearest to the third target pixel point as the second target pixel point that matches the unmatched first target pixel point.
In some embodiments of the present application, the method further includes: determining the three-dimensional coordinates of a matched first target pixel point; substituting those three-dimensional coordinates into the light plane equations corresponding to all symbols in the first image to obtain residual parameters; and determining the light plane whose equation yields the smallest residual parameter as the light plane corresponding to the matched first target pixel point.
In some embodiments of the present application, the length of a target symbol in the first image is determined as follows: when the multiple target symbols are randomly distributed along the horizontal axis of the symbol image, determining a first magnification and a second magnification; determining the ratio of the second magnification to the first magnification multiplied by the minimum target symbol length as the minimum length of the target symbol in the first image; and determining a preset fraction of the width of the first image as the maximum length of the target symbol in the first image.
In some embodiments of the present application, the predetermined symbol image contains multiple target symbols randomly distributed along a preset direction and the target symbols are line-segment stripes, which includes: determining the target area in which a target symbol is located from the length of the target symbol, the width of the target symbol, the spacing between target symbols, and the pixel coordinates of the center pixel of the target symbol, where the pixel coordinates of the center pixel are randomly generated within the area of the symbol image; traversing all pixels in the target area and, when no target symbol exists in the target area, generating a target symbol there, where a target symbol includes at least a line segment of a preset length and the two endpoints corresponding to that line segment; and generating target symbols in all target areas within the symbol image.
In some embodiments of the present application, the method further includes: determining a first neighborhood symbol set of any symbol in the first image and multiple second neighborhood symbol sets of multiple candidate symbols in the second image; determining the number of neighborhood symbols in each second neighborhood symbol set that match the neighborhood symbols in the first neighborhood symbol set, and determining the second neighborhood symbol set with the largest match count as the target second neighborhood symbol set; and determining the candidate symbol corresponding to the target second neighborhood symbol set as the symbol that matches that symbol.
In some embodiments of the present application, the method further includes: determining the target area in which a target symbol is located from the coordinates of its center pixel, its length, and its width, where only one target symbol exists in each target area.
According to another aspect of the embodiments of the present application, a three-dimensional reconstruction device is also provided, including: an acquisition module configured to acquire a first image and a second image, the first image and the second image being obtained by different image acquisition devices capturing a symbol image projected onto the surface of the measured object, where the symbol image contains multiple target symbols randomly distributed along a preset direction and the target symbols are line-segment stripes; a matching module configured to acquire the first target pixel point of each symbol in the first image, determine one by one in the second image the second target pixel points that match the first target pixel points so as to complete the matching, and, when a first target pixel point in the first image has not been matched, perform a secondary matching on the unmatched first target pixel point; and a reconstruction module configured to determine the three-dimensional coordinates of the first target pixel point at least from the predetermined pixel coordinates of the first target pixel point in the first image and the pixel coordinates of the second target pixel point in the second image, so as to complete the three-dimensional reconstruction.
According to yet another aspect of the embodiments of the present application, a three-dimensional reconstruction system is also provided, applied to the three-dimensional reconstruction method and including: at least two image acquisition devices, a projection device, and a first processor. The projection device is configured to project a predetermined symbol image onto the surface of the measured object; the at least two image acquisition devices are configured to capture the predetermined symbol image from the surface of the measured object to obtain a first image and a second image; and the first processor is configured to acquire the first target pixel point of each symbol in the first image, determine one by one in the second image the second target pixel points that match the first target pixel points so as to complete the matching, perform a secondary matching on any unmatched first target pixel point, and determine the three-dimensional coordinates of the first target pixel point at least from the predetermined pixel coordinates of the first target pixel point in the first image and the pixel coordinates of the second target pixel point in the second image, so as to complete the three-dimensional reconstruction.
According to yet another aspect of the embodiments of the present application, a non-volatile storage medium is also provided, the non-volatile storage medium including a stored program, where, when the program runs, the device on which the non-volatile storage medium is located is controlled to execute the above three-dimensional reconstruction method.
According to yet another aspect of the embodiments of the present application, an electronic device is also provided, including a memory and a processor, the processor being configured to run a program, where the above three-dimensional reconstruction method is executed when the program runs.
In the embodiments of the present application, a first image and a second image are acquired, the two images being obtained by different image acquisition devices capturing a symbol image projected onto the surface of the measured object, where the symbol image contains multiple target symbols randomly distributed along a preset direction and the target symbols are line-segment stripes; the first target pixel point of each symbol in the first image is acquired, and the matching second target pixel points are determined one by one in the second image to complete the matching; when a first target pixel point in the first image has not been matched, a secondary matching is performed on the unmatched first target pixel point; and the three-dimensional coordinates of the first target pixel point are determined at least from the predetermined pixel coordinates of the first target pixel point in the first image and the pixel coordinates of the second target pixel point in the second image, so as to complete the three-dimensional reconstruction. By performing a secondary matching on unmatched first target pixel points, all first target pixel points can be matched, achieving complete matching of the first target pixel points in the first image and thereby solving the technical problem of low accuracy of three-dimensional reconstruction results caused by incomplete matching during symbol matching.
Brief Description of the Drawings
The drawings described here are provided for a further understanding of the present application and form a part of it; the illustrative embodiments of the present application and their descriptions are used to explain the application and do not constitute an improper limitation of it. In the drawings:
Figure 1 is a block diagram of the hardware structure of a computer terminal (or mobile device) used for a three-dimensional reconstruction method according to an embodiment of the present application;
Figure 2 is a schematic diagram of a three-dimensional reconstruction method according to the present application;
Figure 3a is a schematic diagram of an optional symbol image according to an embodiment of the present application;
Figure 3b is a schematic diagram of another optional symbol image according to an embodiment of the present application;
Figure 4 is a schematic diagram of five optional symbol shapes according to an embodiment of the present application;
Figure 5 is a schematic diagram of an optional line-segment stripe according to an embodiment of the present application;
Figure 6 shows an optional three-dimensional reconstruction system according to an embodiment of the present application;
Figure 7 shows an optional three-dimensional reconstruction device according to an embodiment of the present application.
The above drawings include the following reference numerals:
601, measured object; 602, first processor; 603, image acquisition device; 604, projection device; 201, first line-segment stripe.
Detailed Description
To enable those skilled in the art to better understand the solution of the present application, the technical solutions in the embodiments of the present application are described clearly and completely below with reference to the drawings in the embodiments. Obviously, the described embodiments are only some of the embodiments of the present application, not all of them. Based on the embodiments in the present application, all other embodiments obtained by a person of ordinary skill in the art without creative effort shall fall within the scope of protection of the present application.
It should be noted that the terms "first", "second", and the like in the description, the claims, and the above drawings of the present application are used to distinguish similar objects and are not necessarily used to describe a specific order or sequence. It should be understood that data used in this way are interchangeable where appropriate, so that the embodiments of the present application described here can be implemented in an order other than those illustrated or described here. In addition, the terms "include" and "have" and any variations thereof are intended to cover non-exclusive inclusion; for example, a process, method, system, product, or device that includes a series of steps or units is not necessarily limited to the steps or units explicitly listed, but may include other steps or units that are not explicitly listed or that are inherent to the process, method, product, or device.
According to an embodiment of the present application, an embodiment of a three-dimensional reconstruction method is also provided. It should be noted that the steps shown in the flowcharts of the drawings can be executed in a computer system such as a set of computer-executable instructions, and, although a logical order is shown in the flowcharts, in some cases the steps shown or described may be performed in an order different from the one here.
The method embodiments provided by the embodiments of the present application may be executed in a mobile terminal, a computer terminal, a cloud server, or a similar computing device. Figure 1 shows a block diagram of the hardware structure of a computer terminal (or mobile device) for implementing the three-dimensional reconstruction method. As shown in Figure 1, the computer terminal 10 (or mobile device 10) may include one or more processors 102 (shown as 102a, 102b, ..., 102n in the figure; the processor 102 may include, but is not limited to, a processing device such as a microprocessor (MCU) or a programmable logic device (FPGA)), a memory 104 for storing data, and a transmission module 106 for communication functions. In addition, it may also include: a display, an input/output interface (I/O interface), a universal serial bus (USB) port (which may be included as one of the ports of the I/O interface), a network interface, a power supply, and/or a camera. A person of ordinary skill in the art can understand that the structure shown in Figure 1 is only illustrative and does not limit the structure of the above electronic device. For example, the computer terminal 10 may also include more or fewer components than shown in Figure 1, or have a configuration different from that shown in Figure 1.
It should be noted that the above one or more processors 102 and/or other data processing circuits may generally be referred to herein as "data processing circuits". The data processing circuit may be embodied in whole or in part as software, hardware, firmware, or any other combination. In addition, the data processing circuit may be a single independent processing module, or may be fully or partially integrated into any one of the other elements in the computer terminal 10 (or mobile device). As referred to in the embodiments of the present application, the data processing circuit acts as a kind of processor control (for example, selection of a variable-resistance termination path connected to the interface).
The memory 104 may be used to store software programs and modules of application software, such as the program instructions/data storage devices corresponding to the three-dimensional reconstruction method in the embodiments of the present application. The processor 102 executes the software programs and modules stored in the memory 104 to perform various functional applications and data processing, that is, to implement the three-dimensional reconstruction method of the above application. The memory 104 may include high-speed random access memory, and may also include non-volatile memory, such as one or more magnetic storage devices, flash memory, or other non-volatile solid-state memory. In some examples, the memory 104 may further include memory located remotely with respect to the processor 102, and these remote memories may be connected to the computer terminal 10 through a network. Examples of the above network include, but are not limited to, the Internet, an intranet, a local area network, a mobile communication network, and combinations thereof.
The transmission module 106 is used to receive or send data via a network. A specific example of the above network may include a wireless network provided by the communication provider of the computer terminal 10. In one example, the transmission module 106 includes a network interface controller (NIC), which can be connected to other network devices through a base station so as to communicate with the Internet. In one example, the transmission module 106 may be a radio frequency (RF) module, which is used to communicate with the Internet wirelessly.
The display may be, for example, a touch-screen liquid crystal display (LCD), which may enable the user to interact with the user interface of the computer terminal 10 (or mobile device).
根据本申请实施例,提供了一种三维重建方法的实施例,需要说明的是,在附图的流程图示出的步骤可以在诸如一组计算机可执行指令的计算机系统中执行,并且,虽然在流程图中示出了逻辑顺序,但是在某些情况下,可以以不同于此处的顺序执行所示出或描述的步骤。
图2是根据本申请实施例的三维重建方法的流程图,如图2所示,该方法包括如下步骤:
步骤S202,获取第一图像和第二图像,第一图像和第二图像分别由不同的图像采集设备采集被投射到被测物体表面的码元图像得到的,码元图像中包含多个目标码元按照预设方向随机分布,目标码元为线段条纹;
步骤S204,获取第一图像中每个码元的第一目标像素点,并在第二图像中逐一确定与第一目标像素点匹配的第二目标像素点,以完成匹配;在第一图像中的存在第一目标像素点未完成匹配的情况下,对未完成匹配的第一目标像素点进行二次匹配;
步骤S206,至少依据预先确定的所述第一目标像素点在所述第一图像中的像素坐标和所述第二目标像素点在所述第二图像中的像素坐标确定所述第一目标像素点的三维坐标,以完成三维重建。
通过上述步骤,可以实现通过对未完成匹配的第一目标像素点进行二次匹配的方式,达到了将所有的第一目标像素点匹配完成的目的,从而实现了第一图像中的第一目标像素点完全匹配的技术效果,进而解决了由于码元匹配的过程中匹配不完全造成三维重建结果准确性低技术问题。
需要进行说明的是,三维重建是指对三维物体建立适合计算机表示和处理的数学模型,是在计算机环境下对其进行处理、操作和分析其性质的基础,也是在计算机中建立表达客观事件的虚拟现实技术。三维重建技术中通常会用到结构光时间编码和空间编码技术,其中,空间编码结构光难点在于利用像素空间灰度信息,对每个像素点进行稳定可靠的编解码,相关技术中的一类方法是通过一定的码元信息对特定像素点进行编码,通过大小圆形码元的编码方式,利用极线约束条件和邻域码元与图像中码元的相对位移关系对每个码元进行编解码,从而实现每个码元点的重建。这类方法的缺点在于码元编码容量有限,只能重建稀疏的码元点,单帧数据少,扫描效率低;受物体表面纹理影响较大。另外一类方法是通过随机散斑的形式,通过像素块间的相关性进行同名点匹配,例如:将伪随机散斑图像通过投影设备投射到被测物体表面,采用改进的SGM(semi-global matching,立体匹配)算法进行图像块间的匹配,从而得到被测物体表面的三维数据。该类方法的缺点是匹配适用的图像块空间大,重建数据精度和细节较差,难以实现复杂物体的重建。相关技术中,在码元匹配的过程中,由于图像噪声、匹配图像块大小和对于被测物体表面存在深度不连续的地方,往往相邻点的深度差较大,条纹图案调制较为严重等原因,会导致第一图像中部分码元的第一目标像素点无法正确找到匹配点,从而造成局部数据缺失。由于物体表面小区域内深度呈现一定的连续性,因此可根据深度连续性原则进行匹配点补全。
Meanwhile, the method proposed in the present application uses a coding scheme of line-segment stripe symbols; each line-segment stripe contains multiple feature points, and the symbols are randomly distributed with respect to one another, which increases the coding capacity and the amount of data in a single frame image, thereby improving scanning efficiency. In addition, because the method proposed in the present application uses a distribution of line-segment stripes, the distribution density of symbols is high, which improves the accuracy of the obtained reconstruction data.
The method proposed in the present application can be applied to scanners such as intraoral scanners, face scanners, industrial scanners, and professional scanners, and can realize three-dimensional reconstruction of objects or scenes such as teeth, human faces, human bodies, industrial products, industrial equipment, cultural relics, artworks, prostheses, medical instruments, and buildings.
In step S202, the first image is an image captured by a first image acquisition device, and the second image is an image captured by a second image acquisition device. The plurality of target symbols in the first image have the same orientation, and the spacing between symbols is determined randomly; for example, the spacing between symbol No. 1 and symbol No. 2 is 100 μm, and the spacing between symbol No. 2 and symbol No. 3 is 80 μm. The symbols fluctuate up and down randomly along the image height direction. Each symbol contains at least two extractable grayscale feature point patterns, and the feature points are distributed one above the other at a preset spacing. The positions of the stripes may be randomly distributed in both the image width and image height directions, or randomly distributed in only one of these directions. FIG. 3a shows a symbol image in which the symbols are randomly distributed in both the image width and image height directions, the stripe length in FIG. 3a being fixed; FIG. 3b shows a symbol image in which the stripe positions are randomly distributed along the image height direction, the stripe lengths in FIG. 3b being set randomly.
In step S204, a first target pixel point of each symbol in the first image is acquired, and second target pixel points matching the first target pixel points are determined one by one in the second image, so as to complete matching. It can be understood that the first image and the second image are obtained by different image acquisition devices capturing the projection image projected onto the surface of the measured object, so the symbol structure in the first image is the same as the symbol structure in the second image. Therefore, taking the case where the first target pixel point is the midpoint of a line-segment stripe in the first image as an example, the matching pixel point in the second image is the midpoint of the corresponding line-segment stripe in the second image. There are many reasons why a symbol may fail to be matched, for example, image noise or image blur. Moreover, the three-dimensional reconstruction method provided in the embodiments of the present application is applicable both to secondary matching of unmatched symbol feature points in a symbol feature point matching method and to secondary matching of unmatched target pixel points in a target pixel point matching method.
In step S206, the target symbols in the first image correspond one-to-one to the symbols in the second image. The first image is preset, that is, the projection parameters of the symbols in the first image are preset. Based on the one-to-one correspondence between the target symbols and the symbols in the second image, combined with the triangulation method, the three-dimensional coordinates of the symbols in the second image are determined.
The above steps S202 to S206 are described in detail below through specific embodiments.
In step S204, the secondary matching of the unmatched first target pixel points may exploit the fact that the depth of the object surface exhibits a certain continuity within a small region. Specifically, for a first target pixel point P in the first image for which no correct matching point has been found, the depths {Di} of the first target pixel points already reconstructed in its neighborhood are used to interpolate an estimated depth Ds = f({Di}) of the first target pixel point P, where f is an interpolation function over the Di. Then, based on the estimated depth Ds, the pixel position Q of the estimated matching point of P in the second image can be calculated; finally, within the neighborhood pixels of point Q in the second image, the target pixel point closest to point Q is searched for and is determined as the matching pixel point of point P. Secondary matching may also be performed using the principle of light-plane order consistency. Specifically, the matched first target pixel points within a first preset surrounding area of the unmatched first target pixel point are determined; the light planes corresponding to the first target pixel points of all symbols within a second preset surrounding area of the matched first target pixel points are determined as a first light-plane sequence, where the first target pixel points include the midpoints of the line-segment stripes and the feature points of the line-segment stripes; the three-dimensional coordinates of the unmatched first target pixel point in all light planes of the first light-plane sequence are determined as candidate three-dimensional coordinates; and the pixel points corresponding to all candidate three-dimensional coordinates in the second image are determined as candidate pixel points, from which the second target pixel point matching the unmatched first target pixel point is determined.
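Purely as a non-limiting illustration, the depth-continuity-based completion described above can be sketched as follows in Python. The inverse-distance interpolation, the helper function reproject_to_second, and the search radius are assumptions of this sketch and not part of the claimed method:

```python
import numpy as np

def interpolate_depth(neighbor_depths, neighbor_uv, p_uv):
    """Estimate Ds = f({Di}) by inverse-distance weighting of reconstructed neighbors."""
    d = np.linalg.norm(neighbor_uv - p_uv, axis=1)
    w = 1.0 / np.maximum(d, 1e-6)
    return float(np.sum(w * neighbor_depths) / np.sum(w))

def secondary_match_by_depth(p_uv, neighbor_uv, neighbor_depths,
                             reproject_to_second, second_points, radius=5.0):
    """Complete the match of an unmatched point P using depth continuity.

    reproject_to_second(p_uv, depth) -> estimated pixel position Q in the second image.
    second_points: (N, 2) array of target pixel points detected in the second image.
    """
    ds = interpolate_depth(neighbor_depths, neighbor_uv, p_uv)   # estimated depth Ds
    q = reproject_to_second(p_uv, ds)                            # estimated match position Q
    dist = np.linalg.norm(second_points - q, axis=1)             # search the neighborhood of Q
    idx = int(np.argmin(dist))
    return second_points[idx] if dist[idx] <= radius else None   # nearest target point, if any
```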
Specifically, for any unmatched first target pixel point U in the first image, the matched first target pixel points in its neighborhood (the first preset surrounding area) are searched for, and the light-plane sequence {Nk}, k ∈ V, of that neighborhood (the second preset surrounding area) is obtained. Then, based on the calibrated light-plane sequence information, the current neighborhood light-plane sequence is completed to obtain the candidate light-plane sequence {Nk}, k ∈ M (the first light-plane sequence), of the first target pixel point U, where M is the sequence range of V after completion.
Next, according to the triangulation principle, the three-dimensional coordinates of a stripe center point can be calculated from the camera parameters, the stripe center coordinates, and the light-plane equation. The candidate light planes are traversed, and the three-dimensional coordinate values {(x, y, z)T_k}, k ∈ M, of point U in each light plane are calculated. From each three-dimensional coordinate, the multiple candidate pixel points of point U in the second image are determined, and the second target pixel point matching point U is determined from the multiple candidate pixel points.
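As a non-limiting illustration of this step, the intersection of a viewing ray with a calibrated light plane can be computed as in the following Python sketch. A pinhole camera with intrinsic matrix K and light planes expressed in the camera frame are assumptions of this sketch:

```python
import numpy as np

def point_on_light_plane(uv, K, plane):
    """Intersect the camera ray through pixel uv with a calibrated light plane.

    K: 3x3 camera intrinsic matrix; plane: (a, b, c, d) with a*x + b*y + c*z + d = 0,
    both expressed in the camera coordinate frame.
    """
    ray = np.linalg.inv(K) @ np.array([uv[0], uv[1], 1.0])  # direction of the viewing ray
    a, b, c, d = plane
    denom = a * ray[0] + b * ray[1] + c * ray[2]
    if abs(denom) < 1e-12:
        return None                                          # ray parallel to the plane
    t = -d / denom
    return t * ray                                           # (x, y, z) of the stripe point

def candidate_points(uv, K, candidate_planes):
    """Traverse the candidate light planes {Nk}, k in M, and collect the 3D candidates for U."""
    return [p for p in (point_on_light_plane(uv, K, pl) for pl in candidate_planes)
            if p is not None]
```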
It should be noted that the process of projecting the stripe structured-light pattern onto the surface of the measured object through the projection module can be regarded as each stripe in the pattern generating a light plane in space; each light plane intersects the object surface and forms a deformed stripe pattern on the object surface. Since the light-plane information can be calibrated in advance, each stripe can be represented by a light-plane index and light-plane parameters. Meanwhile, the light-plane index of each stripe in the pattern together with the light-plane indices of its neighborhood stripes forms a known sequence {Ni}, i ∈ V, where V denotes the light planes of the neighborhood stripes and Ni denotes a light-plane index. For a first target pixel point whose three-dimensional coordinates have already been obtained, substituting its coordinates into any light-plane equation f(x, y, z) = ax + by + cz + d yields a residual parameter (for example, r = |ax + by + cz + d| / √(a² + b² + c²)), which represents the distance from the point to the light plane. The smaller the residual, the higher the probability that the point belongs to that light plane. It can be understood that, when the three-dimensional coordinates of all first target pixel points are substituted into a target light-plane equation, the first target pixel point with the smallest residual parameter is determined to belong to the target light plane.
In an optional implementation, before the first light-plane sequence is determined, the three-dimensional coordinates of the matched first target pixel points are determined; the three-dimensional coordinates of the matched first target pixel points are substituted into the light-plane equations corresponding to all symbols in the first image to obtain residual parameters; and the light plane corresponding to the light-plane equation with the smallest residual parameter is determined as the light plane corresponding to the matched first target pixel point. Specifically, all light planes are traversed to find the light plane with the smallest residual; this light plane is the light plane corresponding to the current first target pixel point, and its light-plane index is recorded. The light-plane indices of all reconstructed stripe center points are labeled in turn in this way.
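For illustration only, the residual-based light-plane labeling described above can be sketched as follows; the normalized point-to-plane distance used as the residual is an assumption of this sketch:

```python
import numpy as np

def plane_residual(xyz, plane):
    """Residual of substituting a 3D point into f(x, y, z) = a*x + b*y + c*z + d,
    normalized so that it equals the point-to-plane distance."""
    a, b, c, d = plane
    return abs(a * xyz[0] + b * xyz[1] + c * xyz[2] + d) / np.sqrt(a * a + b * b + c * c)

def label_light_plane(xyz, calibrated_planes):
    """Traverse all calibrated light planes and return the index of the plane
    with the smallest residual for this reconstructed point."""
    residuals = [plane_residual(xyz, pl) for pl in calibrated_planes]
    return int(np.argmin(residuals))
```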
In some embodiments of the present application, a method is proposed for determining, from the candidate pixel points, the second target pixel point matching the unmatched first target pixel point, including: determining, for each candidate pixel point, the second target pixel point closest to it in the second image, and determining the distance between each candidate pixel point and its closest second target pixel point as a first distance; determining the candidate pixel point with the smallest first distance as a third target pixel point, and determining the second target pixel point closest to the third target pixel point as the second target pixel point matching the unmatched first target pixel point.
Specifically, for each candidate pixel point, the nearest second target pixel point is searched for in the second image. Taking the case where the second target pixel point is the midpoint of a line-segment stripe in the second image as an example, the line-segment stripe midpoint nearest to each candidate pixel point is searched for in the second image, and the first distance between each candidate pixel point and its nearest line-segment stripe midpoint is calculated; the second target pixel point corresponding to the smallest first distance is determined as the matching pixel point of the unmatched first target pixel point.
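A minimal Python sketch of this candidate selection, under the assumption that the detected target pixel points of the second image are available as a coordinate array, is given below for illustration only:

```python
import numpy as np

def select_match(candidate_pixels, second_points):
    """For each candidate pixel, find its nearest target point in the second image
    (the first distance); keep the candidate with the smallest first distance and
    return that nearest target point as the match."""
    best = None
    for q in candidate_pixels:
        dist = np.linalg.norm(second_points - np.asarray(q), axis=1)
        j = int(np.argmin(dist))
        if best is None or dist[j] < best[0]:
            best = (dist[j], second_points[j])   # (first distance, matched second target point)
    return None if best is None else best[1]
```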
After the matching pixel point (the second target pixel point) of a first target pixel point is determined, the following may further be performed with the first target pixel point as a feature point: determining a first neighborhood symbol set of any symbol in the first image and multiple second neighborhood symbol sets of multiple candidate symbols in the second image; determining the number of neighborhood symbols in each of the multiple second neighborhood symbol sets that match neighborhood symbols in the first neighborhood symbol set, and determining the second neighborhood symbol set with the largest number of matches among the multiple second neighborhood symbol sets as the target second neighborhood symbol set; and determining the candidate symbol corresponding to the target second neighborhood symbol set as the target symbol.
Specifically, the first neighborhood symbol set is the neighborhood symbol set of a symbol in the first image. For example, taking a symbol p, consisting of a line segment of a set length and the two endpoints corresponding to the line segment, as an example, the neighborhood symbol set of symbol p is {p1, p2, p3, p4}. A second neighborhood symbol set is the neighborhood symbol set of a candidate symbol, in the second image, of symbol p. For example, if a candidate symbol of symbol p is symbol q, the neighborhood symbol set of symbol q is {q1, q2, q3, q4}. In the case where symbol p has three candidate symbols, for example qi, i = 1, 2, 3, there are three second neighborhood symbol sets. The case where there are multiple symbols p is similar to the above and is not repeated here.
It should be further explained that, taking symbol p as an example, for each candidate symbol qi of symbol p (i being a positive integer), it is first checked whether symbol p1 matches the first neighborhood symbol of the candidate symbol; then all candidate symbols of symbol p are checked in turn to determine whether their neighborhood symbols match symbol p1, and the match count of each candidate symbol that matches symbol p1 is increased by 1. For example, if a neighborhood symbol of candidate symbol q1 matches symbol p1, the match count of candidate symbol q1 is 1; if another neighborhood symbol of candidate symbol q1 matches symbol p2, the match count of candidate symbol q1 is increased by 1 to 2. The candidate symbol with the largest match count is determined as the target symbol. Taking symbol p as an example, if candidate symbol q1 has the largest match count with symbol p, candidate symbol q1 is determined as the target symbol matching symbol p. A sketch of this neighborhood voting is given below.
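The following Python sketch illustrates the neighborhood voting described above. The data layout and the placeholder predicate symbols_match are assumptions of this sketch and not part of the claimed method:

```python
def vote_candidate_symbols(p_neighbors, candidates):
    """Pick the candidate symbol whose neighborhood agrees most with that of symbol p.

    p_neighbors: neighborhood descriptors of p, e.g. [p1, p2, p3, p4]
    candidates:  dict mapping a candidate symbol id to its neighborhood descriptors
    """
    def symbols_match(a, b):
        return a == b        # placeholder comparison for 'these neighborhood symbols match'

    best_id, best_count = None, -1
    for cand_id, q_neighbors in candidates.items():
        count = sum(1 for pn in p_neighbors
                    if any(symbols_match(pn, qn) for qn in q_neighbors))
        if count > best_count:
            best_id, best_count = cand_id, count
    return best_id           # candidate with the largest match count -> target symbol
```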
In some embodiments of the present application, the length of the line-segment stripe corresponding to a target symbol in the first image is determined as follows: in the case where the multiple target symbols are randomly distributed along the horizontal-axis direction of the symbol image, a first magnification and a second magnification are determined; the ratio of the second magnification to the first magnification multiplied by the minimum length of the target symbol is determined as the minimum length of the target symbols in the first image; and a preset proportion of the width of the first image is determined as the maximum length of the target symbols in the first image.
Specifically, once the projection module and the image acquisition module of the reconstruction system are determined, the magnification of the projection device and the magnification of the image acquisition module are intrinsic system parameters. There is a relationship, determined by the two magnifications, between the unit pixel length lp of the symbol image and the unit pixel length lc of the first image (consistent with the ratio above, of the form lc = (second magnification / first magnification) × lp). Since the minimum unit pixel length of the symbol image that the projection device can project is determined as lmin, the minimum stripe length Lmin in the first image can be determined according to this relationship. To ensure the randomness of a projected pattern of width W and height H, the maximum stripe length Lmax in the pattern does not exceed H/2, so each stripe length Li ∈ [Lmin, Lmax].
In particular, the stripe length L in the projected pattern may be a fixed length value or a random length value. If it is a random length value, the length of each stripe may be determined by a pseudo-random sequence {Li}, the value range of the pseudo-random sequence being Li ∈ [Lmin, Lmax].
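For illustration only, the length bounds and the pseudo-random stripe lengths described above can be computed as in the following sketch; which magnification is the "first" and which the "second" follows the claim wording, and the ratio used here is an assumption consistent with it:

```python
import numpy as np

def stripe_length_range(l_min_proj, mag_first, mag_second, pattern_height):
    """Lmin scaled by the magnification ratio; Lmax taken as H / 2."""
    l_min = (mag_second / mag_first) * l_min_proj
    l_max = pattern_height / 2.0
    return l_min, l_max

def random_stripe_lengths(n, l_min, l_max, seed=0):
    """Pseudo-random sequence {Li} with Li in [Lmin, Lmax]."""
    rng = np.random.default_rng(seed)
    return rng.uniform(l_min, l_max, size=n)
```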
In some embodiments of the present application, the symbol image may be determined as follows: the target region in which a target symbol is located is determined based on the length of the target symbol, the width of the target symbol, the spacing between target symbols, and the pixel coordinates of the center pixel point of the target symbol, where the pixel coordinates of the center pixel point of the target symbol are randomly generated within the region of the symbol image; all pixel points within the target region are traversed, and if no target symbol exists in the target region, a target symbol is generated in the target region, where the target symbol includes at least one line segment of a preset length and the two endpoints corresponding to the line segment of the preset length; target symbols are generated in this way in all target regions within the symbol image region. As shown in FIG. 4, five target symbols are illustrated: the symbol labeled 1 consists of a line segment and two circular endpoints of the line segment; the symbol labeled 2 consists of a line segment of a first preset length with two line segments of a second preset length as endpoints, the first preset length being greater than the second preset length; the symbol labeled 3 consists of a line segment of a first preset length with three line segments of a second preset length as endpoints; the symbol labeled 4 consists of a line segment and the two endpoints of the line segment itself; and the symbol labeled 5 consists of a line segment intersected by another line segment.
In the process of generating the symbol image, the target region in which a target symbol is located is determined according to the coordinates of the center pixel point of the target symbol, the length of the target symbol, and the width of the target symbol, and only one target symbol exists in each target region. FIG. 5 shows a schematic distribution of a first line-segment stripe 201 within one image patch: with stripe length L, stripe width S, and stripe spacing G/2, the region occupied by the first line-segment stripe 201 in the image has size (S + G) × (L + G). A single image patch region can contain only one stripe or no stripe. In a W × H image to be filled with stripes, a coordinate position (u, v) is randomly generated as the center position of a candidate image patch; then each pixel of the image to be filled corresponding to that patch is traversed to check whether the candidate image patch already contains a stripe; if it does not contain a stripe, a stripe is generated at the candidate patch position, otherwise no stripe is generated. Next, the generation of the next random coordinate, the checking of the patch, and the generation of the stripe are performed, and the above process is repeated until no more stripes can be generated in the entire image to be filled.
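Purely as a non-limiting illustration, the random stripe-filling procedure can be sketched as follows in Python. The fixed number of random trials used as a stop criterion, and the omission of the endpoint feature points, are simplifications of this sketch:

```python
import numpy as np

def generate_stripe_pattern(width, height, length, stripe_w, gap, max_tries=200000, seed=0):
    """Fill a W x H pattern with vertical line-segment stripes, one stripe per patch.

    A candidate patch of size (stripe_w + gap) x (length + gap), centred at a random (u, v),
    receives a stripe only if no previously drawn stripe falls inside it.
    """
    rng = np.random.default_rng(seed)
    img = np.zeros((height, width), dtype=np.uint8)
    half_w, half_l = (stripe_w + gap) // 2, (length + gap) // 2
    for _ in range(max_tries):
        u = rng.integers(half_w, width - half_w)
        v = rng.integers(half_l, height - half_l)
        patch = img[v - half_l:v + half_l, u - half_w:u + half_w]
        if patch.any():                       # patch already contains (part of) a stripe
            continue
        img[v - length // 2:v + length // 2,
            u - stripe_w // 2:u + stripe_w // 2] = 255   # draw one stripe in this patch
        # endpoint feature points of the symbol could be drawn here as well (omitted)
    return img
```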
Based on the above method, the present application can achieve fast and accurate reconstruction of three-dimensional data of the surface of the measured object. The stripe centerline reconstruction method in the present application enables accurate acquisition of three-dimensional data; the random stripe scheme increases the number of coding points (any pixel point on a stripe centerline is a coding point) and improves data redundancy, thereby improving scanning efficiency.
An embodiment of the present application further provides a three-dimensional reconstruction system, as shown in FIG. 6, including: at least two image acquisition devices 603, a projection device 604, and a first processor 602. The projection device 604 is used to project a predetermined symbol image onto the surface of a measured object 601; the at least two image acquisition devices 603 are used to capture the predetermined symbol image from the surface of the measured object 601 to obtain a first image and a second image; the first processor 602 is used to acquire a first target pixel point of each symbol in the first image and determine, one by one in the second image, second target pixel points matching the first target pixel points, so as to complete matching, and, where a first target pixel point in the first image remains unmatched, to perform secondary matching on the unmatched first target pixel point; the first processor 602 is further used to determine the three-dimensional coordinates of the first target pixel point according to predetermined image parameters of the first image and the second image, so as to complete three-dimensional reconstruction.
The image acquisition devices 603 include, but are not limited to, grayscale cameras and color cameras; the projection modes of the projection device 604 include, but are not limited to, DLP (Digital Light Processing), MASK (mask projection), DOE (diffractive projection), and other projection modes capable of projecting structured-light patterns. In an optional implementation, the number of image acquisition devices may also be greater than two.
An embodiment of the present application further provides a three-dimensional reconstruction apparatus, as shown in FIG. 7, including: an acquisition module 70, configured to acquire a first image and a second image, wherein the first image and the second image are obtained by different image acquisition devices capturing a symbol image projected onto the surface of a measured object, the symbol image contains a plurality of target symbols randomly distributed along a preset direction, and the target symbols are line-segment stripes; a matching module 72, configured to acquire a first target pixel point of each symbol in the first image and determine, one by one in the second image, second target pixel points matching the first target pixel points, so as to complete matching, and, where a first target pixel point in the first image remains unmatched, to perform secondary matching on the unmatched first target pixel point; and a reconstruction module 74, configured to determine the three-dimensional coordinates of the first target pixel point according to predetermined image parameters of the first image and the second image, so as to complete three-dimensional reconstruction.
The matching module 72 includes a first determining submodule. The first determining submodule is configured to: determine the matched first target pixel points within a first preset surrounding area of an unmatched first target pixel point; determine, as a first light-plane sequence, the light planes corresponding to the first target pixel points of all symbols within a second preset surrounding area of the matched first target pixel points, wherein the first target pixel points include the midpoints of the line-segment stripes and the feature points of the line-segment stripes; determine, as candidate three-dimensional coordinates, the three-dimensional coordinates of the unmatched first target pixel point in all light planes of the first light-plane sequence; and determine, as candidate pixel points, the pixel points corresponding to all candidate three-dimensional coordinates in the second image, and determine, from the candidate pixel points, the second target pixel point matching the unmatched first target pixel point.
The first determining submodule includes a first determining unit and a second determining unit. The first determining unit is configured to: determine, for each candidate pixel point, the second target pixel point closest to it in the second image, and determine the distance between each candidate pixel point and its closest second target pixel point as a first distance; and determine the candidate pixel point with the smallest first distance as a third target pixel point, and determine the second target pixel point closest to the third target pixel point as the second target pixel point matching the unmatched first target pixel point. The second determining unit is configured to: determine the three-dimensional coordinates of the matched first target pixel points; substitute the three-dimensional coordinates of the matched first target pixel points into the light-plane equations corresponding to all symbols in the first image to obtain residual parameters; and determine the light plane corresponding to the light-plane equation with the smallest residual parameter as the light plane corresponding to the matched first target pixel point.
The acquisition module 70 includes a second determining submodule. The second determining submodule is configured to: in the case where the multiple target symbols are randomly distributed along the horizontal-axis direction of the symbol image, determine a first magnification and a second magnification; determine the ratio of the second magnification to the first magnification multiplied by the minimum length of the target symbol as the minimum length of the target symbols in the first image; and determine a preset proportion of the width of the first image as the maximum length of the target symbols in the first image.
The second determining submodule includes a third determining unit. The third determining unit is configured to determine, according to the coordinates of the center pixel point of the target symbol, the length of the target symbol, and the width of the target symbol, the target region in which the target symbol is located, only one target symbol existing in the target region.
According to another aspect of the embodiments of the present application, a non-volatile storage medium is further provided, including a stored program, wherein, when the program runs, the device on which the non-volatile storage medium is located is controlled to execute the above three-dimensional reconstruction method.
According to another aspect of the embodiments of the present application, a processor is further provided, the processor being configured to run a program, wherein the above three-dimensional reconstruction method is executed when the program runs.
The above processor is configured to run a program that performs the following functions: acquiring a first image and a second image, wherein the first image and the second image are obtained by different image acquisition devices capturing a symbol image projected onto the surface of a measured object, the symbol image contains a plurality of target symbols randomly distributed along a preset direction, and the target symbols are line-segment stripes; acquiring a first target pixel point of each symbol in the first image, and determining, one by one in the second image, second target pixel points matching the first target pixel points, so as to complete matching; where a first target pixel point in the first image remains unmatched, performing secondary matching on the unmatched first target pixel point; and determining the three-dimensional coordinates of the first target pixel point according to predetermined image parameters of the first image and the second image, so as to complete three-dimensional reconstruction.
By executing the above three-dimensional reconstruction method and performing secondary matching on the unmatched first target pixel points, the above processor achieves the purpose of matching all first target pixel points, thereby realizing the technical effect of complete matching of the first target pixel points in the first image, and in turn solving the technical problem of low accuracy of three-dimensional reconstruction results caused by incomplete matching during symbol matching.
In the above embodiments of the present application, the description of each embodiment has its own emphasis; for parts not described in detail in a certain embodiment, reference may be made to the relevant descriptions of other embodiments.
In the several embodiments provided in the present application, it should be understood that the disclosed technical content may be implemented in other ways. The apparatus embodiments described above are merely illustrative; for example, the division of units may be a division by logical function, and there may be other divisions in actual implementation; for example, multiple units or components may be combined or integrated into another system, or some features may be ignored or not executed. Furthermore, the mutual coupling or direct coupling or communication connection shown or discussed may be indirect coupling or communication connection through some interfaces, units, or modules, and may be electrical or in other forms.
Units described as separate components may or may not be physically separate, and components shown as units may or may not be physical units; that is, they may be located in one place or distributed over multiple units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solutions of the embodiments.
In addition, the functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist physically alone, or two or more units may be integrated into one unit. The above integrated units may be implemented in the form of hardware or in the form of software functional units.
If the integrated units are implemented in the form of software functional units and sold or used as independent products, they may be stored in a computer-readable storage medium. Based on this understanding, the technical solution of the present application, in essence, or the part contributing to the prior art, or the whole or part of the technical solution, may be embodied in the form of a software product. The computer software product is stored in a storage medium and includes a number of instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to execute all or part of the steps of the methods of the embodiments of the present application. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a read-only memory (ROM), a random access memory (RAM), a removable hard disk, a magnetic disk, or an optical disc.
The above are only preferred embodiments of the present application. It should be pointed out that, for those of ordinary skill in the art, several improvements and modifications can be made without departing from the principles of the present application, and these improvements and modifications should also be regarded as falling within the scope of protection of the present application.
Industrial Applicability
The technical solutions provided by the embodiments of the present application are applicable to the field of three-dimensional reconstruction. In the embodiments of the present application, a first image and a second image are acquired, the first image and the second image being obtained by different image acquisition devices capturing a symbol image projected onto the surface of a measured object, the symbol image containing a plurality of target symbols randomly distributed along a preset direction, the target symbols being line-segment stripes; a first target pixel point of each symbol in the first image is acquired, and second target pixel points matching the first target pixel points are determined one by one in the second image, so as to complete matching; where a first target pixel point in the first image remains unmatched, secondary matching is performed on the unmatched first target pixel point; and the three-dimensional coordinates of the first target pixel point are determined at least according to the predetermined pixel coordinates of the first target pixel point in the first image and the pixel coordinates of the second target pixel point in the second image, thereby completing three-dimensional reconstruction. By performing secondary matching on the unmatched first target pixel points, the purpose of matching all first target pixel points is achieved, thereby realizing the technical effect of complete matching of the first target pixel points in the first image, and in turn solving the technical problem of low accuracy of three-dimensional reconstruction results caused by incomplete matching during symbol matching.

Claims (12)

  1. A three-dimensional reconstruction method, comprising:
    acquiring a first image and a second image, wherein the first image and the second image are obtained by different image acquisition devices capturing a symbol image projected onto a surface of a measured object, the symbol image contains a plurality of target symbols randomly distributed along a preset direction, and the target symbols are line-segment stripes;
    acquiring a first target pixel point of each symbol in the first image, and determining, one by one in the second image, second target pixel points matching the first target pixel points, so as to complete matching; where a first target pixel point in the first image remains unmatched, performing secondary matching on the unmatched first target pixel point; and
    determining three-dimensional coordinates of the first target pixel point at least according to predetermined pixel coordinates of the first target pixel point in the first image and pixel coordinates of the second target pixel point in the second image, so as to complete three-dimensional reconstruction.
  2. The method according to claim 1, wherein performing secondary matching on the unmatched first target pixel point comprises:
    determining matched first target pixel points within a first preset surrounding area of the unmatched first target pixel point;
    determining, as a first light-plane sequence, light planes corresponding to the first target pixel points of all symbols within a second preset surrounding area of the matched first target pixel points, wherein the first target pixel points comprise midpoints of the line-segment stripes and feature points of the line-segment stripes;
    determining, as candidate three-dimensional coordinates, three-dimensional coordinates of the unmatched first target pixel point in all light planes of the first light-plane sequence; and
    determining, as candidate pixel points, pixel points corresponding to all the candidate three-dimensional coordinates in the second image, and determining, from the candidate pixel points, the second target pixel point matching the unmatched first target pixel point.
  3. The method according to claim 2, wherein determining, from the candidate pixel points, the second target pixel point matching the unmatched first target pixel point comprises:
    determining, for each of the candidate pixel points, the second target pixel point closest to it in the second image, and determining the distance between each of the candidate pixel points and the closest second target pixel point as a first distance; and
    determining the candidate pixel point with the smallest first distance as a third target pixel point, and determining the second target pixel point closest to the third target pixel point as the second target pixel point matching the unmatched first target pixel point.
  4. The method according to claim 2, wherein the method further comprises:
    determining three-dimensional coordinates of the matched first target pixel points;
    substituting the three-dimensional coordinates of the matched first target pixel points into light-plane equations corresponding to all symbols in the first image to obtain residual parameters; and
    determining the light plane corresponding to the light-plane equation with the smallest residual parameter as the light plane corresponding to the matched first target pixel point.
  5. The method according to claim 1, wherein the length of the target symbols in the first image is determined in the following manner:
    in a case where the plurality of target symbols are randomly distributed along a horizontal-axis direction of the symbol image, determining a first magnification and a second magnification;
    determining the ratio of the second magnification to the first magnification multiplied by the minimum length of the target symbol as the minimum length of the target symbols in the first image; and
    determining a preset proportion of the width of the first image as the maximum length of the target symbols in the first image.
  6. The method according to claim 1, wherein the predetermined symbol image, which contains a plurality of target symbols randomly distributed along a preset direction, the target symbols being line-segment stripes, is obtained by:
    determining a target region in which a target symbol is located based on a length of the target symbol, a width of the target symbol, a spacing between the target symbols, and pixel coordinates of a center pixel point of the target symbol, wherein the pixel coordinates of the center pixel point of the target symbol are randomly generated within a region of the symbol image;
    traversing all pixel points within the target region, and generating the target symbol in the target region in a case where no target symbol exists in the target region, wherein the target symbol comprises at least one line segment of a preset length and two endpoints corresponding to the line segment of the preset length; and
    generating the target symbols in all the target regions within the symbol image region.
  7. The method according to claim 1, wherein the method further comprises:
    determining a first neighborhood symbol set of any symbol in the first image and a plurality of second neighborhood symbol sets of a plurality of candidate symbols in the second image;
    determining the number of neighborhood symbols in each of the plurality of second neighborhood symbol sets that match neighborhood symbols in the first neighborhood symbol set, and determining the second neighborhood symbol set with the largest number of matches among the plurality of second neighborhood symbol sets as a target second neighborhood symbol set; and
    determining the candidate symbol corresponding to the target second neighborhood symbol set as the symbol that matches the aforementioned symbol.
  8. The method according to claim 1, wherein the method further comprises:
    determining, according to coordinates of a center pixel point of the target symbol, the length of the target symbol, and the width of the target symbol, a target region in which the target symbol is located, only one target symbol existing in the target region.
  9. A three-dimensional reconstruction apparatus, comprising:
    an acquisition module, configured to acquire a first image and a second image, wherein the first image and the second image are obtained by different image acquisition devices capturing a symbol image projected onto a surface of a measured object, the symbol image contains a plurality of target symbols randomly distributed along a preset direction, and the target symbols are line-segment stripes;
    a matching module, configured to acquire a first target pixel point of each symbol in the first image and determine, one by one in the second image, second target pixel points matching the first target pixel points, so as to complete matching, and, where a first target pixel point in the first image remains unmatched, to perform secondary matching on the unmatched first target pixel point; and
    a reconstruction module, configured to determine three-dimensional coordinates of the first target pixel point at least according to predetermined pixel coordinates of the first target pixel point in the first image and pixel coordinates of the second target pixel point in the second image, so as to complete three-dimensional reconstruction.
  10. A three-dimensional reconstruction system, comprising:
    at least two image acquisition devices, a projection device, and a first processor;
    wherein the projection device is configured to project a predetermined symbol image onto a surface of a measured object;
    the at least two image acquisition devices are configured to capture the predetermined symbol image from the surface of the measured object to obtain a first image and a second image; and
    the first processor is configured to acquire a first target pixel point of each symbol in the first image and determine, one by one in the second image, second target pixel points matching the first target pixel points, so as to complete matching; where a first target pixel point in the first image remains unmatched, to perform secondary matching on the unmatched first target pixel point; and further to determine three-dimensional coordinates of the first target pixel point at least according to predetermined pixel coordinates of the first target pixel point in the first image and pixel coordinates of the second target pixel point in the second image, so as to complete three-dimensional reconstruction.
  11. A non-volatile storage medium, comprising a stored program, wherein, when the program runs, a device on which the non-volatile storage medium is located is controlled to execute the three-dimensional reconstruction method according to any one of claims 1 to 8.
  12. An electronic device, comprising a memory and a processor, wherein the processor is configured to run a program, and the three-dimensional reconstruction method according to any one of claims 1 to 8 is executed when the program runs.
PCT/CN2023/112047 2022-08-10 2023-08-09 三维重建方法及装置、系统 WO2024032668A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202210956344.4 2022-08-10
CN202210956344.4A CN115345993A (zh) 2022-08-10 2022-08-10 三维重建方法及装置、系统

Publications (1)

Publication Number Publication Date
WO2024032668A1 true WO2024032668A1 (zh) 2024-02-15

Family

ID=83952318

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2023/112047 WO2024032668A1 (zh) 2022-08-10 2023-08-09 三维重建方法及装置、系统

Country Status (2)

Country Link
CN (1) CN115345993A (zh)
WO (1) WO2024032668A1 (zh)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115345993A (zh) * 2022-08-10 2022-11-15 先临三维科技股份有限公司 三维重建方法及装置、系统

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20060167648A1 (en) * 2003-08-13 2006-07-27 Hitoshi Ohtani 3-Dimensional measurement device and electronic storage medium
CN108267097A (zh) * 2017-07-17 2018-07-10 杭州先临三维科技股份有限公司 基于双目三维扫描系统的三维重构方法和装置
CN112270748A (zh) * 2020-11-18 2021-01-26 Oppo广东移动通信有限公司 基于图像的三维重建方法及装置
CN114820939A (zh) * 2022-04-28 2022-07-29 杭州海康机器人技术有限公司 一种图像重建方法、装置及设备
CN115345993A (zh) * 2022-08-10 2022-11-15 先临三维科技股份有限公司 三维重建方法及装置、系统

Also Published As

Publication number Publication date
CN115345993A (zh) 2022-11-15

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23851889

Country of ref document: EP

Kind code of ref document: A1