CN112710253B - Three-dimensional scanner and three-dimensional scanning method - Google Patents


Info

Publication number: CN112710253B (application CN201911018729.0A)
Authority: CN (China)
Prior art keywords: light, image, camera, stripe, target object
Legal status (assumption, not a legal conclusion): Active
Application number: CN201911018729.0A
Other languages: Chinese (zh)
Other versions: CN112710253A (en)
Inventors: 马超 (Ma Chao), 赵晓波 (Zhao Xiaobo)
Current Assignee: Shining 3D Technology Co Ltd
Original Assignee: Shining 3D Technology Co Ltd
Priority claimed from CN201911018729.0A; application filed by Shining 3D Technology Co Ltd.
Related family publications: CA3158933A1, US20220364853A1, WO2021078300A1 (PCT/CN2020/123684), JP7298025B2, KR20220084402A, EP4050302A4, AU2020371142B2.
Publication of CN112710253A (application publication). Application granted; publication of CN112710253B (granted publication).

Classifications

    • G01B 11/25 Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines or moiré fringes, on the object
    • G01B 11/254 Projection of a pattern, viewing through a pattern, e.g. moiré
    • A61B 5/0088 Measuring for diagnostic purposes using light, adapted for oral or dental tissue
    • A61B 5/72 Signal processing specially adapted for physiological signals or for diagnostic purposes
    • G06T 17/00 Three-dimensional [3D] modelling, e.g. data description of 3D objects
    • G06T 7/49 Analysis of texture based on structural texture description, e.g. using primitives or placement rules

Abstract

The application discloses a three-dimensional scanner and a three-dimensional scanning method. The three-dimensional scanner includes: a projection device for projecting light onto a target object, wherein the light includes preset light projected in the form of color-coded stripes, the color-coded stripes being formed by encoding stripes of at least two colors; and an image acquisition device for collecting the light modulated by the target object while the target object is illuminated by the projection device, so as to obtain at least one stripe image, wherein the acquired stripe image serves both as a coding map for determining the sequence of each stripe and as a reconstruction map for three-dimensional reconstruction of the target object. The application thereby solves the technical problem that existing three-dimensional reconstruction methods in the related art require costly hardware, which hinders the popularization of three-dimensional scanning devices.

Description

Three-dimensional scanner and three-dimensional scanning method
Technical Field
The present application relates to the field of three-dimensional scanning, and in particular, to a three-dimensional scanner and a three-dimensional scanning method.
Background
In the field of intra-oral three-dimensional scanning, existing three-dimensional scanners generally perform three-dimensional reconstruction in one of the following ways: first, matching time-coded sinusoidal fringe solutions and then performing three-dimensional reconstruction, splicing and fusion to obtain the three-dimensional shape of the object; second, obtaining the three-dimensional shape of the object through a time-coded stripe center-line extraction, three-dimensional reconstruction and splicing-fusion algorithm; third, obtaining the three-dimensional shape of the object based on the principle of microscopic confocal three-dimensional imaging.
However, each of the above ways has drawbacks that make it unsuited to the popularization of intra-oral three-dimensional scanning devices. The specific drawbacks are as follows:
first, a time-coded three-dimensional reconstruction method is difficult to realize in a small handheld scanner, so it cannot be applied to intra-oral three-dimensional scanning; in addition, such a method requires a high-frame-rate camera and high-speed algorithm support, which raises the production cost of the scanning equipment and is not conducive to its popularization;
second, three-dimensional reconstruction based on the microscopic confocal imaging principle requires costly hardware, which is likewise not conducive to the popularization of three-dimensional scanning equipment.
With respect to the technical problem in the related art that existing three-dimensional reconstruction methods require costly hardware, which hinders the popularization of three-dimensional scanning devices, no effective solution has yet been proposed.
Disclosure of Invention
The present application provides a three-dimensional scanner and a three-dimensional scanning method to solve the technical problem that existing three-dimensional reconstruction methods in the related art require costly hardware, which hinders the popularization of three-dimensional scanning devices.
According to one aspect of the present application, a three-dimensional scanner is provided. The three-dimensional scanner includes: a projection device for projecting light onto a target object, wherein the light includes preset light projected in the form of color-coded stripes, the color-coded stripes being formed by encoding stripes of at least two colors; and an image acquisition device for collecting the light modulated by the target object while the target object is illuminated by the projection device, so as to obtain at least one stripe image, wherein the acquired stripe image serves both as a coding map for determining the sequence of each stripe and as a reconstruction map for three-dimensional reconstruction of the target object.
Optionally, the image acquisition device includes a plurality of cameras, among which is at least one black-and-white camera. The image acquisition device collects the light modulated by the target object through the plurality of cameras to obtain a plurality of stripe images, wherein the stripe image obtained by at least one black-and-white camera is used as a reconstruction map for three-dimensional reconstruction of the target object; and the stripe images obtained by at least the black-and-white cameras are used as coding maps to determine the sequence of each stripe, and/or the stripe image obtained by at least one color camera is used as a coding map to determine the sequence of each stripe.
Optionally, the image acquisition device further includes a beam processing device having a light-entry portion and at least two light-exit portions, each camera corresponding to a different light-exit portion, and the image acquisition device collects the light modulated by the target object through the beam processing device.
Optionally, the beam processing device further includes at least one first beam-splitting unit configured to split the light entering from the light-entry portion, so that the light exits from the at least two light-exit portions toward the cameras arranged opposite the respective light-exit portions.
Optionally, the beam processing device further includes at least one second beam-splitting unit configured to separate the light to be collected by a specified camera, so that the specified camera receives light of a specified waveband, the color-coded stripes including a stripe whose color corresponds to that waveband.
Optionally, the specified camera is a black-and-white camera.
Optionally, the beam processing device includes a right-angle two-channel dichroic prism having a third light-exit portion and a fourth light-exit portion. The beam processing device splits the light entering from the light-entry portion through the right-angle two-channel dichroic prism, so that the light exits from the third light-exit portion and the fourth light-exit portion toward the cameras arranged opposite the respective light-exit portions. The image acquisition device includes a third camera arranged opposite the third light-exit portion and a fourth camera arranged opposite the fourth light-exit portion; the third camera generates a third stripe image from the collected light, the fourth camera generates a fourth stripe image from the collected light, and the third stripe image and the fourth stripe image each contain identifiable stripes of at least two colors. The beam processing device separates, through the right-angle two-channel dichroic prism, the light to be collected by the specified camera, so that the specified camera receives light of a specified waveband; specifically, the third camera receives light of a first filtering waveband and/or the fourth camera receives light of a second filtering waveband.
Optionally, the beam processing device includes a three-channel dichroic prism having a fifth light-exit portion, a sixth light-exit portion and a seventh light-exit portion. The beam processing device splits the light entering from the light-entry portion through the three-channel dichroic prism, so that the light exits from the fifth, sixth and seventh light-exit portions toward the cameras arranged opposite the respective light-exit portions. The image acquisition device includes a fifth camera arranged opposite the fifth light-exit portion, a sixth camera arranged opposite the sixth light-exit portion and a seventh camera arranged opposite the seventh light-exit portion; the fifth, sixth and seventh cameras generate a fifth, sixth and seventh stripe image, respectively, from the collected light, and each of these stripe images contains identifiable stripes of at least two colors. The beam processing device separates, through the three-channel dichroic prism, the light to be collected by the specified camera, so that the specified camera receives light of a specified waveband; specifically, the fifth camera receives light of a third filtering waveband and the sixth camera receives light of a fourth filtering waveband, the third filtering waveband being different from the fourth filtering waveband.
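The channel separation performed by the three-channel dichroic prism can be pictured with a toy numerical model: each light-exit portion passes one waveband, so each camera behind it effectively sees a single color channel of the scene. The mapping of channels to the fifth, sixth and seventh cameras below is an illustrative assumption, not a specification from the patent:

```python
def three_channel_split(rgb_image):
    """Toy model of the three-channel dichroic prism: split an image given
    as rows of (R, G, B) pixels into the three single-channel views that
    the cameras behind the fifth, sixth and seventh light-exit portions
    would capture. The channel-to-camera mapping is an assumption."""
    fifth_view = [[pixel[0] for pixel in row] for row in rgb_image]    # e.g. third filtering waveband
    sixth_view = [[pixel[1] for pixel in row] for row in rgb_image]    # e.g. fourth filtering waveband
    seventh_view = [[pixel[2] for pixel in row] for row in rgb_image]  # remaining waveband
    return fifth_view, sixth_view, seventh_view

# A 1 x 2 image with one pure-red and one pure-green pixel:
views = three_channel_split([[(255, 0, 0), (0, 255, 0)]])
```

Because each view contains only one waveband, a monochrome sensor behind each port suffices, which is consistent with the option of using black-and-white cameras as the specified cameras.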
Optionally, the beam processing device includes a semi-reflective semi-transparent prism having a first light-exit portion and a second light-exit portion. The beam processing device splits the light entering from the light-entry portion through the semi-reflective semi-transparent prism, so that the light exits from the first light-exit portion and the second light-exit portion toward the cameras arranged opposite the respective light-exit portions. The image acquisition device includes a first camera arranged opposite the first light-exit portion and a second camera arranged opposite the second light-exit portion; the first camera generates a first stripe image from the collected light, the second camera generates a second stripe image from the collected light, and the first stripe image and the second stripe image each contain identifiable stripes of at least two colors.
Optionally, the beam processing device further includes an optical filter through which the light to be collected by the specified camera is separated, so that the specified camera receives light of a fifth filtering waveband, at least one of the plurality of cameras being the specified camera.
Optionally, the three-dimensional scanner further includes an illumination member, and the image acquisition device is further configured to collect the illumination light reflected by the target object while the target object is illuminated by the illumination member, so as to obtain texture data of the target object.
Optionally, the image acquisition device can identify red, green and blue light.
According to another aspect of the present application, a three-dimensional scanning system is provided. The system includes: a three-dimensional scanner for projecting light onto a target object and collecting the light modulated by the target object while the light is projected onto it, so as to obtain at least one stripe image, wherein the projected light includes preset light projected in the form of color-coded stripes, the color-coded stripes being formed by encoding stripes of at least two colors; and an image processor connected to the three-dimensional scanner, for acquiring the at least one stripe image collected by the three-dimensional scanner, determining the sequence of each stripe from the stripe image used as a coding map, and performing three-dimensional reconstruction of the target object from the stripe image used as a reconstruction map.
The three-dimensional scanner is any one of the three-dimensional scanners described above.
Optionally, in the case that the three-dimensional scanner collects the light modulated by the target object through a plurality of cameras, among which is at least one black-and-white camera, to obtain at least one stripe image, the image processor is further configured to: use the stripe image obtained by at least one black-and-white camera as a reconstruction map for three-dimensional reconstruction of the target object; and use the stripe images obtained by at least the black-and-white cameras, and/or the stripe image obtained by at least one color camera, as coding maps to determine the sequence of each stripe.
According to another aspect of the present application, a three-dimensional scanning method is provided. The three-dimensional scanning method includes: projecting preset light onto a target object in the form of color-coded stripes; collecting the light modulated by the target object and obtaining at least one stripe image from it, wherein the acquired stripe image serves both as a coding map for determining the sequence of each stripe and as a reconstruction map for three-dimensional reconstruction of the target object; determining the sequence of each stripe in the stripe images from the coding map; and performing three-dimensional reconstruction on the reconstruction map according to that sequence to obtain three-dimensional data of the target object.
The three-dimensional scanning method is applied to any one of the three-dimensional scanners described above.
Optionally, the three-dimensional scanning method further includes: projecting illumination light onto the target object and acquiring texture data of the target object from the illumination light; and obtaining color three-dimensional data of the target object from the three-dimensional data and the texture data of the target object.
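The decoding step of the method above, determining each stripe's sequence from the coding map, can be sketched as a window match against the projected color order. This is a generic spatial-decoding sketch under our own assumptions (window length, single-letter color labels); the patent does not specify this particular algorithm:

```python
def decode_stripe_sequence(observed, projected, window=2):
    """Assign each observed stripe its index in the projected color-coded
    sequence by matching a window of neighbouring stripe colors.
    Assumes every run of `window` consecutive colors occurs at most once
    in the projected sequence (a De Bruijn-style spatial code)."""
    indices = []
    for i in range(len(observed) - window + 1):
        key = tuple(observed[i:i + window])
        for j in range(len(projected) - window + 1):
            if tuple(projected[j:j + window]) == key:
                indices.append(j)  # observed stripe i matches projected stripe j
                break
        else:
            indices.append(None)   # window not found: occluded or misread stripe
    return indices

# Stripes B, C, M seen in the coding map are located inside the projected
# order R G B C M Y at positions 2 and 3:
positions = decode_stripe_sequence(list("BCM"), list("RGBCMY"))
```

Because the code is spatial rather than temporal, a single captured frame carries enough information to index every visible stripe, which is the property the application relies on to avoid high-frame-rate cameras.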
According to another aspect of the present application, a three-dimensional scanning method is provided. The three-dimensional scanning method includes: acquiring a first image and a second image, wherein the first image and the second image are stripe images acquired from the same light beam; determining the coding sequence of each stripe from the first image; and performing stripe matching on the stripes of the second image according to the coding sequence, so as to realize three-dimensional reconstruction and obtain three-dimensional data of a target object.
The three-dimensional scanning method is applied to any one of the three-dimensional scanners described above.
Optionally, the three-dimensional scanning method further includes: acquiring texture data, and obtaining color three-dimensional data of the target object from the three-dimensional data and the texture data.
The three-dimensional scanner provided by the embodiments of the present application projects light onto a target object through a projection device, wherein the light includes preset light projected in the form of color-coded stripes, the color-coded stripes being formed by encoding stripes of at least two colors. The image acquisition device collects the light modulated by the target object while the target object is illuminated by the projection device, so as to obtain at least one stripe image, wherein the photosensitive wavebands of the image acquisition device correspond one-to-one to the stripe colors contained in the color-coded stripes, and the acquired stripe image serves both as a coding map for determining the sequence of each stripe and as a reconstruction map for three-dimensional reconstruction of the target object. This solves the technical problem that existing three-dimensional reconstruction methods in the related art require costly hardware, which hinders the popularization of three-dimensional scanning devices.
It should be noted that the three-dimensional scanner of the embodiments of the present application obtains the three-dimensional morphology of the target object through a spatially coded stripe extraction algorithm. The scanner can therefore reconstruct the target object from as little as one frame of two-dimensional image, which greatly reduces the required camera frame rate and the computational cost of the algorithm, making the scanner easy to popularize. In particular, because no high-frame-rate camera is needed, the camera inside the scanner can be smaller, and the scanner is better suited to capturing the three-dimensional shape of objects inside the oral cavity.
Moreover, because the scanner can reconstruct the target object from as little as one frame of two-dimensional image, the acquisition time difference between the reconstruction image and the texture image is greatly shortened and the time required for projecting and shooting during three-dimensional reconstruction is reduced, which also makes the scanner better suited to capturing the three-dimensional shape of objects inside the oral cavity and convenient for handheld scanning.
In addition, because the three-dimensional scanner of the embodiments of the present application uses color as the spatial-coding information, the coded information is easy to identify and the identification accuracy is improved.
Finally, because the three-dimensional scanner obtains the three-dimensional morphology of the target object through a spatially coded stripe extraction algorithm, dynamic projection is no longer required.
Drawings
The accompanying drawings, which are included to provide a further understanding of the application, illustrate and explain the application and are not to be construed as limiting the application. In the drawings:
FIG. 1 is a first schematic diagram of an alternative three-dimensional scanner provided in accordance with an embodiment of the present application;
FIG. 2 is a schematic diagram of the diffusion and contrast of the red, green and blue colors on an object according to an embodiment of the present application;
FIG. 3 is a schematic illustration of the positional relationship between an illuminator and a reflective mirror provided in accordance with an embodiment of the present application;
FIG. 4 is a schematic diagram of a beam path in a beam processing device according to an embodiment of the present application;
FIG. 5 is a second schematic diagram of an alternative three-dimensional scanner provided in accordance with an embodiment of the present application;
FIG. 6 is a third schematic diagram of an alternative three-dimensional scanner provided in accordance with an embodiment of the present application;
FIG. 7 is a first flowchart of an alternative three-dimensional scanning method provided in accordance with an embodiment of the present application; and
FIG. 8 is a second flowchart of an alternative three-dimensional scanning method according to an embodiment of the present application.
Wherein the above figures include the following reference numerals:
10. a projection device; 20. an image acquisition device; 30. a lighting member; 40. a reflective mirror; 11. a light source emitter; 12. a color grating sheet; 13. a first imaging lens; 14. a beam coupling system; 15. a light rod; 16. a phase modulating element; 17. a driving motor; 21. a camera; 22. a semi-reflective semi-transparent prism; 23. an optical filter; 24. a right-angle two-channel dichroic prism; 25. a three-channel dichroic prism; 26. a second imaging lens; 111. a DLP emitter; 112. a laser emitter; 211. a first camera; 212. a second camera; 213. a third camera; 214. a fourth camera; 215. a fifth camera; 216. a sixth camera; 217. a seventh camera.
Detailed Description
It should be noted that, provided there is no conflict, the embodiments of the present application and the features therein may be combined with each other. The present application will be described in detail below with reference to the accompanying drawings and in conjunction with the embodiments.
In order to make the solution of the present application better understood by those skilled in the art, the technical solutions in the embodiments of the present application will be described clearly and completely below with reference to the accompanying drawings. The described embodiments are evidently only some, not all, of the embodiments of the present application. All other embodiments obtained by one of ordinary skill in the art based on the embodiments herein without inventive effort shall fall within the scope of protection of the present application.
It should be noted that the terms "first," "second," and the like in the description, the claims and the above figures are used to distinguish between similar objects and not necessarily to describe a particular sequence or chronological order. It is to be understood that data so used may be interchanged where appropriate, so that the embodiments described herein can be implemented in orders other than those illustrated or described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
According to an embodiment of the present application, a three-dimensional scanner is provided.
Fig. 1 is a schematic diagram of a three-dimensional scanner according to an embodiment of the present application. As shown in fig. 1, the three-dimensional scanner includes the following components:
the projection device 10, configured to project light onto a target object, wherein the light includes preset light projected in the form of color-coded stripes, the color-coded stripes being formed by encoding stripes of at least two colors; that is, stripes of at least two colors are arranged in a coded order to form the color-coded stripes.
It should be noted that the color-coded stripes may be formed by encoding stripes of several pure colors or of several non-pure colors. However, to make each color stripe easy to discriminate, color-coded stripes composed of several pure-color stripes, for example red, green, blue, cyan, magenta and yellow, are preferred; specifically, each of the R, G and B components of each color stripe is preferably either 0 or 255, with at most two components equal to 255 at the same time.
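As a small, non-authoritative illustration of the pure-color constraint just described (each R, G, B component equal to 0 or 255, at most two components at 255 simultaneously), the following sketch enumerates the admissible stripe colors; the function name is our own, not part of the patent:

```python
from itertools import product

def admissible_stripe_colors():
    """Enumerate RGB triples whose components are each 0 or 255,
    with at least one and at most two components equal to 255.
    This excludes black and white, leaving exactly the six pure
    colors named above: red, green, blue, cyan, magenta, yellow."""
    palette = []
    for rgb in product((0, 255), repeat=3):
        components_on = sum(1 for c in rgb if c == 255)
        if 1 <= components_on <= 2:
            palette.append(rgb)
    return palette

palette = admissible_stripe_colors()  # 6 colors; e.g. (255, 0, 0) is red
```

Restricting components to the extremes 0 and 255 keeps each stripe maximally saturated, which makes the color classification of each stripe robust to the diffusion effects discussed next.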
It should also be noted that different colors diffuse and transmit differently on the tooth surface. To obtain high-quality stripe patterns (more uniform stripes, with more uniform contrast between them), the present application sets the widths of the stripes in the color-coded stripes to different values, so as to balance the diffusion of the red, green and blue colors on the target object, reduce mutual interference between the color stripes, and improve the extraction accuracy of the color stripes.
Specifically, as shown in fig. 2, the three RGB colors diffuse differently and show different contrast on the object; the width of each color stripe is therefore adjusted so that the three colors diffuse uniformly and each color stripe has an even contrast, improving stripe extraction accuracy.
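One plausible way to realize the width adjustment described above is to make each color's projected stripe width inversely proportional to its measured diffusion on the target surface, so that the effective (diffused) widths come out similar. The diffusion figures and the scaling rule below are illustrative assumptions only, not values from the patent:

```python
def compensated_stripe_widths(base_width_px, diffusion):
    """Scale each color's projected stripe width inversely to its relative
    diffusion, taking the least diffusive color as the reference, so that
    the stripes appear with similar effective width and contrast."""
    reference = min(diffusion.values())
    return {color: round(base_width_px * reference / d)
            for color, d in diffusion.items()}

# Hypothetical relative diffusion of red, green and blue on a tooth surface:
widths = compensated_stripe_widths(8, {"red": 1.6, "green": 1.0, "blue": 1.2})
```

The more a color spreads on the surface, the narrower it is projected, so the stripes it produces end up comparable in width to the less diffusive colors.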
Alternatively, the projection device 10 may employ a transmissive projection mode.
Specifically, after the light source emitter 11 emits light of at least two different wavebands, the light is collimated and converged, passes through the MASK pattern, and the pattern is projected onto the target object by the first imaging lens 13.
That is, the projection device 10 includes: a light source emitter 11, a color grating sheet 12 and a first imaging lens 13, wherein the light source emitter 11 emits light of at least two different wavebands, the color grating sheet 12 and the first imaging lens 13 are arranged on the transmission path of the light, the light passes through the MASK pattern on the color grating sheet 12 and the pattern is projected onto the target object by the first imaging lens 13, and the color types contained in the MASK pattern on the color grating sheet 12 correspond one-to-one to the waveband types contained in the light it transmits.
In an alternative example, the light source emitter 11 may be a DLP emitter 111 or a laser emitter 112, the laser emitted by the laser emitter 112 having the following characteristics: directional emission, extremely high brightness, extremely pure color, and good coherence.
Taking the laser emitter 112 as an example, it should be noted that laser light tends to have a non-uniform aperture and divergence angle and a non-uniform light-field intensity. The projection device 10 provided in the embodiments of the present application therefore processes the beam through a beam coupling system 14 and a light rod 15, so as to adjust the aperture and divergence angle of the laser beam and output a light field of uniform intensity.
When the aperture and divergence angle of the laser beam are small, the beam coupling system 14 may consist of a collimating system and a converging lens, or of an optical system with an equivalent function. When the divergence angle of the laser light is large, the beam coupling system 14 may comprise a more complex converging system of three, four or more lens elements.
The optical rod 15 may be an elongated hexahedral, cylindrical or pyramidal prism. Its exit end face is parallel to its entrance end face, and both may be rectangular or square. The optical rod 15 may be a solid rod, in which light propagates inside a solid transparent medium, or a hollow rod, in which light is reflected multiple times within a space enclosed by four solid interfaces. For a solid rod, an antireflection film is coated on the exit and entrance end faces, and the side surfaces are either coated with a reflective film or left uncoated; for a hollow rod, a reflection-enhancing film is coated on the inner surfaces. Specifically, the light is reflected and mixed multiple times at the inner surfaces of the optical rod 15, so that a light field of uniform intensity is output.
That is, the projection apparatus 10 further includes: the light beam coupling system 14 and the light rod 15, wherein the light beam coupling system 14 and the light rod 15 are arranged on a transmission path of light, and the light rays with at least two different wave bands emitted by the light source emitter 11 are respectively projected onto the color grating sheet 12 through the light beam coupling system 14 and the light rod 15.
Taking the laser emitter 112 as an example, it should be noted that diffraction speckle can appear in the projected pattern because of the coherence of the laser light itself. Therefore, when the laser light source emitter 11 is used, the projection apparatus 10 provided in the embodiment of the present application further includes a phase modulation element 16 and a driving motor 17. Specifically, as shown in fig. 2, the phase modulation element is disposed on the transmission path of the laser beam: after the light source emitter 11 emits at least two laser beams of different wavebands, the phase modulation element modulates the phase of the laser beams in real time; in addition, the driving motor 17 drives the phase modulation element to rotate about its rotation axis at a certain speed.
The phase modulation element can be a transparent optical material sheet, a micro-optical element or a random phase plate.
The phase modulation element may be located before the beam coupling system 14 or may be located after the beam coupling system 14.
Taking fig. 1 as an example, the above-mentioned projection device 10 may include the following components: three laser emitters 112, two half-reflective, half-transmissive beam splitters, a phase modulation element 16 (and a driving motor 17 connected to it), a beam coupling system 14, an optical rod 15, a color grating sheet 12, and a first imaging lens 13.
The projection device 10 emits laser beams through the three laser emitters 112, for example one emitting a red beam, one a green beam and one a blue beam. The beams pass through the two half-reflective, half-transmissive beam splitters to achieve beam convergence. The converged beam is transmitted through the rotating phase modulation element 16, which prevents diffraction speckle in the projected pattern caused by the coherence of the laser light. The beam then passes through the beam coupling system 14 and the optical rod 15 to adjust the aperture and divergence angle of the laser light and output a light field of uniform intensity. Finally, the beam is transmitted through the color grating sheet 12 to generate the preset light projected in the form of color-coded stripes, which is projected onto the target object through the first imaging lens 13. Of course, the projection device 10 may be provided with only two laser emitters 112, as long as laser beams of at least two colors can be emitted to form the color stripes.
In addition, the three-dimensional scanner may further include a reflective mirror 40 for changing the transmission path of light. In this embodiment, the mirror 40 may reflect the preset light generated by the projection device 10 so as to change its transmission path: the preset light is reflected onto the target object by the mirror 40 and reflected back to the image acquisition device 20 by the target object. This relaxes the constraints on mounting the projection device 10 and the image acquisition device 20 and reduces the space they require. For example, the space needed for the projection device 10 to project the preset light onto the target object comprises the volume of the projection device 10 itself plus the space corresponding to the path along which the preset light travels to the target object. If the projection device 10 is used inside the oral cavity without the mirror 40, these two spaces must be arranged in a straight line, which is highly inconvenient; with the mirror 40, the path is folded, so the projection device 10 can make better use of the space inside the oral cavity and achieve a good projection effect.
Alternatively, the projection device 10 may employ a DLP projector.
In particular, a DLP projector uses DLP (Digital Light Processing) projection technology, with a digital micromirror device (DMD, Digital Micromirror Device) as the key processing element to implement digital optical processing. It should be noted that by adopting a DLP projector as the projection apparatus 10, the present application achieves high-contrast images while keeping the projected colors bright.
In an alternative example, embodiments of the present application provide a projection module with a pixel size of 7 µm to 8 µm. Specifically, when the three-dimensional scanner provided in the embodiments of the present application is applied to three-dimensional tooth scanning, the digital micromirror device in the projection apparatus 10 may contain an array of up to 2048×1152 micromirrors; when it projects the preset light onto a single tooth (about 15 mm wide), color-coded fringes with a single pixel size of about 7.3 µm are obtained. It should be noted that a smaller pixel size reduces interference between adjacent stripe images on the teeth.
For example, the projection device 10 provided in this embodiment of the present application may use a DLP LightCrafter. Specifically, its optical engine may be an RGB LED light-source engine developed by Yangming Optical specifically for the DLP3000 DMD, with the DLP3000 DMD mounted at the end of the engine. The DLP3000 DMD of the 0.3 WVGA chipset consists of 415,872 micromirrors at a 7.6 µm pitch, forming a 608×684 micromirror matrix, and can generate an image with up to WVGA (854×480) resolution.
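As a quick check of the figures above, the stated pixel size follows from dividing the width of a single tooth by the longest micromirror row:

```python
# Worked check of the ~7.3 um pixel size stated in the text.
tooth_width_mm = 15.0     # approximate width of a single tooth (from the text)
mirrors_per_row = 2048    # largest DMD array dimension mentioned above

pixel_um = tooth_width_mm * 1000 / mirrors_per_row
print(round(pixel_um, 2))  # ≈ 7.32 micrometres per projected stripe pixel
```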
The image acquisition device 20 is configured to acquire the light reflected by the target object. In this embodiment, it acquires at least one fringe image by collecting the light modulated by the target object while the target object is projected by the projection device 10, the acquired fringe image serving as a code map to determine each fringe sequence and as a reconstruction map to reconstruct the target object in three dimensions; it also collects the illumination light reflected by the target object while the target object is illuminated by the illumination member 30.
It should be noted that: since the target object is projected with the projection device 10, a predetermined light included in the projected light is also projected onto the target object; at this time, the preset light is projected onto the target object in the form of color coding stripes, and the color coding stripes are mapped on the target object; further, the image capturing device 20 captures the color-coded fringes mapped onto the target object to obtain at least one fringe image.
That is, the light modulated by the target object is: the target object modulates the preset light with its own shape, so that the color-coded stripe corresponding to the preset light changes correspondingly based on the own shape of the target object, and at this time, the image acquisition device 20 acquires the color-coded stripe after the change to generate at least one stripe image.
Preferably, the image capturing device 20 acquires at least two fringe images simultaneously, each of them corresponding to the same modulated color-coded fringe. Specifically, the projection device 10 projects a color-coded fringe onto the target object; after modulation by the target object it is synchronously collected by the image acquisition device 20, which generates at least two fringe images in real time.
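The embodiment does not specify how the color-coded stripes are sequenced; a common choice for spatial color codes of this kind is a De Bruijn sequence over the stripe colors, in which every short window of colors occurs exactly once, so a stripe's local neighborhood determines its position in the code. The following Python sketch generates such a sequence (standard recursive construction); it is illustrative, not the patent's scheme.

```python
def de_bruijn(colors, n):
    """Cyclic De Bruijn sequence B(k, n) over the given color alphabet:
    every length-n window of colors occurs exactly once."""
    k = len(colors)
    a = [0] * k * n
    seq = []

    def db(t, p):
        if t > n:
            if n % p == 0:
                seq.extend(a[1:p + 1])
        else:
            a[t] = a[t - p]
            db(t + 1, p)
            for j in range(a[t - p] + 1, k):
                a[t] = j
                db(t + 1, t)

    db(1, 1)
    return [colors[i] for i in seq]
```

With three colors and windows of length 3 this yields 27 stripes, each uniquely identified by itself and its two neighbors. (Practical stripe codes often add a constraint that adjacent stripes differ in color, which this minimal sketch omits.)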
The three-dimensional scanner provided by the embodiment of the application projects light onto a target object through the projection device 10, the light including preset light projected in the form of color-coded stripes composed of stripes of at least two colors. The image capturing device 20 captures the light modulated by the target object while it is projected by the projection device 10 to obtain at least one fringe image, where the photosensitive bands of the image capturing device 20 correspond to the stripe colors contained in the color-coded stripes, so that it can capture at least two of those colors; generally, the projection device is matched with the image capturing device so that the colors contained in the preset light of the projection device can be captured by the image capturing device. The captured fringe image is used as a code map to determine each fringe sequence and as a reconstruction map to reconstruct the target object in three dimensions. This solves the technical problem in the related art that the hardware cost required by existing three-dimensional reconstruction methods is high, which hinders the popularization and use of three-dimensional scanning devices.
It should be noted that: since the three-dimensional scanner mentioned in the embodiment of the application is based on a space coding stripe extraction algorithm, the three-dimensional morphology of the target object is obtained. Therefore, the three-dimensional scanner can realize three-dimensional reconstruction of the target object by only needing at least one frame of two-dimensional image, so that the frame rate of the camera 21 and the operation cost of an algorithm are greatly reduced, and the three-dimensional scanner is convenient to popularize and use; specifically, the three-dimensional scanner does not need to use the camera 21 with higher frame rate, so that the volume of the camera 21 required in the three-dimensional scanner can be reduced, and the three-dimensional scanner is more suitable for acquiring the three-dimensional shape of the object in the oral cavity.
Moreover, because the scanner can reconstruct the target object in three dimensions from as little as one frame of two-dimensional image, the acquisition time difference between the reconstruction image and the texture image is greatly shortened and the time required for projection and shooting is reduced, which also makes the scanner more suitable for acquiring the three-dimensional shape of objects inside the oral cavity (and convenient for handheld scanning).
In addition, because the three-dimensional scanner provided by the embodiment of the application uses colors as the information of space coding, the technical effects that the coded information is easy to identify and the identification accuracy is improved are also realized.
In addition, because the three-dimensional scanner acquires the three-dimensional morphology of the target object based on a spatially coded stripe extraction algorithm, it achieves the technical effect of eliminating the need for dynamic projection.
Optionally, in the three-dimensional scanner provided in the embodiment of the present application, the image capturing device 20 includes a plurality of cameras 21, at least one of which is a black-and-white camera. The image capturing device 20 processes the light modulated by the target object through the plurality of cameras 21 to obtain a plurality of stripe images, where the stripe image obtained by the at least one black-and-white camera is used as a reconstruction map to reconstruct the target object in three dimensions; in addition, at least the stripe images obtained by the black-and-white cameras are used as code maps to determine each stripe sequence, and/or the stripe image obtained by at least one color camera is used as a code map to determine each stripe sequence.
It should be noted that: as the stripe information contained in at least one stripe image of the code pattern, it is necessary to be able to determine the code sequence of each stripe; that is, the code map is composed of a stripe image in which the code sequence of each stripe can be determined.
That is, a pre-designed color-coded fringe image is projected onto a target object (e.g., teeth or gums) by the projection device 10, while the image capturing device 20 is controlled to rapidly capture an image of the target object with the projected pattern, wherein the cameras 21 included in the image capturing device 20 respectively capture different fringe images, for example: camera a is a color camera, a color stripe image is obtained, and camera B is a black-and-white camera, a black-and-white stripe image is obtained. At this time, the color stripe image and the black-and-white stripe image are transmitted to the computer end, the computer end uses the color stripe image as coding information, and uses the black-and-white stripe image as a reconstruction map, so as to obtain the three-dimensional shape of the target object.
It should be noted that: since the black-and-white camera has a higher imaging resolution than the color camera, if the image capturing apparatus 20 captures a streak image with only one color camera, this may result in a lower resolution. In order to avoid the situation that the three-dimensional reconstruction is difficult due to the low resolution, in the above embodiment, the image capturing device 20 includes a plurality of cameras 21, and at least one black-and-white camera is included in the plurality of cameras 21, and a black-and-white stripe image with a high imaging resolution is used as a reconstruction map to obtain the three-dimensional morphology of the target object. Taking the camera 21 included in the image capturing device 20 as an example, a CCD camera will be described as follows: the color coding stripes corresponding to the preset light rays are assumed to be formed by two color stripe codes (such as red and blue), at this time, the image acquisition device 20 acquires different stripe images through different CCD cameras, for example, the color CCD camera acquires stripe images containing two colors of red and blue, the black-and-white CCD camera acquires stripe images containing one color of blue (blue filter is arranged in front of the black-and-white CCD camera), at this time, the stripe images acquired by the color CCD camera are adopted to identify and match the sequence codes of the blue stripes, and then a three-dimensional reconstruction algorithm and a splicing fusion algorithm are performed according to the obtained sequence codes and the stripe images acquired by the black-and-white CCD camera, so as to construct the three-dimensional morphology of the target object. 
It should be noted that: the CCD camera has the characteristics of small volume, light weight, no influence of a magnetic field, shock resistance and impact resistance, so that in the case that the three-dimensional scanner adopts the 2CCD camera to acquire the fringe image, the volume of the three-dimensional scanner can be correspondingly reduced, and the three-dimensional scanner is convenient to hold and use and is applied to an environment to be scanned (such as an oral cavity) with a small space.
It should be noted that: the black-and-white CCD camera is provided with a color filter 23 of a specified color as an alternative, which is not particularly limited in the embodiment of the present application. However, if the filter 23 with the specified color is arranged in front of the black-and-white CCD camera, the black-and-white CCD camera can acquire the stripe image with the specified color, and at this time, the stripe image with the specified color is more beneficial to the subsequent three-dimensional reconstruction algorithm and the stitching fusion algorithm, so as to construct the three-dimensional shape of the target object.
It should be noted that: the camera is not particularly limited, and a technician can make corresponding substitutions according to technical requirements, for example, the camera can be a CCD camera or a CMOS camera.
Optionally, in the three-dimensional scanner provided in the embodiment of the present application, the photosensitive bands configured for the image capturing device 20 include at least a plurality of designated bands, and the plurality of designated bands correspond to the stripe colors contained in the color-coded stripes. That is, in an alternative example, a color camera is provided in the image capturing apparatus 20, and the color camera is capable of capturing a plurality of stripe colors in the color-coded stripes corresponding to the preset light, so as to determine each stripe sequence. The designated band described in the present application may be one designated band or a plurality of designated bands.
In addition, as shown in fig. 3, the three-dimensional scanner may further include an illumination member 30 for illuminating the target object so that a texture map of the target object can subsequently be acquired. The illumination member 30 is preferably a white LED lamp, enabling true-color scanning, that is, acquiring a three-dimensional model whose color is consistent or substantially consistent with that of the target object. The illumination member 30 may be disposed at the outer periphery of the reflective mirror 40, or disposed at another part of the scanner and used together with the mirror 40, with the illumination light reflected onto the target object by the mirror 40; for example, the illumination member 30 may be located on the side of the first imaging lens 13 near the light source emitter 11, so that both the illumination light and the light projected by the light source emitter 11 pass through the first imaging lens 13 and are reflected onto the target object by the mirror 40. Specifically, the three-dimensional scanner includes a grip portion and an entrance portion provided at the front end of the grip portion; the projection apparatus 10 and the image pickup apparatus 20 are both mounted on the grip portion, the mirror 40 is mounted on the entrance portion, and the illumination member 30 may be mounted on either the entrance portion or the grip portion.
It should be noted that, the image capturing device 20 may identify and determine red light, green light and blue light, so that the image capturing device 20 may capture a texture map of the target object based on the illumination light.
Optionally, the three-dimensional scanner provided in the embodiment of the present application may further include a timing control circuit connected to the projection device 10, the illumination member 30 and the image acquisition device 20. The timing control circuit controls the projection device 10 to project light onto the target object while synchronously controlling the image acquisition device 20 to acquire the fringe images, and controls the illumination member 30 to illuminate the target object while synchronously controlling the image acquisition device 20 to acquire the texture map; preferably, it controls the projection device 10 and the illumination member 30 to project light onto the target object alternately.
Optionally, in the three-dimensional scanner provided in the embodiment of the present application, the image collecting device 20 further includes a beam processing device having a light inlet portion and at least two light outlet portions, each camera 21 being disposed corresponding to a different light outlet portion; the image collecting device 20 collects the light modulated by the target object through the beam processing device.
That is, by providing the beam processing device, the stripe images respectively acquired by the plurality of cameras 21 of the image pickup device 20 have completely identical fields of view and viewing angles; in other words, the plurality of cameras 21 each receive the coaxial light incident through the same second imaging lens 26, the coaxial light being projected to each of them. Specifically, as shown in fig. 4, the image light of the target object enters through the light inlet portion of the beam processing device; the beam processing device splits the image light so that it exits from the at least two light outlet portions and is projected to the plurality of cameras 21. The stripe images acquired by the cameras 21 are thus acquired at the same viewing angle and based on the same modulated color-coded stripes, and the stripe sequences in those images are correlated through those same stripes, which facilitates the subsequent three-dimensional reconstruction of the stripe images.
In an alternative example, the beam processing apparatus further includes at least one first beam splitting unit for splitting the light incident from the light inlet portion so that it exits from the at least two light outlet portions to the cameras 21 disposed at those portions. Specifically, the first beam splitting unit splits the light of every color into two directions; for example, one beam containing red and blue light is processed by the first beam splitting unit into two beams each containing red and blue light, which exit in different directions.
That is, at least one first beam splitting unit is disposed in the beam processing apparatus, and the first beam splitting unit is configured to perform beam splitting processing on light incident from the light incident portion, so that image light of the target object can be respectively projected from at least two light emergent portions, and each of the cameras 21 disposed corresponding to the at least two light emergent portions can acquire fringe images with the same viewing angle.
In another alternative example, the beam processing apparatus further includes at least one second beam splitting unit for separating the light to be acquired by a specified camera so that each camera acquires light containing its specified waveband. Specifically, the second beam splitting unit either separates light of one partial waveband from the incident light, that partial-waveband light exiting in one direction, or separates light of two partial wavebands, which exit in different directions; for example, a beam containing red and blue light is processed by the second beam splitting unit into one blue beam and one red beam that exit in different directions. The color-coded stripes contain stripes of the colors corresponding to the specified wavebands.
That is, at least one second beam splitting unit is disposed in the beam processing device, and the second beam splitting unit is configured to split the light projected onto the second beam splitting unit, so that a portion of the light in the projected light passes through the second beam splitting unit, and another portion of the light in the projected light is reflected from the surface of the second beam splitting unit (or so that another portion of the light in the projected light is absorbed by the second beam splitting unit), and further the specified camera obtains the light including the specified wavelength.
It should be noted that: the specified camera is the black-and-white camera.
Optionally, in the three-dimensional scanner provided in the embodiment of the present application, the three-dimensional scanner may further include: a heat dissipation system, a heating anti-fog system, a software algorithm system and the like.
The heat dissipation system prevents the inside of the three-dimensional scanner from overheating and damaging the scanner.
The heating anti-fog system prevents fogging of the optical instruments in the three-dimensional scanner, which would otherwise prevent accurate fringe images from being obtained.
The software algorithm system is used for performing three-dimensional reconstruction on the target object according to at least one fringe image acquired by the image acquisition device 20.
In order to enable those skilled in the art to more clearly understand the technical solutions of the present application, the following description will be made with reference to specific embodiments.
Embodiment one:
taking fig. 1 as an example, the beam processing apparatus includes a half-reflective, half-transmissive prism 22 having a first light-emitting portion and a second light-emitting portion. The beam processing apparatus transmits and reflects light through the prism 22, thereby splitting the light projected from the light inlet portion so that it is projected from the first and second light-emitting portions to the cameras 21 correspondingly disposed at those portions. The image capturing apparatus 20 further includes a first camera 211 disposed corresponding to the first light-emitting portion and a second camera 212 disposed corresponding to the second light-emitting portion; the first camera 211 generates a first stripe image from the captured light, and the second camera 212 generates a second stripe image from the captured light, where the first and second stripe images contain stripes of at least two colors and those stripes are identifiable.
The light beam processing device further comprises a light filter 23, and the light beam processing device performs separation processing on the light acquired by the specified camera through the light filter 23 so that the specified camera acquires the light containing the fifth filtering wave band, wherein at least one camera of the plurality of cameras is the specified camera.
Specifically, the optical filter 23 is disposed between the first light emitting portion and the first camera 211, so that the first camera 211 obtains light of the fifth filtering band, and/or disposed between the second light emitting portion and the second camera 212, so that the second camera 212 obtains light of the fifth filtering band.
It should be noted that: taking the case where the optical filter 23 is disposed between the first light emitting portion and the first camera 211, the first camera 211 is made to acquire the light of the fifth filtering band as an example for explanation: the two color stripes included in the first stripe image are respectively a black stripe and a white stripe, wherein the color stripe is a color code stripe, and the corresponding stripe color is a filtering color corresponding to the optical filter 23.
At this time, among the stripes of at least two colors contained in the second stripe image, at least one has the pass color of the optical filter 23, so that the second stripe image can be used to identify the coding sequence of the stripes contained in the first stripe image.
Specifically, the first camera is a black-and-white camera and the second camera is a color camera, with the optical filter 23 disposed corresponding to the black-and-white camera. Take as an example the projection device 10 projecting red-green-blue color-coded stripes (that is, color-coded stripes comprising red, green and blue stripes), with the optical filter 23 preferably a blue filter. The projection device 10 projects the red-green-blue color-coded stripes onto the target object; after being modulated by the target object, the stripes are transmitted to the beam processing device and split by the half-reflective, half-transmissive prism 22 into two beams through transmission and reflection. The blue light of the first beam passes through the optical filter 23 and is collected by the black-and-white camera, which generates a first stripe image containing blue stripes; the other beam is collected by the color camera, which generates a second stripe image containing red, green and blue stripes. Each stripe in the first stripe image corresponds to a blue stripe in the second stripe image. The second stripe image is used as the code map: by determining the coding sequence of each stripe in the second stripe image, the coding sequence of the corresponding stripes in the first stripe image can be determined, and the first stripe image is used as the reconstruction map to reconstruct the target object in three dimensions.
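The sequence-determination step described above can be illustrated with a small sketch: given the ordered stripe colors read from the second (color) stripe image and the projected code sequence, each stripe is assigned its index in the code by locating its color window, assuming the code was designed so that short windows are unique (as in a De Bruijn code). This is an illustrative decoding, not the embodiment's algorithm; the window length 3 is an assumption.

```python
def decode_stripes(colors, code):
    """Assign each detected stripe its index in the projected code by
    finding the position of its length-3 color window; returns None for
    windows not found in the code (e.g. due to occlusion)."""
    n = 3
    windows = {tuple(code[i:i + n]): i for i in range(len(code) - n + 1)}
    return [windows.get(tuple(colors[i:i + n]))
            for i in range(len(colors) - n + 1)]
```

Once a blue stripe's code index is known from the color image, the corresponding blue stripe in the black-and-white image inherits that index for triangulation.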
Of course, the filter 23 in front of the black-and-white camera may be omitted, in which case the first stripe image obtained by the black-and-white camera contains red, green and blue stripes; alternatively, a two-color filter 23 may be placed in front of the black-and-white camera so that light of two of the three colors is collected by it. The filter 23 may also be disposed in front of the color camera. For example, with a red filter in front of the color camera, the color camera generates a second stripe image containing red stripes; the blue stripes in the first stripe image correspond to the blue stripes of the red-green-blue color-coded stripes, and the red stripes in the second stripe image correspond to the red stripes of the color-coded stripes. Since the monochromatic filter 23 in front of the black-and-white camera passes only one specific light, the stripes in the first stripe image collected by the black-and-white camera can be identified, and the coding sequence of each stripe can be determined by combining the first and second stripe images; both images are then used as code maps, and the first stripe image is used as the reconstruction map. Alternatively, a two-color filter 23 may be disposed in front of the color camera: taking a red-green filter 23 as an example, the color camera generates a second stripe image containing red and green stripes, the first and second stripe images are used together as code maps (or only the second stripe image is used as the code map), and the first stripe image is used as the reconstruction map.
In some embodiments, the image acquisition device 20 can identify and determine only two of red light, green light and blue light, and therefore cannot completely acquire texture data of the target object under white light; in other embodiments, the image acquisition device 20 can identify and determine red light, green light and blue light, and can completely acquire texture data of the target object under white light, thereby realizing acquisition of color three-dimensional data.
It is worth emphasizing that: in this embodiment, the beam processing device transmits and reflects light through the half-reflecting half-transmitting prism 22, thereby splitting the light projected from the light inlet portion so that the light is projected from the first light outlet portion and the second light outlet portion to the cameras correspondingly arranged at the respective light outlet portions; that is, the beam processing device realizes the function corresponding to the first beam splitting unit through the half-reflecting half-transmitting prism 22.
At the same time, it is also worth emphasizing that: in this embodiment, the beam processing device performs separation processing on the light to be acquired by the specified camera through the optical filter 23 so that the specified camera acquires the light containing the specified wavelength band; that is, the beam processing apparatus realizes the function corresponding to the second beam splitting unit through the filter 23.
Embodiment two:
taking fig. 5 as an example, the beam processing device includes a right-angle two-channel dichroic prism 24, and the right-angle two-channel dichroic prism 24 includes a third light-emitting portion and a fourth light-emitting portion, where the beam processing device performs, through the right-angle two-channel dichroic prism 24, beam splitting processing on a light beam projected from a light-in portion, so that the light beam is projected from the third light-emitting portion and the fourth light-emitting portion to cameras 21 correspondingly disposed in the respective light-emitting portions; correspondingly, the image acquisition device 20 comprises a third camera 213 corresponding to the third light emitting part and a fourth camera 214 corresponding to the fourth light emitting part, the third camera 213 generates a third stripe image based on the acquired light, the fourth camera 214 generates a fourth stripe image based on the acquired light, and the third stripe image and the fourth stripe image comprise stripes with at least two colors and the stripes with at least two colors are identifiable;
in addition, the beam processing device further performs separation processing on the light beam acquired by the specified camera through the right-angle two-channel dichroic prism 24, so that the specified camera acquires the light beam including the specified wavelength band, where the specified camera acquires the light beam including the specified wavelength band includes: the third camera 213 may obtain light of the first filtering band, and/or the fourth camera 214 may obtain light of the second filtering band.
It should be noted that: the following description takes as an example the case where the beam processing device separates, through the right-angle two-channel dichroic prism 24, the light to be acquired by the third camera 213, so that the third camera 213 acquires light of the first filtering band: the stripes of two colors contained in the third stripe image are black stripes and white stripes, where the white stripes correspond to the stripes in the color-coded pattern whose color is the filtering color of the first filtering band.
At this time, at least one of the at least two colors of stripes contained in the fourth stripe image corresponds to a filtering color, so that the fourth stripe image can be used to identify the coding sequence of the stripes contained in the third stripe image.
Specifically, the third camera is a black-and-white camera and the fourth camera is a color camera. Take the projection device 10 projecting red-green-blue color-coded stripes (i.e. color-coded stripes including red stripes, green stripes and blue stripes) as an example: the projection device 10 projects the red-green-blue color-coded stripes onto the target object; after being modulated by the target object, the stripes are decomposed by the right-angle two-channel dichroic prism 24 into red-green coded stripes and blue coded stripes. The blue coded stripes are collected by the black-and-white camera, which generates a third stripe image containing blue stripes; the red-green coded stripes are collected by the color camera, which generates a fourth stripe image containing red stripes and green stripes. The blue stripes in the third stripe image complement the stripes in the fourth stripe image; specifically, the third stripe image combined with the fourth stripe image corresponds to the complete red-green-blue color-coded stripes. The fourth stripe image is used as a coding map: since the fourth stripe image is collected by the color camera, the red stripes and green stripes in it can be identified, and the coding sequence of each stripe can be determined by combining the third stripe image with the fourth stripe image; the third stripe image is used as the reconstruction map, and three-dimensional reconstruction of the target object is performed based on the determined sequence. Of course, since in this embodiment the black-and-white camera acquires only monochromatic light, the stripes in the third stripe image can also be identified and determined, the third stripe image can be combined with the fourth stripe image to determine the coding sequence of each stripe, and both the third and fourth stripe images are used as coding maps.
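The dichroic split described above is complementary rather than duplicating: the blue channel goes to one camera and the red-green channel to the other, and only their union reproduces the full coded pattern. A toy sketch (pattern and helper names are assumptions for illustration):

```python
# Illustrative sketch of Embodiment two: the right-angle two-channel dichroic
# prism separates the projected pattern into a blue part (black-and-white
# camera, "third image") and a red/green part (color camera, "fourth image").
PATTERN = ["R", "B", "G", "B", "R", "G"]

def split_by_prism(pattern):
    """Return the two partial stripe images as {index: color} maps."""
    third = {i: c for i, c in enumerate(pattern) if c == "B"}          # blue only
    fourth = {i: c for i, c in enumerate(pattern) if c in ("R", "G")}  # red + green
    return third, fourth

def merge(third, fourth):
    """Recombine the partial images into the full coded sequence."""
    merged = {**third, **fourth}
    return [merged[i] for i in sorted(merged)]

third, fourth = split_by_prism(PATTERN)
assert merge(third, fourth) == PATTERN  # combined images reproduce the code
```

Because each index appears in exactly one partial image, the combination uniquely determines every stripe's place in the coding sequence.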
In addition, the optical filter 23 may or may not be provided in this embodiment; when provided, the optical filter 23 is matched with the right-angle two-channel dichroic prism 24.
It is worth emphasizing that: in this embodiment, the beam processing device performs the beam splitting processing on the light beam projected from the light inlet portion by using the right-angle two-channel dichroic prism 24, so that the light beam is projected from the third light outlet portion and the fourth light outlet portion to the cameras 21 correspondingly disposed in the respective light outlet portions; that is, the beam processing apparatus realizes the function corresponding to the first beam splitting unit by the right-angle two-channel dichroic prism 24.
Similarly, it is also worth emphasizing that: in this embodiment, the beam processing device further performs separation processing on the light to be acquired by the specified camera through the right-angle two-channel dichroic prism 24, so that the specified camera acquires the light containing the specified wavelength band; that is, the beam processing apparatus realizes the function corresponding to the second beam splitting unit by the right-angle two-channel dichroic prism 24.
Embodiment three:
taking fig. 6 as an example, the beam processing device includes a three-way dichroic prism 25, and the three-way dichroic prism 25 includes a fifth light-emitting portion, a sixth light-emitting portion, and a seventh light-emitting portion, where the beam processing device performs a beam splitting process on a light beam projected from a light-in portion through the three-way dichroic prism 25, so that the light beam is respectively projected from the fifth light-emitting portion, the sixth light-emitting portion, and the seventh light-emitting portion to the cameras 21 correspondingly disposed in the respective light-emitting portions;
Correspondingly, the image capturing device 20 includes a fifth camera 215 corresponding to the fifth light emitting portion, a sixth camera 216 corresponding to the sixth light emitting portion, and a seventh camera 217 corresponding to the seventh light emitting portion, the fifth camera 215 generating a fifth stripe image based on the captured light, the sixth camera 216 generating a sixth stripe image based on the captured light, the seventh camera 217 generating a seventh stripe image based on the captured light, and at least two color stripes being identifiable among the fifth stripe image, the sixth stripe image, and the seventh stripe image;
the beam processing device performs separation processing on the light acquired by the specified camera through the three-way dichroic prism 25, so that the specified camera acquires the light including the specified wave band, where the specified camera acquires the light including the specified wave band at least includes: the fifth camera 215 acquires light of a third filter band, the sixth camera 216 acquires light of a fourth filter band, and the third filter band is different from the fourth filter band.
At least one of the fifth camera, the sixth camera and the seventh camera is a black-and-white camera. Specifically, the fifth camera is a black-and-white camera while the sixth and seventh cameras are color cameras; or the fifth and sixth cameras are black-and-white cameras and the seventh camera is a color camera. Preferably, the fifth camera 215, the sixth camera 216 and the seventh camera 217 are all black-and-white cameras.
It should be noted that: since the photosensitive bands of the image capturing apparatus 20 in the present application correspond to the stripe colors contained in the color-coded stripes one by one, in the case where the fifth camera 215, the sixth camera 216 and the seventh camera 217 are all black-and-white cameras, the stripe colors contained in the color-coded stripes are three, where at least two stripe colors have a correspondence relationship with the third filtering band and the fourth filtering band.
For example: the color coding stripes consist of red stripes, blue stripes and green stripes; at this time, the filtering color corresponding to the first filtering surface may be red, and the filtering color corresponding to the second filtering surface may be blue; at this time, the obtained fifth stripe image is a black-and-white stripe image, wherein the white stripe corresponds to the red stripe in the color coding stripe; the sixth stripe image is a black and white stripe image, wherein the white stripe corresponds to the blue stripe in the color coded stripe.
For example: the color-coded stripes consist of red stripes, blue stripes and yellow stripes; at this time, the filtering color corresponding to the first filtering surface may be red, and the filtering color corresponding to the second filtering surface may be green. The fifth stripe image obtained is then a black-and-white stripe image in which the white stripes correspond to the red stripes and the yellow stripes in the color-coded stripes (in optics, yellow light is a combination of green light and red light); the sixth stripe image obtained is a black-and-white stripe image in which the white stripes correspond to the yellow stripes in the color-coded stripes.
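The two examples above follow one rule: a stripe appears white in a black-and-white camera's image exactly when the stripe's color contains that camera's filtering color. A small sketch of this correspondence (the color decompositions are standard additive mixing; the function names are illustrative):

```python
# Sketch of the filter/stripe correspondence: 1 = white stripe (filtering
# color present in the stripe's light), 0 = black stripe.
COMPONENTS = {
    "red": {"R"}, "green": {"G"}, "blue": {"B"},
    "yellow": {"R", "G"},  # yellow light = red light + green light
}

def binary_image(stripes, filter_color):
    """Black-and-white stripe image seen through a single-color filter."""
    f = COMPONENTS[filter_color]
    return [1 if f <= COMPONENTS[s] else 0 for s in stripes]

stripes = ["red", "blue", "yellow"]
print(binary_image(stripes, "red"))    # red and yellow stripes appear white
print(binary_image(stripes, "green"))  # only the yellow stripe appears white
```

With two such binary images taken through different filters, each stripe gets a distinct bit pattern, which is what lets the combined images recover the coding sequence.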
In an alternative example, the beam processing apparatus further performs separation processing on the light to be acquired by the specified camera through the three-way dichroic prism 25, so that the seventh camera 217 acquires light of a sixth filter band, and the sixth filter band is different from the third filter band and the fourth filter band.
For example, color-coded stripes consist of red stripes, blue stripes and green stripes; at this time, the filtering color corresponding to the first filtering surface may be red, the filtering color corresponding to the second filtering surface may be blue, and the filtering color corresponding to the third filtering surface may be green; at this time, the seventh stripe image obtained is a black-and-white stripe image in which the white stripe corresponds to the green stripe in the color-coded stripe.
At this time, any one of the fifth stripe image, the sixth stripe image and the seventh stripe image may be used as a reconstruction map to perform three-dimensional reconstruction of the target object. For example, the fifth stripe image is used as the reconstruction map for three-dimensional reconstruction of the target object, and the fifth, sixth and seventh stripe images are used together as the coding map for determining the sequence of each stripe. Further, it is preferable that the fifth, sixth and seventh stripe images are all used as reconstruction maps.
It is worth emphasizing that: in this embodiment, the beam processing device performs the beam splitting processing on the light beam projected from the light inlet portion by using the three-way dichroic prism 25, so that the light beam is projected from the fifth light outlet portion, the sixth light outlet portion, and the seventh light outlet portion to the cameras 21 respectively provided corresponding to the light outlet portions; that is, the beam processing apparatus realizes the function corresponding to the first beam splitting unit through the three-way dichroic prism 25.
Similarly, it is also worth emphasizing that: in this embodiment, the beam processing device further performs separation processing on the light to be acquired by the specified camera through the three-way dichroic prism 25, so that the specified camera acquires the light containing the specified wavelength band; that is, the beam processing apparatus realizes the function corresponding to the second beam splitting unit through the three-way dichroic prism 25.
It should be noted that: Embodiments one, two and three are merely examples given in the present application so that those skilled in the art can more clearly understand the technical solution of the present application; the present application is not specifically limited thereto. Any other specific device that can realize the functional description of the beam processing device in the present application may likewise serve as an implementation of the technical solution of the present application.
In addition, it should also be stated that: Embodiments one, two and three listed in the present application may be combined with one another to realize the functional description of the beam processing device in the present application. For example, in Embodiments two and three, after the beam processing device realizes the function corresponding to the second beam splitting unit through the right-angle two-channel dichroic prism 24 or the three-way dichroic prism 25, it may additionally realize the function corresponding to the second beam splitting unit through an optical filter.
In summary, compared with the prior art, the technical scheme has the following beneficial effects:
1. the stripe extraction algorithm based on spatial coding achieves the technical aim of three-dimensional reconstruction of a target object from only one frame of two-dimensional image, reducing the required frame rate of the camera 21 and the computational cost of the algorithm;
2. the color is used as the space coding information, so that the coding information is easy to identify, and the technical effect of improving the identification accuracy is achieved;
3. based on the technical principle of the three-dimensional scanner, the three-dimensional scanner eliminates the requirement of dynamic projection, and can perform pattern projection processing in a simple transmission projection mode; furthermore, under the condition that the three-dimensional scanner performs pattern projection processing in a transmission projection mode, the hardware cost is greatly reduced;
4. When the three-dimensional scanner performs pattern projection processing using laser light as a light source, the brightness and depth of field of the projection device 10 can be improved, and the technical effect of realizing low cost, high brightness and high depth of field can be achieved.
That is, the three-dimensional scanner provided by the application has the advantages of low hardware cost, low real-time frame rate requirement, high brightness and large depth of field of the optical system, and miniaturization of equipment; and the three-dimensional scanner can directly perform dynamic real-time three-dimensional scanning with color textures on materials with characteristics of light reflection, light transmission, light diffusion and the like, such as teeth, gums and the like in the mouth.
According to an embodiment of the present application, a three-dimensional scanning system is provided. Wherein, this three-dimensional scanning system includes:
the three-dimensional scanner is used for projecting light onto a target object, and collecting the light modulated by the target object under the condition that the target object is projected with the light so as to obtain at least one fringe image, wherein the projected light comprises preset light projected in a color coding fringe form, and the color coding fringe is formed by at least two color fringe codes;
the image processor is connected with the three-dimensional scanner and is used for acquiring at least one fringe image acquired by the three-dimensional scanner, using the fringe image as a coding map to determine the sequence of each fringe, and using it as a reconstruction map to perform three-dimensional reconstruction of the target object.
It should be noted that: the three-dimensional scanner included in the three-dimensional scanning system is the three-dimensional scanner provided by the embodiment of the application.
Optionally, in the case that the three-dimensional scanner collects light modulated by the target object through a plurality of cameras to obtain at least one fringe image, and at least one black-and-white camera is included among the plurality of cameras, the image processor is further configured to: use a stripe image obtained by at least one black-and-white camera as a reconstruction map to reconstruct the target object in three dimensions; and use the stripe images obtained by a plurality of black-and-white cameras as coding maps to determine the sequence of each stripe, and/or use the stripe image obtained by at least one color camera as a coding map to determine the sequence of each stripe.
The three-dimensional scanning system provided by the embodiment of the application projects light onto a target object through a three-dimensional scanner, and collects the light modulated by the target object under the condition that the target object is projected with the light so as to obtain at least one fringe image, wherein the projected light comprises preset light projected in a color coding fringe form, and the color coding fringe is formed by at least two color fringe codes; the image processor is connected with the three-dimensional scanner and is used for acquiring at least one fringe image acquired by the three-dimensional scanner, determining each fringe sequence according to the fringe image as a coding image and performing three-dimensional reconstruction on the target object as a reconstruction image. And further, the technical problems that in the related technology, the hardware cost required by the existing three-dimensional reconstruction method is high, and the popularization and the use of the three-dimensional scanning device are not facilitated are solved.
It should be noted that: because the three-dimensional scanning system is based on a space coding stripe extraction algorithm, the three-dimensional morphology of the target object is obtained. Therefore, the three-dimensional scanning system can realize three-dimensional reconstruction of the target object by only one frame of two-dimensional image at the minimum, so that the frame rate of a camera and the operation cost of an algorithm are greatly reduced, and the three-dimensional scanning system is convenient to popularize and use; specifically, the three-dimensional scanning system does not need to use a camera with higher frame rate, so that the volume of the camera required in the three-dimensional scanning system can be reduced, and the three-dimensional scanning system is more suitable for acquiring the three-dimensional shape of an object in an oral cavity.
And based on the technical characteristics that the three-dimensional scanning system can realize three-dimensional reconstruction of the target object by only needing at least one frame of two-dimensional image, the acquisition time difference between the reconstruction image and the texture image is greatly shortened, the time required for projecting and shooting the three-dimensional reconstruction of the target object is reduced, and the three-dimensional scanning system is also more suitable for acquiring the three-dimensional shape of the object in the oral cavity (the three-dimensional scanning system is convenient for handheld scanning).
In addition, the three-dimensional scanning system provided by the embodiment of the application uses colors as the information of space coding, so that the technical effects that the coded information is easy to identify and the identification accuracy is improved are also realized.
In addition, the three-dimensional scanning system is based on a space coding stripe extraction algorithm to acquire the three-dimensional morphology of the target object, so that the technical effect of canceling the projection requirement of dynamic projection is also realized.
The embodiment of the application also provides a three-dimensional scanning method, and the three-dimensional scanning method is applied to the three-dimensional scanner provided by the embodiment of the application. The three-dimensional scanning method provided in the embodiment of the present application is described below.
Fig. 7 is a flow chart of a three-dimensional scanning method according to an embodiment of the present application. As shown in fig. 7, the three-dimensional scanning method includes:
step S701, projecting preset light rays to a target object in the form of color coding stripes;
step S703, collecting light modulated by the target object, and obtaining at least one fringe image based on the light, wherein the obtained fringe image is used as a coding map to determine each fringe sequence, and is used as a reconstruction map to perform three-dimensional reconstruction on the target object;
step S705, determining the sequence of each stripe in a plurality of stripe images based on the coding diagram;
step S707, performing three-dimensional reconstruction on the reconstruction map based on the sequence, to obtain three-dimensional data of the target object.
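Steps S701 to S707 can be chained end-to-end in a toy one-dimensional setting. Everything below — the pattern, the unit-shift "camera", and the depth-from-disparity formula — is a simplifying assumption made purely to show how the four steps connect, not the patent's actual calibration or reconstruction procedure:

```python
# A minimal end-to-end sketch of steps S701-S707 in a toy 1-D world.
PATTERN = ["R", "G", "B", "G", "R"]

def project(pattern):                    # S701: project coded stripes
    return pattern

def capture(projected, shift=1):         # S703: modulated stripe image (toy shift)
    return [(i + shift, c) for i, c in enumerate(projected)]

def decode(image):                       # S705: stripe sequence from colors
    return [c for _, c in image]

def reconstruct(image, pattern, baseline=10.0):  # S707: depth from disparity
    depths = []
    for (x_cam, _color), x_proj in zip(image, range(len(pattern))):
        disparity = x_cam - x_proj       # camera position vs projector position
        depths.append(baseline / disparity)  # toy triangulation formula
    return depths

img = capture(project(PATTERN))
assert decode(img) == PATTERN            # the coding map recovers the sequence
print(reconstruct(img, PATTERN))         # constant depth for a flat toy surface
```

In the real device the decoded sequence is what makes the correspondence in `reconstruct` unambiguous; here the correspondence is trivial because the toy surface is flat.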
According to the three-dimensional scanning method provided by the embodiment of the application, the preset light is projected onto the target object in the form of color coding stripes; collecting light modulated by a target object, and acquiring at least one fringe image based on the light, wherein the acquired fringe image is used as a coding diagram to determine each fringe sequence, and is used as a reconstruction diagram to reconstruct the target object in three dimensions; determining a sequence of each stripe in the plurality of stripe images based on the encoding map; and carrying out three-dimensional reconstruction on the reconstruction map based on the sequence to obtain three-dimensional data of the target object. And further, the technical problems that in the related technology, the hardware cost required by the existing three-dimensional reconstruction method is high, and the popularization and the use of the three-dimensional scanning device are not facilitated are solved.
It should be noted that: since the three-dimensional scanning method mentioned in the embodiment of the application is based on a space coding stripe extraction algorithm, the three-dimensional morphology of the target object is obtained. Therefore, the three-dimensional scanning method can realize three-dimensional reconstruction of the target object by only needing at least one frame of two-dimensional image, so that the frame rate of a camera and the operation cost of an algorithm are greatly reduced, and the three-dimensional scanning method is convenient to popularize and use; specifically, the three-dimensional scanning method does not need to use a camera with higher frame rate, so that the volume of the camera required in the three-dimensional scanning method can be reduced, and the three-dimensional scanning method is more suitable for acquiring the three-dimensional shape of an object in an oral cavity.
And based on the technical characteristics that the three-dimensional scanning method can realize three-dimensional reconstruction of the target object by only needing at least one frame of two-dimensional image, the acquisition time difference between the reconstruction image and the texture image is greatly shortened, the time required for projecting and shooting the three-dimensional reconstruction of the target object is reduced, and the three-dimensional scanning method is also more suitable for acquiring the three-dimensional shape of the object in the oral cavity (the three-dimensional scanning method is convenient for handheld scanning).
In addition, the three-dimensional scanning method provided by the embodiment of the application uses colors as the information of space coding, so that the technical effects that the coded information is easy to identify and the identification accuracy is improved are also realized.
In addition, the three-dimensional scanning method is based on a space coding stripe extraction algorithm to acquire the three-dimensional morphology of the target object, so that the technical effect of canceling the projection requirement of dynamic projection is also achieved.
Optionally, in the three-dimensional scanning method provided in the embodiment of the present application, the three-dimensional scanning method further includes: obtaining texture data of the target object; and acquiring color three-dimensional data of the target object based on the three-dimensional data and texture data of the target object.
Optionally, in the three-dimensional scanning method provided in the embodiment of the present application, the three-dimensional scanning method further includes: projecting illumination light onto a target object and acquiring texture data of the target object based on the illumination light; and acquiring color three-dimensional data of the target object based on the three-dimensional data and texture data of the target object.
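The combination of geometry and texture described in the two options above can be sketched as pairing each reconstructed point with the texture color sampled at its image coordinates. The point/pixel layout below is a toy assumption, not the patent's calibration or mapping procedure:

```python
# Hedged sketch: attach a texture color to each 3-D point to obtain
# color three-dimensional data.
def colorize(points, pixel_coords, texture):
    """points: [(x, y, z)]; pixel_coords: [(row, col)]; texture: 2-D RGB grid."""
    return [(p, texture[r][c]) for p, (r, c) in zip(points, pixel_coords)]

points = [(0.0, 0.0, 10.0), (1.0, 0.0, 10.5)]
coords = [(0, 0), (0, 1)]                       # where each point projects
texture = [[(255, 0, 0), (0, 255, 0)]]          # tiny texture image
colored = colorize(points, coords, texture)
print(colored[0])  # first point paired with its sampled color
```

The shorter the interval between capturing the reconstruction image and the texture image, the better the sampled colors line up with the geometry, which is why the alternating acquisition mentioned below matters.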
Texture data is acquired by a single camera or by a combination of data acquired by multiple cameras.
Specifically, in step S703, light modulated by the target object is collected and at least two fringe images are obtained based on the same light, at least one of which is obtained by a black-and-white camera; the obtained fringe images are used as coding maps to determine the sequence of each fringe and as reconstruction maps to perform three-dimensional reconstruction of the target object. Preferably, the fringe image obtained by the black-and-white camera is used as the reconstruction map.
Specifically, in step S705, the sequence of each stripe in the plurality of stripe images is determined based on the coding map; the coding sequence is determined from the arrangement information and the color information of each stripe in the coding map. For example, encoding red as (1, 0) and green as (0, 1), three stripes arranged red, green, red give the coding sequence (1, 0) (0, 1) (1, 0); encoding red as (1, 0), green as (0, 1, 0) and blue as (0, 1), three stripes arranged red, blue, green give the coding sequence (1, 0), (0, 1), (0, 1, 0).
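The mapping from stripe colors to a coding sequence can be sketched directly. The particular code words below follow the example given above; they are illustrative, not a prescribed code book:

```python
# Sketch of sequence determination in step S705: each stripe color maps to a
# code word, and the coding sequence is the list of code words in stripe order.
CODE = {"red": (1, 0), "green": (0, 1, 0), "blue": (0, 1)}  # example code words

def coding_sequence(stripes):
    """Arrangement + color information -> coding sequence."""
    return [CODE[c] for c in stripes]

print(coding_sequence(["red", "blue", "green"]))
# stripes arranged red, blue, green -> (1, 0), (0, 1), (0, 1, 0)
```

Because both the arrangement (order) and the color (code word) contribute, a run of stripes yields a sequence that can be located within the projected pattern.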
Specifically, in step S707, each stripe of the reconstruction map is matched based on the coding sequence. For binocular reconstruction, two image acquisition devices are combined: stripe matching is performed between the reconstruction maps of the two image acquisition devices, and point cloud reconstruction is performed after matching to obtain three-dimensional data of the target object. For monocular reconstruction, a single image acquisition device is combined with the projection device: stripe matching is performed between the reconstruction map of the image acquisition device and the preset light pattern of the projection device, and point cloud reconstruction is performed after matching to obtain three-dimensional data of the target object.
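For the binocular case, the matching-then-triangulation step can be sketched as pairing stripes by their decoded sequence index and applying a rectified-stereo depth formula. The pinhole/rectified camera model and all numbers below are simplifying assumptions, not the patent's calibration procedure:

```python
# Sketch of step S707 (binocular): stripes in the left and right reconstruction
# maps are paired by decoded sequence index, then each pair is triangulated.
def match_stripes(left, right):
    """left/right: {sequence_index: stripe x-position}; pair by shared index."""
    return [(left[i], right[i]) for i in sorted(left.keys() & right.keys())]

def triangulate(pairs, baseline=60.0, focal=800.0):
    """Rectified stereo: depth = baseline * focal / disparity."""
    return [baseline * focal / (xl - xr) for xl, xr in pairs]

left  = {0: 100.0, 1: 140.0, 2: 185.0}   # stripe positions in left image
right = {0:  40.0, 1:  80.0, 2: 125.0}   # stripe positions in right image
depths = triangulate(match_stripes(left, right))
print(depths)  # equal disparities -> flat toy surface at constant depth
```

The coding sequence is what makes `match_stripes` unambiguous: without it, a stripe in one image could be paired with several visually identical stripes in the other.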
Fig. 8 is a flow chart of a three-dimensional scanning method according to an embodiment of the present application. As shown in fig. 8, the three-dimensional scanning method includes:
step S801, a first image and a second image are acquired, wherein the first image and the second image are stripe images acquired based on the same light beam;
step S803, determining a coding sequence of each stripe based on the first image;
step S805, performing stripe matching on stripes of the second image based on the coding sequence, and implementing three-dimensional reconstruction to obtain three-dimensional data of the target object.
The three-dimensional scanning method further comprises the following steps:
in step S807, texture data is acquired, and color three-dimensional data of the target object is acquired based on the three-dimensional data and the texture data.
Preferably, the first image and the second image are acquired alternately with the texture data.
The following are specific examples of the method:
the projection device projects red, green and blue color coding stripes to a target object at a first moment, the red, green and blue color coding stripes are modulated by the target object and then are transmitted to the image processing device, the red, green and blue color coding stripes are separated into two red, green and blue color coding stripes through a semi-reflective prism, one red, green and blue color coding stripe is collected by a color camera, the color camera generates corresponding red, green and blue color coding stripe images, the other red, green and blue color coding stripe is collected by a black and white camera through a blue filter, the black and white camera generates corresponding blue stripe images, the illumination piece irradiates white light to the target object at a second moment, the white light is collected by the color camera after being reflected by the target object, the color camera generates a texture image, the coding sequence of each stripe is determined based on the red, green and blue color coding stripe images are subjected to stripe matching based on the coding sequence, three-dimensional reconstruction is realized, and true color three-dimensional data of the target object are obtained based on the three-dimensional data and the texture image.
In a second example, the projection device projects red-green-blue color-coded stripes onto the target object at a first moment; after being modulated by the target object, the stripes are transmitted to the image acquisition device and separated by the half-reflecting half-transmitting prism, and one set is collected by the color camera, which generates a corresponding red-green-blue color-coded stripe image. At a third moment the projection device projects blue coded stripes onto the target object; after being modulated by the target object, the blue coded stripes pass through the half-reflecting half-transmitting prism and are collected by the black-and-white camera, which generates a corresponding blue stripe image, the blue stripes corresponding to the blue stripes in the red-green-blue color-coded stripes. At a second moment the illumination piece irradiates white light onto the target object; the white light reflected by the target object is collected by the color camera, which generates a texture map. The coding sequence of each stripe is determined based on the red-green-blue color-coded stripe image; stripe matching is performed on the blue stripe image based on the coding sequence, realizing three-dimensional reconstruction and obtaining three-dimensional data of the target object; and true-color three-dimensional data of the target object is obtained based on the three-dimensional data and the texture map.
The projection device projects red-green-blue color-coded stripes onto a target object at a first moment. After being modulated by the target object, the stripes are transmitted to the image acquisition device, where a right-angle two-channel dichroic prism decomposes them into red-green stripes and blue stripes. The red-green stripes are collected by the color camera, which generates a corresponding red-green stripe image; the blue stripes are collected by the black-and-white camera, which generates a corresponding blue stripe image. At a second moment, the illumination member irradiates the target object with white light; after reflection by the target object, the light is collected by both the color camera and the black-and-white camera, the color camera generating a texture map from the red and green light and the black-and-white camera generating a texture map from the blue light. The coding sequence of each stripe is determined based on the red-green stripe image, the stripes of the blue stripe image are matched based on the coding sequence to realize three-dimensional reconstruction and obtain three-dimensional data of the target object, a white-light texture map is synthesized from the texture map of the color camera and the texture map of the black-and-white camera, and true-color three-dimensional data of the target object are obtained based on the three-dimensional data and the white-light texture map.
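In the two-channel embodiment above, the red and green channels of the projected pattern serve as the code map while the blue channel serves as the reconstruction map. A minimal software analogue of that channel split can be useful for prototyping the data flow; note that in the scanner the split happens optically in the dichroic prism, and the function name and toy data below are illustrative assumptions, not part of the patent:

```python
import numpy as np

def split_channels(rgb_image):
    """Software analogue of the right-angle two-channel dichroic prism:
    the red and green channels form the code map, the blue channel
    the reconstruction map."""
    code_map = rgb_image[..., :2]   # red + green stripe channels
    recon_map = rgb_image[..., 2]   # blue stripe channel
    return code_map, recon_map

# Toy 4x4 RGB stripe image: pure red everywhere.
img = np.zeros((4, 4, 3), dtype=np.uint8)
img[..., 0] = 255
code, recon = split_channels(img)   # code: (4, 4, 2), recon: (4, 4)
```

In the physical device the two maps come from two different sensors, so a real pipeline would also need the cameras' pixel correspondence from calibration.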
The projection device projects red-green-blue color-coded stripes onto a target object at a first moment. After being modulated by the target object, the stripes are transmitted to the image acquisition device, where a three-channel dichroic prism decomposes them into red stripes, green stripes and blue stripes. The red stripes are collected by the first black-and-white camera, which generates a corresponding red stripe image; the green stripes are collected by the second black-and-white camera, which generates a corresponding green stripe image; the blue stripes are collected by the third black-and-white camera, which generates a corresponding blue stripe image. At a second moment, the illumination member irradiates the target object with white light; after reflection by the target object, the white light is collected by the three black-and-white cameras, the first generating a texture map from the red light, the second from the green light and the third from the blue light. The coding sequence of each stripe is determined based on the combination of the red, green and blue stripe images, the stripes of the red, green and blue stripe images are matched based on the coding sequence to realize three-dimensional reconstruction and obtain three-dimensional data of the target object, a white-light texture map is synthesized from the texture maps of the three black-and-white cameras, and true-color three-dimensional data of the target object are obtained based on the three-dimensional data and the white-light texture map.
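In the three-camera embodiments, the white-light texture map is synthesized from the three monochrome texture maps. A minimal sketch of that synthesis, assuming the three cameras are pixel-aligned through the dichroic prism and ignoring per-channel gain calibration (the function name and toy values are illustrative assumptions):

```python
import numpy as np

def synthesize_white_light_texture(red_tex, green_tex, blue_tex):
    """Stack the red, green and blue texture maps captured by the three
    black-and-white cameras into one RGB (white-light) texture map.
    Assumes the cameras see the same pixel grid via the dichroic prism."""
    return np.stack([red_tex, green_tex, blue_tex], axis=-1)

r = np.full((2, 2), 200, dtype=np.uint8)  # toy red-light texture
g = np.full((2, 2), 150, dtype=np.uint8)  # toy green-light texture
b = np.full((2, 2), 100, dtype=np.uint8)  # toy blue-light texture
rgb = synthesize_white_light_texture(r, g, b)  # shape (2, 2, 3)
```

In practice each channel would first be radiometrically balanced so that a white reference surface yields equal red, green and blue responses.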
The projection device projects green-blue color-coded stripes onto a target object at a first moment. After being modulated by the target object, the stripes are transmitted to the image acquisition device, where the three-channel dichroic prism decomposes them into green stripes and blue stripes. The green stripes are collected by the second black-and-white camera, which generates a corresponding green stripe image; the blue stripes are collected by the third black-and-white camera, which generates a corresponding blue stripe image. At a second moment, the illumination member irradiates the target object with white light; after reflection by the target object, the white light is collected by the three black-and-white cameras, the first generating a texture map from the red light, the second from the green light and the third from the blue light. The coding sequence of each stripe is determined based on the combination of the green and blue stripe images, the stripes of the green and blue stripe images are matched based on the coding sequence to realize three-dimensional reconstruction and obtain three-dimensional data of the target object, a white-light texture map is synthesized from the texture maps of the three black-and-white cameras, and true-color three-dimensional data of the target object are obtained based on the three-dimensional data and the white-light texture map.
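All of the embodiments above share the same decoding step: determine each stripe's coding sequence from the code map, then match stripes in the reconstruction map against that sequence before triangulation. The patent does not specify the coding scheme; the sketch below assumes a De Bruijn-style window code, a common choice for color-coded stripe patterns, and all names and the toy pattern are illustrative:

```python
def stripe_code_sequence(stripe_colors, window=3):
    """Map each sliding window of stripe colors to its projector-side
    stripe index. If the projected pattern is a De Bruijn sequence,
    each window of length `window` is unique."""
    codes = {}
    for i in range(len(stripe_colors) - window + 1):
        codes[tuple(stripe_colors[i:i + window])] = i
    return codes

def match_stripes(projected_colors, observed_colors, window=3):
    """Match observed stripe runs (from the code map) back to projected
    stripe indices, enabling triangulation in the reconstruction map."""
    codes = stripe_code_sequence(projected_colors, window)
    matches = {}
    for j in range(len(observed_colors) - window + 1):
        key = tuple(observed_colors[j:j + window])
        if key in codes:
            matches[j] = codes[key]  # observed index -> projected index
    return matches

pattern = ['R', 'G', 'B', 'R', 'B', 'G', 'R']   # toy projected sequence
observed = ['B', 'R', 'B', 'G']                 # contiguous run seen by a camera
m = match_stripes(pattern, observed)            # {0: 2, 1: 3}
```

A real implementation would additionally tolerate misclassified colors (e.g. via dynamic programming over the whole stripe sequence) rather than requiring exact window matches.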
It should be noted that the steps illustrated in the flowcharts of the figures may be executed in a computer system, such as one running a set of computer-executable instructions, and that although a logical order is shown in the flowcharts, in some cases the steps illustrated or described may be performed in an order different from the one presented herein.
An embodiment of the present invention provides a storage medium having a program stored thereon, which when executed by a processor, implements the three-dimensional scanning method.
An embodiment of the present invention provides a processor configured to run a program, wherein the program, when running, executes the three-dimensional scanning method.
It should also be noted that the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element preceded by the phrase "comprising a …" does not exclude the presence of additional identical elements in the process, method, article, or apparatus that comprises the element.
It should also be noted that the functional units in the embodiments of the present invention may be integrated in one physical unit, or each unit may exist alone physically, or two or more units may be integrated in one unit.
It will be appreciated by those skilled in the art that embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The foregoing is merely exemplary of the present application and is not intended to limit the present application. Various modifications and changes may be made to the present application by those skilled in the art. Any modifications, equivalent substitutions, improvements, etc. which are within the spirit and principles of the present application are intended to be included within the scope of the claims of the present application.
In the foregoing embodiments of the present invention, each embodiment is described with its own emphasis; for parts not described in detail in one embodiment, reference may be made to the related descriptions of other embodiments.

Claims (14)

1. A three-dimensional scanner, comprising:
a projection device (10) for projecting light onto a target object, wherein the light comprises a preset light projected in the form of color-coded stripes, wherein the color-coded stripes consist of at least two color-coded stripes;
image acquisition means (20) for acquiring, in the case of the target object being projected by the projection means (10), light modulated by the target object to acquire at least one fringe image, wherein the acquired fringe image is used as a code map to determine each code sequence and as a reconstruction map to reconstruct the target object in three dimensions;
the image acquisition device (20) further comprises a plurality of cameras (21) and a beam processing device, the beam processing device comprising at least one second beam splitting unit for separating the light acquired by a black-and-white camera so that the black-and-white camera acquires light containing a specified wave band, the color-coded stripes containing stripes of colors corresponding to the specified wave band, the plurality of cameras (21) comprising at least one black-and-white camera, and the image acquisition device (20) collecting the light modulated by the target object through the plurality of cameras (21) to obtain a plurality of stripe images; the beam processing device comprises a light inlet portion and at least two light outlet portions, each camera (21) being arranged corresponding to a different light outlet portion, and the image acquisition device (20) is configured to collect the light modulated by the target object through the beam processing device; a stripe image obtained by at least one black-and-white camera is used as a reconstruction map to three-dimensionally reconstruct the target object, and a stripe image obtained by at least one color camera is used as a code map to determine each coding sequence; wherein the three-dimensional reconstruction of the target object comprises: acquiring a first image and a second image, determining a coding sequence of each stripe based on the first image, and performing stripe matching on the stripes of the second image based on the coding sequence so as to three-dimensionally reconstruct the target object, wherein the first image is a code map and the second image is a reconstruction map.
2. The three-dimensional scanner according to claim 1, wherein the beam processing device further comprises at least one first beam splitting unit for splitting light rays projected from the light inlet portion so that the light rays are respectively projected from the at least two light outlet portions to cameras (21) provided correspondingly to the light outlet portions.
3. The three-dimensional scanner according to claim 1, wherein:
the light beam processing device comprises a right-angle two-channel dichroic prism (24), and the right-angle two-channel dichroic prism (24) comprises a third light-emitting part and a fourth light-emitting part, wherein the light beam processing device performs light splitting processing on light rays projected from a light-in part through the right-angle two-channel dichroic prism (24), so that the light rays are respectively projected from the third light-emitting part and the fourth light-emitting part to cameras (21) correspondingly arranged on the respective light-emitting parts;
the image acquisition device (20) comprises a third camera (213) corresponding to the third light emitting part and a fourth camera (214) corresponding to the fourth light emitting part, the third camera (213) generates a third stripe image based on the acquired light, the fourth camera (214) generates a fourth stripe image based on the acquired light, and stripes of at least two colors are included in the third stripe image and the fourth stripe image and are identifiable;
The light beam processing device separates the acquired light rays of the appointed camera through the right-angle two-channel dichroic prism (24), so that the appointed camera acquires the light rays containing the appointed wave band, wherein the appointed camera acquires the light rays containing the appointed wave band comprises the following steps: the third camera (213) acquires light rays of a first filter band and/or the fourth camera (214) acquires light rays of a second filter band.
4. The three-dimensional scanner according to claim 1, wherein:
the light beam processing device comprises a three-channel dichroic prism (25), and the three-channel dichroic prism (25) comprises a fifth light-emitting part, a sixth light-emitting part and a seventh light-emitting part, wherein the light beam processing device performs light splitting processing on light rays projected from a light-inlet part through the three-channel dichroic prism (25), so that the light rays are respectively projected from the fifth light-emitting part, the sixth light-emitting part and the seventh light-emitting part to cameras (21) correspondingly arranged on the respective light-emitting parts;
the image acquisition device (20) comprises a fifth camera (215) corresponding to the fifth light emergent part, a sixth camera (216) corresponding to the sixth light emergent part and a seventh camera (217) corresponding to the seventh light emergent part, wherein the fifth camera (215) generates a fifth stripe image based on the acquired light, the sixth camera (216) generates a sixth stripe image based on the acquired light, the seventh camera (217) generates a seventh stripe image based on the acquired light, and stripes of at least two colors are included in the fifth stripe image, the sixth stripe image and the seventh stripe image and are identifiable;
The beam processing device performs separation processing on the light acquired by the appointed camera through the three-channel dichroic prism (25), so that the appointed camera acquires the light containing the appointed wave band, wherein the appointed camera acquires the light containing the appointed wave band at least comprises: the fifth camera (215) acquires light of a third filter band, the sixth camera (216) acquires light of a fourth filter band, and the third filter band is different from the fourth filter band.
5. The three-dimensional scanner according to claim 1, wherein:
the light beam processing device comprises a half-reflection half-transmission prism (22), and the half-reflection half-transmission prism (22) comprises a first light-emitting part and a second light-emitting part, wherein the light beam processing device performs light splitting processing on light rays projected from a light-inlet part through the half-reflection half-transmission prism (22), so that the light rays are respectively projected from the first light-emitting part and the second light-emitting part to cameras (21) which are correspondingly arranged on the light-emitting parts;
the image acquisition device (20) comprises a first camera (211) corresponding to the first light-emitting part and a second camera (212) corresponding to the second light-emitting part, wherein the first camera (211) generates a first stripe image based on the acquired light, the second camera (212) generates a second stripe image based on the acquired light, and stripes with at least two colors are included in the first stripe image and the second stripe image and are identifiable.
6. The three-dimensional scanner according to claim 5, wherein the beam processing device further comprises a filter (23),
the light beam processing device separates the light rays acquired by the appointed camera through the optical filter (23) so that the appointed camera acquires the light rays containing a fifth filtering wave band, and at least one camera of the plurality of cameras is the appointed camera.
7. The three-dimensional scanner according to any one of claims 1-6, further comprising an illumination member (30), wherein the image acquisition device (20) is further configured to acquire illumination light reflected by the target object in case the target object is illuminated by the illumination member (30) to acquire texture data of the target object.
8. The three-dimensional scanner according to claim 7, wherein the image acquisition device (20) is capable of distinguishing red light, green light and blue light.
9. A three-dimensional scanning system, comprising:
the three-dimensional scanner, for projecting light onto a target object and collecting, while the target object is being projected, the light modulated by the target object so as to obtain at least one fringe image, wherein the projected light comprises preset light projected in the form of color-coded fringes, the color-coded fringes consisting of coded fringes of at least two colors;
an image processor connected with the three-dimensional scanner and configured to acquire the at least one fringe image collected by the three-dimensional scanner, to determine each fringe sequence using the fringe image as a code map, and to three-dimensionally reconstruct the target object using the fringe image as a reconstruction map;
wherein the three-dimensional scanner is the three-dimensional scanner of any one of claims 1 to 8.
10. The three-dimensional scanning system of claim 9, wherein in the case where the three-dimensional scanner collects light modulated by the target object with a plurality of cameras to obtain at least one fringe image, and at least one black-and-white camera is included in the plurality of cameras, the image processor is further configured to:
taking a stripe image obtained by at least one black-and-white camera as a reconstruction map to reconstruct the target object in three dimensions;
take the stripe images obtained by a plurality of black-and-white cameras as code maps to determine each stripe sequence, and/or take the stripe image obtained by at least one color camera as a code map to determine each stripe sequence.
11. A three-dimensional scanning method, characterized in that the three-dimensional scanning method is applied to the three-dimensional scanner according to any one of the above claims 1 to 8, the three-dimensional scanning method comprising:
Projecting preset light rays to a target object in the form of color coding stripes;
collecting light modulated by the target object, and acquiring at least one fringe image based on the light, wherein the acquired fringe image is used as a coding diagram to determine each fringe sequence and is used as a reconstruction diagram to reconstruct the target object in three dimensions;
determining a sequence of each stripe in the plurality of stripe images based on the encoding map;
and carrying out three-dimensional reconstruction on the reconstruction map based on the sequence to obtain three-dimensional data of the target object.
12. The three-dimensional scanning method according to claim 11, characterized in that the three-dimensional scanning method further comprises:
projecting illumination light onto a target object and acquiring texture data of the target object based on the illumination light;
and acquiring color three-dimensional data of the target object based on the three-dimensional data and texture data of the target object.
13. A three-dimensional scanning method, characterized in that the three-dimensional scanning method is applied to the three-dimensional scanner according to any one of the above claims 1 to 8, the three-dimensional scanning method comprising:
acquiring a first image and a second image, wherein the first image and the second image are stripe images acquired based on the same light beam;
Determining a coding sequence of each stripe based on the first image;
and performing stripe matching on the stripes of the second image based on the coding sequence to realize three-dimensional reconstruction so as to acquire three-dimensional data of a target object.
14. The three-dimensional scanning method according to claim 13, characterized in that the three-dimensional scanning method further comprises:
texture data is acquired, and color three-dimensional data of the target object is acquired based on the three-dimensional data and the texture data.
CN201911018729.0A 2019-10-24 2019-10-24 Three-dimensional scanner and three-dimensional scanning method Active CN112710253B (en)

Priority Applications (8)

Application Number Priority Date Filing Date Title
CN201911018729.0A CN112710253B (en) 2019-10-24 2019-10-24 Three-dimensional scanner and three-dimensional scanning method
US17/771,470 US20220364853A1 (en) 2019-10-24 2020-10-26 Three-Dimensional Scanner and Three-Dimensional Scanning Method
PCT/CN2020/123684 WO2021078300A1 (en) 2019-10-24 2020-10-26 Three-dimensional scanner and three-dimensional scanning method
JP2022524057A JP7298025B2 (en) 2019-10-24 2020-10-26 Three-dimensional scanner and three-dimensional scanning method
CA3158933A CA3158933A1 (en) 2019-10-24 2020-10-26 Three-dimensional scanner and three-dimensional scanning method
KR1020227017511A KR20220084402A (en) 2019-10-24 2020-10-26 3D Scanners and 3D Scanning Methods
EP20878731.7A EP4050302A4 (en) 2019-10-24 2020-10-26 Three-dimensional scanner and three-dimensional scanning method
AU2020371142A AU2020371142B2 (en) 2019-10-24 2020-10-26 Three-dimensional scanner and three-dimensional scanning method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911018729.0A CN112710253B (en) 2019-10-24 2019-10-24 Three-dimensional scanner and three-dimensional scanning method

Publications (2)

Publication Number Publication Date
CN112710253A CN112710253A (en) 2021-04-27
CN112710253B true CN112710253B (en) 2023-06-06

Family

ID=75540321

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911018729.0A Active CN112710253B (en) 2019-10-24 2019-10-24 Three-dimensional scanner and three-dimensional scanning method

Country Status (1)

Country Link
CN (1) CN112710253B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114268703A (en) * 2021-12-27 2022-04-01 安徽淘云科技股份有限公司 Imaging adjusting method and device during screen scanning, storage medium and equipment
CN114521982A (en) * 2022-02-21 2022-05-24 资阳联耀医疗器械有限责任公司 Intraoral scanner, intraoral scanning implementation method and storage medium
CN116982940B (en) * 2023-09-26 2024-02-27 北京朗视仪器股份有限公司 Oral cavity scanning system and method

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1576779A (en) * 2003-06-30 2005-02-09 韦尔豪泽公司 Method and system for three-dimensionally imaging an apical dome of a plant
CN102494609A (en) * 2011-11-18 2012-06-13 李志扬 Three-dimensional photographing process based on laser probe array and device utilizing same
CN105407344A (en) * 2014-09-09 2016-03-16 深圳市绎立锐光科技开发有限公司 Stereo image projection device and stereoscopic display glasses
WO2017203756A1 (en) * 2016-05-26 2017-11-30 Ckd株式会社 Three-dimensional-measurement device
CN109283186A (en) * 2018-10-12 2019-01-29 成都精工华耀科技有限公司 A kind of double spectrum two-dimensionals of track visualization inspection and three-dimensional fusion imaging system
CN109489583A (en) * 2018-11-19 2019-03-19 先临三维科技股份有限公司 Projection arrangement, acquisition device and the 3 D scanning system with it
CN110381300A (en) * 2018-04-13 2019-10-25 豪威科技股份有限公司 There are four the imaging systems of imaging sensor for tool
TW201944771A (en) * 2018-04-09 2019-11-16 美商豪威科技股份有限公司 Imaging system having four image sensors




Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant