CN112710253A - Three-dimensional scanner and three-dimensional scanning method


Info

Publication number
CN112710253A
Authority
CN
China
Prior art keywords
light
image
stripe
camera
target object
Prior art date
Legal status
Granted
Application number
CN201911018729.0A
Other languages
Chinese (zh)
Other versions
CN112710253B (en)
Inventor
马超 (Ma Chao)
赵晓波 (Zhao Xiaobo)
Current Assignee
Shining 3D Technology Co Ltd
Original Assignee
Shining 3D Technology Co Ltd
Priority date
Filing date
Publication date
Priority to CN201911018729.0A (CN112710253B)
Application filed by Shining 3D Technology Co Ltd
Priority to EP20878731.7A (EP4050302A4)
Priority to KR1020227017511A (KR20220084402A)
Priority to US17/771,470 (US12007224B2)
Priority to JP2022524057A (JP7298025B2)
Priority to AU2020371142A (AU2020371142B2)
Priority to PCT/CN2020/123684 (WO2021078300A1)
Priority to CA3158933A (CA3158933A1)
Publication of CN112710253A
Application granted
Publication of CN112710253B
Legal status: Active

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01B MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00 Measuring arrangements characterised by the use of optical techniques
    • G01B11/24 Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
    • G01B11/25 Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/0059 Measuring for diagnostic purposes; Identification of persons using light, e.g. diagnosis by transillumination, diascopy, fluorescence
    • A61B5/0082 Measuring for diagnostic purposes; Identification of persons using light, e.g. diagnosis by transillumination, diascopy, fluorescence adapted for particular medical purposes
    • A61B5/0088 Measuring for diagnostic purposes; Identification of persons using light, e.g. diagnosis by transillumination, diascopy, fluorescence adapted for particular medical purposes for oral or dental tissue
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/72 Signal processing specially adapted for physiological signals or for diagnostic purposes
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01B MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00 Measuring arrangements characterised by the use of optical techniques
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01B MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00 Measuring arrangements characterised by the use of optical techniques
    • G01B11/24 Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
    • G01B11/25 Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object
    • G01B11/254 Projection of a pattern, viewing through a pattern, e.g. moiré
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T17/00 Three dimensional [3D] modelling, e.g. data description of 3D objects
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/40 Analysis of texture
    • G06T7/49 Analysis of texture based on structural texture description, e.g. using primitives or placement rules

Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Physics & Mathematics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Theoretical Computer Science (AREA)
  • Animal Behavior & Ethology (AREA)
  • Surgery (AREA)
  • Molecular Biology (AREA)
  • Biophysics (AREA)
  • Pathology (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Computer Graphics (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Dentistry (AREA)
  • Audiology, Speech & Language Pathology (AREA)
  • Software Systems (AREA)
  • Geometry (AREA)
  • Artificial Intelligence (AREA)
  • Physiology (AREA)
  • Psychiatry (AREA)
  • Signal Processing (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The application discloses a three-dimensional scanner and a three-dimensional scanning method. The three-dimensional scanner includes: a projection device for projecting light onto a target object, wherein the light includes preset light projected in the form of color-coded stripes, the color-coded stripes being composed of stripe codes of at least two colors; and an image acquisition device for collecting the light modulated by the target object while the target object is illuminated by the projection device, so as to obtain at least one stripe image, wherein the acquired stripe image serves both as a coding pattern for determining each stripe sequence and as a reconstruction pattern for three-dimensional reconstruction of the target object. The application thereby addresses the technical problem that existing three-dimensional reconstruction methods in the related art require expensive hardware, which hinders the popularization of three-dimensional scanning devices.

Description

Three-dimensional scanner and three-dimensional scanning method
Technical Field
The application relates to the field of three-dimensional scanning, in particular to a three-dimensional scanner and a three-dimensional scanning method.
Background
In the field of intraoral three-dimensional scanning, existing three-dimensional scanners generally perform three-dimensional reconstruction in one of the following ways: first, time-coded sinusoidal fringes are phase-unwrapped, and three-dimensional reconstruction followed by stitching and fusion yields the three-dimensional topography of the object; second, the three-dimensional topography of the object is obtained by an algorithm based on extraction of the center lines of time-coded stripes, three-dimensional reconstruction, and stitching and fusion; third, the three-dimensional topography of the object is obtained based on the principle of microscopic confocal three-dimensional imaging.
However, each of these methods has drawbacks that make it unsuitable for popularizing intraoral three-dimensional scanning devices, specifically:
first, time-coded three-dimensional reconstruction makes it difficult to build a compact handheld scanner, so it cannot be applied to intraoral three-dimensional scanning;
second, three-dimensional reconstruction based on the microscopic confocal imaging principle requires expensive hardware, which likewise hinders the popularization of three-dimensional scanning equipment.
No effective solution has yet been proposed for the technical problems that existing three-dimensional reconstruction methods in the related art require expensive hardware and hinder the popularization of three-dimensional scanning devices.
Disclosure of Invention
The application provides a three-dimensional scanner and a three-dimensional scanning method, which are intended to solve the technical problems that existing three-dimensional reconstruction methods in the related art require expensive hardware and hinder the popularization of three-dimensional scanning devices.
According to one aspect of the present application, a three-dimensional scanner is provided. The three-dimensional scanner includes: a projection device for projecting light onto a target object, wherein the light includes preset light projected in the form of color-coded stripes, the color-coded stripes being composed of stripe codes of at least two colors; and an image acquisition device for collecting the light modulated by the target object while the target object is illuminated by the projection device, so as to obtain at least one stripe image, wherein the acquired stripe image is used as a coding pattern to determine each stripe sequence and as a reconstruction pattern for three-dimensional reconstruction of the target object.
Optionally, the image acquisition device further includes a plurality of cameras, at least one of which is a black-and-white camera. The image acquisition device collects the light modulated by the target object through the plurality of cameras to obtain a plurality of stripe images, wherein a stripe image obtained by at least one black-and-white camera is used as a reconstruction pattern for three-dimensional reconstruction of the target object; and stripe images obtained by at least some of the black-and-white cameras are used as coding patterns to determine each stripe sequence, and/or a stripe image obtained by at least one color camera is used as a coding pattern to determine each stripe sequence.
Optionally, the image capturing device further includes a light beam processing device, where the light beam processing device includes a light inlet portion and at least two light outlet portions, where each camera is respectively disposed corresponding to a different light outlet portion, and the image capturing device collects light modulated by the target object through the light beam processing device.
Optionally, the light beam processing apparatus further includes at least one first light beam splitting unit, where the first light beam splitting unit is configured to perform light splitting processing on the light beams projected from the light inlet portion, so that the light beams are respectively projected from the at least two light outlet portions to cameras correspondingly disposed on the light outlet portions.
Optionally, the light beam processing apparatus further includes at least one second light beam separation unit, where the second light beam separation unit is configured to separate light rays to be obtained by a specified camera, so that the specified camera obtains light rays including a specified wavelength band, where the color-coded stripes include stripes of colors corresponding to the specified wavelength band.
Optionally, the designated camera is the black-and-white camera.
Optionally, the light beam processing device includes a right-angle two-channel dichroic prism, and the right-angle two-channel dichroic prism includes a third light emitting portion and a fourth light emitting portion, where the light beam processing device implements, through the right-angle two-channel dichroic prism, light splitting processing on light rays projected from the light inlet portion, so that the light rays are projected from the third light emitting portion and the fourth light emitting portion to cameras respectively corresponding to the light emitting portions; the image acquisition device comprises a third camera and a fourth camera, the third camera is arranged corresponding to the third light-emitting part, the fourth camera is arranged corresponding to the fourth light-emitting part, the third camera generates a third stripe image based on the acquired light, the fourth camera generates a fourth stripe image based on the acquired light, and the third stripe image and the fourth stripe image comprise stripes with at least two colors and the stripes with at least two colors can be identified; the light beam processing device realizes separation processing of light rays acquired by the appointed camera through the right-angle two-channel color separation prism, so that the appointed camera acquires the light rays containing the appointed waveband, wherein the acquiring of the light rays containing the appointed waveband by the appointed camera comprises: the third camera acquires light in a first filtering band, and/or the fourth camera acquires light in a second filtering band.
Optionally, the light beam processing device includes a three-channel dichroic prism, and the three-channel dichroic prism includes a fifth light-emitting portion, a sixth light-emitting portion, and a seventh light-emitting portion, where the light beam processing device implements, through the three-channel dichroic prism, light splitting processing on light rays projected from the light-entering portion, so that the light rays are projected from the fifth light-emitting portion, the sixth light-emitting portion, and the seventh light-emitting portion to cameras respectively corresponding to the light-emitting portions; the image acquisition device comprises a fifth camera arranged corresponding to the fifth light-emitting part, a sixth camera arranged corresponding to the sixth light-emitting part and a seventh camera arranged corresponding to the seventh light-emitting part, the fifth camera generates a fifth stripe image based on the acquired light, the sixth camera generates a sixth stripe image based on the acquired light, the seventh camera generates a seventh stripe image based on the acquired light, and the fifth stripe image, the sixth stripe image and the seventh stripe image comprise stripes of at least two colors and the stripes of at least two colors are identifiable; the light beam processing device separates light rays acquired by the appointed camera through the three-channel color separation prism, so that the appointed camera acquires the light rays containing the appointed waveband, wherein the acquiring of the light rays containing the appointed waveband by the appointed camera at least comprises the following steps: the fifth camera acquires light rays in a third filtering wave band, the sixth camera acquires light rays in a fourth filtering wave band, and the third filtering wave band is different from the fourth filtering wave band.
Optionally, the light beam processing device includes a semi-reflective and semi-transparent prism, and the semi-reflective and semi-transparent prism includes a first light emitting portion and a second light emitting portion, where the light beam processing device implements light splitting processing on light rays projected from the light inlet portion through the semi-reflective and semi-transparent prism, so that the light rays are respectively projected from the first light emitting portion and the second light emitting portion to cameras respectively corresponding to the light emitting portions; the image acquisition device comprises a first camera and a second camera, the first camera is arranged corresponding to the first light-emitting portion, the second camera is arranged corresponding to the second light-emitting portion, the first camera generates a first stripe image based on acquired light, the second camera generates a second stripe image based on acquired light, and the first stripe image and the second stripe image comprise stripes of at least two colors and the stripes of the at least two colors can be identified.
Optionally, the light beam processing apparatus further includes an optical filter, where the light beam processing apparatus separates light rays obtained by the designated camera through the optical filter, so that the designated camera obtains light rays including a fifth filtering waveband, and at least one of the plurality of cameras is the designated camera.
Optionally, the three-dimensional scanner further includes an illuminator, wherein the image acquisition device is further configured to collect the illumination light reflected by the target object, so as to obtain texture data of the target object while the target object is illuminated by the illuminator.
Optionally, the image acquisition device can identify and distinguish red light, green light, and blue light.
According to another aspect of the present application, there is provided a three-dimensional scanning system comprising: the three-dimensional scanner, which projects light onto a target object and collects the light modulated by the target object while the target object is illuminated, so as to obtain at least one fringe image, wherein the projected light includes preset light projected in the form of color-coded fringes, the color-coded fringes being composed of fringe codes of at least two colors; and an image processor, connected to the three-dimensional scanner, which obtains the at least one fringe image acquired by the scanner, determines each fringe sequence from the fringe image used as a coding pattern, and performs three-dimensional reconstruction of the target object with the fringe image used as a reconstruction pattern.
Wherein the three-dimensional scanner is any of the three-dimensional scanners described above.
Optionally, when the three-dimensional scanner collects the light modulated by the target object through a plurality of cameras to obtain at least one stripe image, and at least one of the plurality of cameras is a black-and-white camera, the image processor is further configured to: use a stripe image obtained by at least one black-and-white camera as a reconstruction pattern for three-dimensional reconstruction of the target object; and use stripe images obtained by at least some of the black-and-white cameras as coding patterns to determine each stripe sequence, and/or use a stripe image obtained by at least one color camera as a coding pattern to determine each stripe sequence.
According to another aspect of the present application, a three-dimensional scanning method is provided. The three-dimensional scanning method includes the following steps: projecting preset light onto a target object in the form of color-coded stripes; collecting the light modulated by the target object and obtaining at least one stripe image from it, wherein the acquired stripe image is used as a coding pattern to determine each stripe sequence and as a reconstruction pattern for three-dimensional reconstruction of the target object; determining the sequence of the stripes in the stripe images based on the coding pattern; and performing three-dimensional reconstruction on the reconstruction pattern based on the sequence, so as to obtain three-dimensional data of the target object.
Wherein the three-dimensional scanning method is applied to the three-dimensional scanner of any item above.
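The "coding pattern" step above can be sketched in a few lines: classify each pixel of an image row to its nearest palette color, then collapse runs of identical classifications into the ordered stripe-color sequence. This is an illustrative sketch only; the palette, the nearest-color metric, and the absence of any noise filtering are assumptions, not details taken from the patent.

```python
# Hypothetical decoding of one image row into a stripe-color sequence.
# PALETTE lists the pure stripe colors assumed for illustration.
PALETTE = [
    ("red", (255, 0, 0)), ("green", (0, 255, 0)), ("blue", (0, 0, 255)),
    ("cyan", (0, 255, 255)), ("magenta", (255, 0, 255)), ("yellow", (255, 255, 0)),
]

def nearest_color(rgb):
    """Name of the palette color closest to rgb (squared Euclidean distance)."""
    return min(PALETTE, key=lambda p: sum((a - b) ** 2 for a, b in zip(rgb, p[1])))[0]

def stripe_sequence(row):
    """Collapse a row of RGB pixels into the ordered list of stripe colors."""
    seq = []
    for px in row:
        name = nearest_color(px)
        if not seq or seq[-1] != name:  # start a new stripe on a color change
            seq.append(name)
    return seq

# Noisy pixels still classify to the intended stripes:
row = [(250, 10, 5)] * 4 + [(10, 240, 240)] * 4 + [(245, 5, 250)] * 4
assert stripe_sequence(row) == ["red", "cyan", "magenta"]
```

A real decoder would additionally locate stripe centers to sub-pixel precision and reject runs shorter than a minimum width.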
Optionally, the three-dimensional scanning method further includes: projecting illumination light onto a target object and acquiring texture data of the target object based on the illumination light; and acquiring the color three-dimensional data of the target object based on the three-dimensional data and the texture data of the target object.
According to another aspect of the present application, a three-dimensional scanning method is provided. The three-dimensional scanning method includes the following steps: acquiring a first image and a second image, wherein the first image and the second image are stripe images acquired from the same light beam; determining a coded sequence of stripes based on the first image; and performing stripe matching on the stripes of the second image based on the coded sequence, thereby achieving three-dimensional reconstruction and obtaining three-dimensional data of the target object.
Wherein the three-dimensional scanning method is applied to the three-dimensional scanner of any item above.
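A minimal sketch of the matching and reconstruction steps above: the coded sequence decoded from the first image assigns each stripe an index, those indices are attached in order to the stripe centers extracted from the second image (both images view the same beam, so stripe order is preserved), and each indexed center is triangulated. The projector geometry, the toy depth formula, and all numeric values below are illustrative assumptions, not details from the patent.

```python
# Hedged sketch: pair decoded stripe indices with extracted stripe centers,
# then triangulate each pair against a per-index projector reference position.

def match_stripes(coded_sequence, centers):
    """Pair the i-th decoded stripe index with the i-th extracted center.

    A real implementation must handle occluded or missing stripes; here the
    counts are simply required to agree.
    """
    if len(coded_sequence) != len(centers):
        raise ValueError("stripe count mismatch; occlusion handling needed")
    return list(zip(coded_sequence, centers))

def triangulate(pairs, focal_mm, baseline_mm, stripe_plane_x):
    """Toy camera-projector triangulation: depth from the offset between the
    observed stripe center and the projector reference of that stripe index."""
    points = []
    for idx, x_cam in pairs:
        disparity = x_cam - stripe_plane_x[idx]
        if disparity != 0:
            points.append((idx, focal_mm * baseline_mm / disparity))
    return points

pairs = match_stripes([0, 1, 2], [12.0, 20.0, 31.0])
depths = triangulate(pairs, focal_mm=8.0, baseline_mm=40.0,
                     stripe_plane_x={0: 10.0, 1: 16.0, 2: 27.0})
assert [round(z, 1) for _, z in depths] == [160.0, 80.0, 80.0]
```

Because the two images are split from the same beam, no cross-camera correspondence search is needed; the ordering alone carries the match.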
Optionally, the three-dimensional scanning method further includes: and acquiring texture data, and acquiring color three-dimensional data of the target object based on the three-dimensional data and the texture data.
The three-dimensional scanner provided by the embodiments of the application projects light onto a target object through the projection device, the light including preset light projected in the form of color-coded stripes, the color-coded stripes being composed of stripe codes of at least two colors; the image acquisition device collects the light modulated by the target object while the target object is illuminated by the projection device, so as to obtain at least one stripe image, wherein the photosensitive wavebands of the image acquisition device correspond one-to-one to the stripe colors contained in the color-coded stripes, and the acquired stripe image is used as a coding pattern to determine each stripe sequence and as a reconstruction pattern for three-dimensional reconstruction of the target object. This solves the technical problems that existing three-dimensional reconstruction methods in the related art require expensive hardware and hinder the popularization of three-dimensional scanning devices.
It should be noted that the three-dimensional scanner of the embodiments obtains the three-dimensional topography of the target object using a spatially coded stripe extraction algorithm. The scanner can therefore reconstruct the target object in three dimensions from as little as a single two-dimensional frame, which greatly reduces the required camera frame rate and the computational cost of the algorithm, and facilitates the popularization of the three-dimensional scanner. In particular, because no high-frame-rate camera is needed, the camera in the three-dimensional scanner can be made smaller, making the scanner better suited to capturing the three-dimensional topography of objects inside the oral cavity.
Moreover, because the three-dimensional scanner can reconstruct the target object from as little as a single two-dimensional frame, the acquisition time difference between a reconstruction image and a texture image is greatly shortened, and the projection and shooting time required for three-dimensional reconstruction is reduced, again making the scanner well suited to capturing the three-dimensional topography of objects inside the oral cavity and convenient for handheld scanning.
In addition, by using color as the spatial coding information, the three-dimensional scanner of the embodiments makes the coding information easy to identify and improves identification accuracy.
In addition, because the three-dimensional scanner of the embodiments obtains the three-dimensional topography of the target object using a spatially coded stripe extraction algorithm, it also eliminates the need for dynamic projection.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this application, illustrate embodiments of the application and, together with the description, serve to explain the application and are not intended to limit the application. In the drawings:
FIG. 1 is a first schematic diagram of an alternative three-dimensional scanner provided in accordance with an embodiment of the present application;
FIG. 2 is a schematic diagram of the diffusion and contrast of the three colors red, green and blue on an object according to an embodiment of the present application;
FIG. 3 is a schematic view of a positional relationship between an illuminating member and a reflector according to an embodiment of the present application;
FIG. 4 is a schematic diagram of a beam path in a beam processing apparatus according to an embodiment of the present application;
FIG. 5 is a second schematic diagram of an alternative three-dimensional scanner provided in accordance with an embodiment of the present application;
FIG. 6 is a third schematic diagram of an alternative three-dimensional scanner provided in accordance with an embodiment of the present application;
FIG. 7 is a first flowchart of an alternative three-dimensional scanning method provided in an embodiment of the present application; and
FIG. 8 is a second flowchart of an alternative three-dimensional scanning method according to an embodiment of the present application.
Wherein the figures include the following reference numerals:
10. a projection device; 20. an image acquisition device; 30. an illuminating member; 40. a reflective mirror; 11. a light source emitter; 12. a color grating sheet; 13. a first imaging lens; 14. a beam coupling system; 15. a light rod; 16. a phase modulation element; 17. a drive motor; 21. a camera; 22. a semi-reflecting and semi-transmitting prism; 23. an optical filter; 24. a right-angle two-channel dichroic prism; 25. a three-channel color separation prism; 26. a second imaging lens; 111. a DLP emitter; 112. a laser emitter; 211. a first camera; 212. a second camera; 213. a third camera; 214. a fourth camera; 215. a fifth camera; 216. a sixth camera; 217. a seventh camera.
Detailed Description
It should be noted that the embodiments and features of the embodiments in the present application may be combined with each other without conflict. The present application will be described in detail below with reference to the embodiments with reference to the attached drawings.
In order to make the technical solutions better understood by those skilled in the art, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only partial embodiments of the present application, but not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
It should be noted that the terms "first," "second," and the like in the description, claims, and drawings of this application are used to distinguish between similar elements and do not necessarily describe a particular sequence or chronological order. It should be understood that data so used may be interchanged under appropriate circumstances, so that the embodiments of the application described herein can be implemented in orders other than those illustrated or described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
According to an embodiment of the present application, a three-dimensional scanner is provided.
Fig. 1 is a schematic diagram of a three-dimensional scanner according to an embodiment of the present application. As shown in fig. 1, the three-dimensional scanner includes the following components:
the projection device 10 is used for projecting light rays onto a target object, wherein the light rays comprise preset light rays projected in a color coding stripe form, and the color coding stripe is formed by at least two color stripe codes; that is, at least two color stripes are encoded and ordered to be combined into a color encoded stripe.
It should be noted that the color-coded stripes may be composed of stripe codes in multiple pure colors or in multiple non-pure colors. However, to make the individual color stripes easy to distinguish, color-coded stripes composed of multiple pure-color stripe codes, for example red, green, blue, cyan, magenta, and yellow, are preferred. Specifically, each of the R, G and B components of every color stripe in the color-coded stripes is preferably either 0 or 255, with at most two components equal to 255 at the same time.
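The pure-color constraint above (each component 0 or 255, at most two components at 255) can be checked mechanically. The sketch below enumerates the six example colors and validates them; the function and palette names are illustrative, not from the patent.

```python
# Hypothetical check of the pure-color stripe palette described above:
# every component is 0 or 255, and at most two components are 255 at once
# (so white is excluded, as is black, which would be no stripe at all).
PALETTE = {
    "red":     (255, 0, 0),
    "green":   (0, 255, 0),
    "blue":    (0, 0, 255),
    "cyan":    (0, 255, 255),
    "magenta": (255, 0, 255),
    "yellow":  (255, 255, 0),
}

def is_valid_stripe_color(rgb):
    """True iff rgb satisfies the constraint stated in the text."""
    return (all(c in (0, 255) for c in rgb)
            and sum(c == 255 for c in rgb) <= 2
            and any(c == 255 for c in rgb))  # exclude black (no stripe)

assert all(is_valid_stripe_color(c) for c in PALETTE.values())
assert not is_valid_stripe_color((255, 255, 255))  # white: three components at 255
```

Restricting the palette this way keeps the colors maximally separated in RGB space, which is what makes per-stripe classification reliable.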
It should also be noted that different colors diffuse and transmit differently on the tooth surface. To obtain a high-quality stripe pattern (stripes that are more uniform, with more uniform contrast between stripes), the application sets the stripes of the color-coded pattern to different widths, thereby equalizing the diffusion behaviour of red, green and blue on the target object, reducing mutual interference between the color stripes, and improving the extraction accuracy of the color stripes.
Specifically, as shown in FIG. 2, the three colors red, green and blue diffuse differently and show different contrast on the object; the widths of the color stripes are therefore adjusted so that the three colors exhibit uniform diffusion and the color stripes have comparable average contrast, which improves stripe extraction accuracy.
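One simple way to realize the width adjustment just described is to scale each color's projected stripe width inversely with how strongly that color diffuses on the target surface, so all stripes arrive at the camera with similar effective width. The diffusion factors and nominal width below are made-up placeholders for illustration; the patent does not specify values.

```python
# Illustrative width adjustment: colors that diffuse more on the surface
# get narrower projected stripes. DIFFUSION values are hypothetical,
# relative factors, not measurements from the patent.
NOMINAL_WIDTH_PX = 12
DIFFUSION = {"red": 1.5, "green": 1.0, "blue": 0.8}

def stripe_width(color):
    """Projected width in pixels, inversely proportional to diffusion."""
    return round(NOMINAL_WIDTH_PX / DIFFUSION[color])

widths = {c: stripe_width(c) for c in DIFFUSION}
assert widths == {"red": 8, "green": 12, "blue": 15}
```

In practice the factors would be calibrated per material (e.g. enamel vs. gingiva), since the diffusion that motivates this adjustment is surface-dependent.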
Alternatively, the projection device 10 may adopt a transmission projection mode.
Specifically, the light source emitter 11 emits light in at least two different wavebands; the light is collimated and converged, passes through the MASK pattern, and the pattern is projected onto the target object through the first imaging lens 13.
That is, the projection device 10 includes a light source emitter 11, a color grating sheet 12, and a first imaging lens 13. The light source emitter 11 emits light in at least two different wavebands; the color grating sheet 12 and the first imaging lens 13 are arranged on the transmission path of the light; the light passes through the MASK pattern on the color grating sheet 12, and the pattern is projected onto the target object through the first imaging lens 13, the color types contained in the MASK pattern on the color grating sheet 12 corresponding one-to-one to the waveband types contained in the light transmitted through it.
In an optional example, the light source emitter 11 may be a DLP emitter 111 or a laser emitter 112, where the laser emitted by the laser emitter 112 has the following characteristics: directional emission, extremely high brightness, extremely pure color and good coherence.
Taking the laser emitter 112 as an example, it should be noted that: laser light is prone to having unsuitable aperture and divergence angles, and uneven light field emphasis. Therefore, the projection apparatus 10 provided in the embodiment of the present application processes the laser beam through the beam coupling system 14 and the optical rod 15 to adjust the aperture and the divergence angle of the laser beam, and outputs a light field with uniform intensity.
In the case that the aperture and the divergence angle of the laser beam are small, the beam coupling system 14 may consist of a collimating system and a converging lens, or an optical system of equivalent function. For laser light with a large divergence angle, the beam coupling system 14 may consist of three, four or more lens elements forming a more complex converging system.
The light rod 15 may be an elongated hexahedral, cylindrical or pyramidal prism; its exit end face is parallel to its entrance end face, and both may be rectangular or square. The light rod 15 may be a solid rod, in which light is transmitted inside a solid transparent medium, or a hollow rod, in which light is reflected multiple times in a space enclosed by four solid interfaces. For a solid rod, antireflection films are coated on the exit and entrance end faces, and the side surfaces may be coated with a reflection film or left uncoated; for a hollow rod, a reflection-enhancing film is coated on the inner surfaces. Specifically, the light is reflected and mixed multiple times on the inner surface of the light rod 15, so that a light field of uniform intensity is output.
That is, the projection apparatus 10 further includes a beam coupling system 14 and a light rod 15, both disposed on the transmission path of the light; the at least two light beams of different wave bands emitted by the light source emitter 11 pass through the beam coupling system 14 and the light rod 15 before being projected onto the color grating sheet 12.
Taking the laser emitter 112 as an example, it should be noted that: diffraction spots in the projected pattern can occur due to the coherence of the laser light itself. Therefore, in the case of the projection apparatus 10 provided in the embodiment of the present application, where the laser light source emitter 11 is adopted, the projection apparatus 10 further includes: a phase modulating element 16 and a drive motor 17. Specifically, as shown in fig. 2, the phase modulation element is disposed on a transmission path of the laser beam, wherein after the laser beam with at least two different wave bands is emitted from the light source emitter 11, the phase modulation element disposed on the transmission path of the laser beam modulates the phase of the laser beam in real time, and the phase modulation element is driven by the driving motor 17 to rotate around the rotation axis at a certain speed.
The phase modulation element may be a transparent optical material sheet, a micro-optical element or a random phase plate.
The phase modulation element may be located before the beam coupling system 14 or may be located after the beam coupling system 14.
Taking fig. 1 as an example, the projection apparatus 10 may include: three laser emitters 112, two semi-reflective semi-transmissive beam splitters, a phase modulation element 16 (with a driving motor 17 connected to it), a beam coupling system 14, a light rod 15, a color grating sheet 12, and a first imaging lens 13.
The projection device 10 emits laser beams through the three laser emitters 112; for example, one emits a red laser beam, one a green laser beam, and one a blue laser beam. The beams pass through the two semi-reflective semi-transmissive beam splitters, which converge them into a single beam path. The converged beam then passes through the rotating phase modulation element 16 to avoid diffraction speckle in the projected pattern caused by the coherence of the laser. Further, the beam passes through the beam coupling system 14 and the light rod 15 to adjust its aperture and divergence angle and output a light field of uniform intensity. Finally, the beam is transmitted through the color grating sheet 12 to generate the predetermined light projected in the form of color-coded stripes, and this light is projected onto the target object through the first imaging lens 13. Of course, the projection device 10 may be provided with only two laser emitters 112, as long as at least two laser beams of different colors are emitted to form the color stripes.
Further, the three-dimensional scanner may also include a reflective mirror 40 used to change the transmission path of light. In this embodiment, the reflective mirror 40 reflects the predetermined light generated by the projection apparatus 10 onto the target object, from which it is reflected to the image capturing apparatus 20; this reduces the installation constraints on the projection apparatus 10 and the image capturing apparatus 20 and reduces the space they require. By way of example: without the reflective mirror 40, the space needed for the projection apparatus 10 to project the preset light onto the target object and the space needed to collect the reflected light are arranged in a straight line, which is inconvenient when the projection apparatus 10 is used inside the oral cavity; with the reflective mirror 40, these two spaces are folded, so the projection apparatus 10 can make better use of the limited space inside the oral cavity and achieve a good projection effect.
Alternatively, the projection device 10 may be a DLP projector.
Specifically, the DLP projector adopts DLP (Digital Light Processing) projection technology and uses a digital micromirror device (DMD) as the key processing element to implement the digital optical processing. It should be noted that adopting a DLP projector as the projection device 10 achieves the technical effects of acquiring images with high contrast and keeping the picture colors vivid.
In an optional example, the pixel size of the projection module provided by the embodiments of the present application is 7-8 microns. Specifically, when the three-dimensional scanner provided in the embodiment of the present application is applied to three-dimensional scanning of teeth, the digital micromirror device in the projection apparatus 10 may contain an array of up to 2048x1152 micromirrors; when it projects the preset light onto a single tooth (about 15 mm), a color-coded stripe pattern with a single pixel size of about 7.3 um is obtained. It should be noted that the smaller pixel size reduces interference between adjacent stripe images on the tooth.
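The quoted pixel size follows from a quick calculation, assuming the full 2048-column array spans the roughly 15 mm tooth:

```python
# Projecting a 2048-pixel-wide DMD pattern across a single tooth of about
# 15 mm gives the per-pixel stripe size quoted in the text (~7.3 um).
tooth_width_mm = 15.0
dmd_columns = 2048
pixel_size_um = tooth_width_mm / dmd_columns * 1000  # mm -> um
print(round(pixel_size_um, 1))  # 7.3
```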
For example, the following steps are carried out: the projection apparatus 10 provided in the embodiment of the present application may adopt a DLP lightcraft, specifically, an optical engine of the DLP lightcraft may be an RGB LED light source engine developed specifically for a DLP3000 DMD by yangming optics, wherein the DLP3000 DMD is installed at the end of the light source engine, the DLP3000 DMD of a 0.3WVGA chipset consists of 415,872 micromirrors, the micromirror pitch is 7.6 μm, a micromirror matrix of 608x684 is formed, and a WVGA (854x480) resolution image can be generated at most.
The image acquisition device 20 is configured to collect the light reflected by the target object. In the present embodiment, when the target object is projected by the projection device 10, it collects the light modulated by the target object to acquire at least one stripe image, where the acquired stripe image serves as a code map to determine each stripe sequence and as a reconstruction map to three-dimensionally reconstruct the target object; when the target object is illuminated by the illuminating member 30, it collects the illumination light reflected by the target object.
It should be noted that since the projection device 10 projects light onto the target object, the predetermined light contained in that projected light also reaches the target object. Because the preset light is projected in the form of color-coded stripes, the color-coded stripes are mapped onto the target object; the image capturing device 20 then captures the color-coded stripes mapped on the target object to obtain at least one stripe image.
That is, the light modulated by the target object is the preset light modulated by the shape of the target object, so that the color-coded stripes corresponding to the preset light change according to that shape; the image acquisition device 20 then collects the changed color-coded stripes to generate at least one stripe image.
Preferably, the image capturing device 20 acquires at least two stripe images synchronously, and the at least two stripe images correspond to the same modulated color-coded stripe. Specifically, the projection device 10 projects a color-coded stripe to the target object, the color-coded stripe is modulated by the target object and then synchronously acquired by the image acquisition device 20, and the image acquisition device 20 generates at least two stripe images in real time.
The three-dimensional scanner provided by the embodiment of the application projects light onto a target object through the projection device 10, where the light includes preset light projected in the form of color-coded stripes composed of coded stripes of at least two colors. When the target object is projected by the projection device 10, the image acquisition device 20 collects the light modulated by the target object to acquire at least one stripe image, where the light-sensitive wave band of the image acquisition device 20 corresponds to the stripe colors contained in the color-coded stripes, so the image acquisition device can capture the coded stripes of at least two colors. Generally, the projection device is matched with the image acquisition device so that the colors contained in the preset light of the projection device can all be captured by the image acquisition device. The acquired stripe image is used as a code map to determine each stripe sequence and as a reconstruction map to three-dimensionally reconstruct the target object. This solves the technical problems in the related art that existing three-dimensional reconstruction methods require high hardware cost, which is not conducive to the popularization and application of three-dimensional scanning devices.
It should be noted that the three-dimensional scanner mentioned in the embodiment of the application acquires the three-dimensional topography of the target object based on a stripe extraction algorithm using spatial coding. Therefore, the three-dimensional scanner can realize three-dimensional reconstruction of the target object with as little as one frame of two-dimensional image, greatly reducing the required frame rate of the camera 21 and the computational cost of the algorithm, and facilitating the popularization and use of the three-dimensional scanner. Specifically, since the three-dimensional scanner does not need a camera 21 with a high frame rate, the volume of the camera 21 can be reduced to a certain extent, making the scanner more suitable for acquiring the three-dimensional topography of objects in the oral cavity.
Moreover, because the three-dimensional scanner can realize three-dimensional reconstruction of the target object with as little as one frame of two-dimensional image, the acquisition time difference between a reconstruction image and a texture image is greatly shortened and the projection and shooting time required for three-dimensional reconstruction is reduced, which also makes the scanner more suitable for acquiring the three-dimensional topography of objects in the oral cavity (and convenient for handheld scanning).
In addition, the three-dimensional scanner provided by the embodiment of the application uses color as the spatial coding information, achieving the technical effects that the coding information is easy to identify and the identification accuracy is improved.
In addition, because the three-dimensional scanner mentioned in the embodiment of the application acquires the three-dimensional topography of the target object based on a stripe extraction algorithm using spatial coding, it also achieves the technical effect of eliminating the need for dynamic projection.
Optionally, in the three-dimensional scanner provided in the embodiment of the present application, the image acquisition device 20 further includes a plurality of cameras 21, among which at least one is a black-and-white camera. The image acquisition device 20 processes the light modulated by the target object through the plurality of cameras 21 to obtain a plurality of stripe images; the stripe image obtained by at least one black-and-white camera is used as a reconstruction map to three-dimensionally reconstruct the target object, and the stripe images of at least several black-and-white cameras are used as code maps to determine each stripe sequence, and/or the stripe image of at least one color camera is used as a code map to determine each stripe sequence.
It should be noted that the stripe information included in at least one stripe image used as the code pattern must be able to determine the code sequence of each stripe; that is, the code pattern is a stripe image from which the code sequence of each stripe can be determined.
That is, a pre-designed color-coded stripe image is projected onto the target object (e.g. teeth or gums) by the projection device 10, and the image capture device 20 is controlled to rapidly capture images of the target object carrying the projected pattern. The cameras 21 included in the image capture device 20 respectively acquire different stripe images; for example, camera A is a color camera and acquires a color stripe image, while camera B is a black-and-white camera and acquires a black-and-white stripe image. The color stripe image and the black-and-white stripe image are then transmitted to the computer terminal, which uses the color stripe image as the coding information and the black-and-white stripe image as the reconstruction image to obtain the three-dimensional shape of the target object.
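The patent does not give the decoding algorithm itself; the following is a minimal sketch of spatial decoding under the common assumption of a de Bruijn-style code, in which every window of three consecutive stripe colors occurs exactly once in the projected sequence (the code sequence used here is hypothetical):

```python
# Hypothetical projected color code: every 3-color window is unique,
# so observing any 3 consecutive stripes pins down their position.
PROJECTED_CODE = ["R", "G", "B", "R", "B", "G", "R", "R", "B"]

def stripe_index(observed_window, code=PROJECTED_CODE):
    """Return the code index of the first stripe of an observed window of
    consecutive stripe colors, or None if the window is not found
    (e.g. stripes lost to occlusion or misdetection)."""
    n = len(observed_window)
    for i in range(len(code) - n + 1):
        if code[i:i + n] == list(observed_window):
            return i
    return None
```

A real decoder would also handle broken or merged stripes; this sketch only shows the window-lookup principle of spatial coding.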
It should be noted that the black-and-white camera has a higher imaging resolution than the color camera, so if the image capturing device 20 captured the stripe image with only one color camera, the resolution might be low. To avoid the situation where low resolution makes three-dimensional reconstruction difficult, in the above embodiment the image acquisition device 20 includes a plurality of cameras 21, at least one of which is a black-and-white camera, and the black-and-white stripe image with the higher imaging resolution is used as the reconstruction map to acquire the three-dimensional topography of the target object. The cameras 21 included in the image capturing device 20 may, for example, be CCD cameras. Assuming that the color-coded stripes corresponding to the preset light consist of stripe codes of two colors (e.g. red and blue), the image acquisition device 20 acquires different stripe images through different CCD cameras: the color CCD camera acquires a stripe image containing both red and blue, while the black-and-white CCD camera acquires a stripe image containing only blue (a blue filter being disposed in front of it). The stripe image acquired by the color CCD camera is used to identify and match the sequence code of each blue stripe, and the three-dimensional reconstruction algorithm and stitching-fusion algorithm are then run on the acquired sequence codes and the stripe image acquired by the black-and-white CCD camera to construct the three-dimensional shape of the target object.
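One possible step of such a pipeline is locating stripe centers in a row of the black-and-white camera's image. The run-detection sketch below is illustrative only (the threshold and profile values are invented, and real systems use sub-pixel methods):

```python
def stripe_centers(profile, threshold=128):
    """Given a 1-D intensity profile of one image row from the black-and-white
    camera, return the center column of each bright (stripe) run."""
    centers, start = [], None
    for i, v in enumerate(profile):
        if v >= threshold and start is None:
            start = i                       # a bright run begins
        elif v < threshold and start is not None:
            centers.append((start + i - 1) / 2.0)  # run ended at i-1
            start = None
    if start is not None:                   # run reaching the row's end
        centers.append((start + len(profile) - 1) / 2.0)
    return centers
```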
It should be noted that CCD cameras are small, light, unaffected by magnetic fields, and resistant to vibration and impact. Therefore, when the three-dimensional scanner uses CCD cameras to acquire the stripe images, its volume can be correspondingly reduced, making it convenient for handheld use and applicable to scanning environments with little space (such as the oral cavity).
It should be noted that the black-and-white CCD camera may optionally be provided with a color optical filter 23; this is not specifically limited in the embodiment of the present application. However, if an optical filter 23 of a specified color is arranged in front of the black-and-white CCD camera, it acquires a stripe image of only that specified color, which is more favorable for the subsequent three-dimensional reconstruction and stitching-fusion algorithms that construct the three-dimensional shape of the target object.
It should be noted that: the form of the camera is not particularly limited in the present application, and technicians may make corresponding replacements according to technical requirements, for example, the camera may be a CCD camera or a CMOS camera.
Optionally, in the three-dimensional scanner provided in the embodiment of the present application, the light-sensitive wave band configured for the image capturing device 20 at least includes a plurality of specified wave bands corresponding to the stripe colors contained in the color-coded stripes. That is, in an alternative example, a color camera is disposed in the image capturing device 20 and can capture the multiple stripe colors of the color-coded stripes corresponding to the preset light, so as to determine each stripe sequence. The specified wave band described in the present application may be one specified wave band or a plurality of specified wave bands.
In addition, as shown in fig. 3, the three-dimensional scanner may further include an illuminating member 30 used to illuminate the target object so that a texture map of the target object can subsequently be acquired. The illuminating member 30 is preferably a white LED lamp, enabling true-color scanning, that is, acquiring a three-dimensional model whose color is consistent or substantially consistent with that of the target object. The illuminating member 30 may be disposed at the outer circumference of the reflective mirror 40, or at another position of the scanner and configured to cooperate with the reflective mirror 40 so that the illumination light is reflected to the target object through the mirror; for example, the illuminating member 30 may be disposed on the side of the first imaging lens 13 close to the light source emitter 11, so that both the illumination light and the light projected by the light source emitter 11 pass through the first imaging lens 13 and are reflected to the target object by the reflective mirror 40. Specifically, the three-dimensional scanner includes a holding portion and an entrance portion disposed at the front end of the holding portion; the projection device 10 and the image capture device 20 are both mounted on the holding portion, the reflective mirror 40 is mounted on the entrance portion, and the illuminating member 30 may be mounted on either the entrance portion or the holding portion.
It should be noted that the image capturing device 20 can identify and determine red light, green light, and blue light, so that the image capturing device 20 can capture a texture map of the target object based on the illumination light.
Optionally, the three-dimensional scanner provided in the embodiment of the present application may further include a timing control circuit connected to the projection device 10, the illuminating member 30 and the image acquisition device 20. The timing control circuit controls the projection device 10 to project light onto the target object while synchronously controlling the image acquisition device 20 to acquire a plurality of stripe images, and controls the illuminating member 30 to illuminate the target object while synchronously controlling the image acquisition device 20 to acquire the texture map. Preferably, the timing control circuit controls the projection device 10 and the illuminating member 30 to project light onto the target object alternately.
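A minimal model of this alternating timing (the function and phase names are hypothetical, invented only to illustrate the alternation) could be:

```python
from itertools import cycle

def frame_schedule(n_frames):
    """Alternate between stripe projection (projection device + stripe
    capture) and white-light illumination (illuminating member + texture
    capture), as the timing control circuit is described to do."""
    phases = cycle(["project_stripes", "illuminate_texture"])
    return [next(phases) for _ in range(n_frames)]

schedule = frame_schedule(4)
```

In hardware this alternation would be driven by trigger signals to the projector, LED and camera shutters rather than by software, so this is only a scheduling sketch.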
Optionally, in the three-dimensional scanner provided in the embodiment of the present application, the image capturing device 20 further includes a light beam processing device having a light-entering portion and at least two light-exiting portions, where each camera 21 is arranged opposite a different light-exiting portion, and the image capturing device 20 collects the light modulated by the target object through the light beam processing device.
That is, the image capturing device 20 is provided with the light beam processing device so that the stripe images acquired by the plurality of cameras 21 have exactly the same field of view and viewing angle; in other words, the cameras 21 all receive coaxial light incident through the same second imaging lens 26, the coaxial light being projected to each of them. Specifically, as shown in fig. 4, the image light of the target object enters through the light-entering portion of the light beam processing device; the device splits this light so that it exits from the at least two light-exiting portions and is projected to the cameras 21. The stripe images acquired by the cameras 21 are therefore all acquired from the same viewing angle and based on the same modulated color-coded stripes, so the stripe sequences in the different images are correlated, which facilitates the subsequent three-dimensional reconstruction algorithm.
In an optional example, the light beam processing apparatus further includes at least one first light beam separation unit configured to split the light projected from the light-entering portion so that it is projected from the at least two light-exiting portions to the corresponding cameras 21. Specifically, the first light beam separation unit splits the light of each color into two directions; for example, one red beam and one blue beam are processed by the first light beam separation unit into two red beams and two blue beams, emitted in different directions.
That is, at least one first light beam separation unit is arranged in the light beam processing device, and the first light beam separation unit is used for performing light splitting processing on light projected from the light inlet part, so that image light of the target object can be projected from the at least two light outlet parts respectively, and the cameras 21 correspondingly arranged on the at least two light outlet parts can acquire fringe images at the same visual angle.
In another optional example, the light beam processing apparatus further includes at least one second light beam separation unit configured to separate the light to be acquired by the designated camera, so that each camera acquires the light containing its specified wave band. Specifically, the second light beam separation unit either splits off light of one partial wave band, which exits in one direction, or splits off light of two partial wave bands, which exit in different directions. For example, a beam of red and blue light processed by the second light beam separation unit may yield a blue beam emitted in one direction, or a red beam and a blue beam emitted in different directions. The color-coded stripes contain stripes of the colors corresponding to the specified wave bands.
That is, at least one second light beam separation unit is disposed in the light beam processing apparatus; it separates the light projected onto it so that light of one partial wave band passes through it while light of another partial wave band is reflected from its surface (or absorbed by it), allowing the designated camera to acquire the light containing the specified wave band.
It should be noted that: the designated camera is the black and white camera.
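As a toy model of the second light beam separation unit (the set-based representation is purely illustrative, not an optical simulation):

```python
def split_bands(incoming, transmitted_bands):
    """Model the second light beam separation unit: wave bands in
    `transmitted_bands` pass through; all other incoming bands are
    reflected (or absorbed)."""
    transmitted = {b for b in incoming if b in transmitted_bands}
    reflected = set(incoming) - transmitted
    return transmitted, reflected

# A red+blue beam hitting a unit that transmits only blue:
t, r = split_bands({"red", "blue"}, {"blue"})
```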
Optionally, in the three-dimensional scanner provided in the embodiment of the present application, the three-dimensional scanner may further include: a heat dissipation system, a heating antifogging system, a software algorithm system and the like.
The heat dissipation system prevents the interior of the three-dimensional scanner from overheating, which would damage the scanner.
The heating anti-fog system prevents the optical components in the three-dimensional scanner from fogging, which would make it impossible to acquire accurate stripe images.
Wherein, the software algorithm system is configured to perform three-dimensional reconstruction on the target object according to the at least one fringe image acquired by the image acquisition device 20.
In order to make the technical solutions of the present application more clearly understood by those skilled in the art, the following description will be given with reference to specific embodiments.
The first embodiment is as follows:
Taking fig. 1 as an example, the light beam processing device includes a semi-reflective semi-transmissive prism 22 with a first light-emitting portion and a second light-emitting portion. The light beam processing device transmits and reflects light through the semi-reflective semi-transmissive prism 22, thereby splitting the light projected from the light-entering portion so that it is projected from the first and second light-emitting portions to the cameras 21 arranged opposite each of them. The image capturing device 20 further includes a first camera 211 arranged opposite the first light-emitting portion and a second camera 212 arranged opposite the second light-emitting portion, where the first camera 211 generates a first stripe image and the second camera 212 generates a second stripe image from the collected light; the first and second stripe images contain stripes of at least two colors, and those stripes are identifiable.
In addition, the light beam processing device further includes an optical filter 23, through which the light to be acquired by the designated camera is separated so that the designated camera acquires light containing a fifth filter wave band, where at least one of the plurality of cameras is the designated camera.
Specifically, the optical filter 23 is disposed between the first light emitting portion and the first camera 211 so that the first camera 211 acquires light in the fifth filter wavelength band, and/or disposed between the second light emitting portion and the second camera 212 so that the second camera 212 acquires light in the fifth filter wavelength band.
It should be noted that the following description takes as an example the case where the optical filter 23 is disposed between the first light-emitting portion and the first camera 211, so that the first camera 211 acquires light in the fifth filter wave band. The stripes of two colors included in the first stripe image are black stripes and white stripes, where the white stripes correspond to the stripes in the color-coded pattern whose color is the filter color of the optical filter 23.
At this time, at least one of the stripes of at least two colors included in the second stripe image is a filter color corresponding to the filter 23, so that the second stripe image can identify the code sequence of the stripes included in the first stripe image.
Specifically, the first camera is a black-and-white camera and the second camera is a color camera, with the black-and-white camera arranged behind the optical filter 23. Taking as an example the projection device 10 projecting red-green-blue color-coded stripes (i.e. color-coded stripes including red, green and blue stripes), the optical filter 23 is preferably a blue filter. The projection device 10 projects the red-green-blue coded stripes onto the target object; after modulation by the target object they are transmitted to the image processing device and separated by the semi-reflective semi-transmissive prism 22, one beam being transmitted and one reflected. The blue light in one red-green-blue coded beam passes through the optical filter 23 and is collected by the black-and-white camera, which generates a first stripe image containing the blue stripes; the other red-green-blue coded beam is collected by the color camera, which generates a second stripe image containing the red, green and blue stripes. The second stripe image is used as the code map and the first stripe image as the reconstruction map; based on the correspondence between the stripes of the first and second stripe images, each stripe of the first stripe image can be identified and matched through the code sequence of the second stripe image, thereby realizing three-dimensional reconstruction.
Of course, the optical filter 23 in front of the black-and-white camera may also be omitted, in which case the first stripe image acquired by the black-and-white camera includes red stripes, green stripes, and blue stripes; alternatively, a two-color optical filter 23 may be placed in front of the black-and-white camera so that two of the red, green, and blue lights pass through and are collected by it. Where a monochromatic filter 23 is placed in front of the black-and-white camera, only the specified light passes through, so the stripes in the first stripe image collected by the black-and-white camera can be identified and determined; for example, the color camera generates a second stripe image containing red stripes, a blue stripe in the first stripe image corresponds to a blue stripe in the red-green-blue color-coded stripes, and a red stripe in the second stripe image corresponds to a red stripe in those color-coded stripes. The first and second stripe images can then be combined to determine the coding sequence of each stripe: both serve as coding images, and the first stripe image serves as the reconstruction image. Alternatively, a two-color filter 23 may be placed in front of the color camera; taking a red-green filter 23 in front of the color camera as an example, the color camera generates a second stripe image containing red stripes and green stripes, both stripe images (or only the second stripe image) serve as coding images, and the first stripe image serves as the reconstruction image.
In some embodiments, the image capture device 20 can identify and determine only two of red light, green light, and blue light; in those embodiments, the image capture device 20 cannot fully acquire texture data of the target object under white light. In other embodiments, the image capture device 20 can identify and determine red light, green light, and blue light, and can therefore fully acquire texture data of the target object under white light, enabling acquisition of color three-dimensional data.
It is worth emphasizing that: in this embodiment, the light beam processing device transmits and reflects light through the half-reflecting and half-transmitting prism 22 to perform light splitting processing on the light projected from the light inlet portion, so that the light is projected from the first light outlet portion and the second light outlet portion to the cameras corresponding to the respective light outlet portions; that is, the light beam processing apparatus realizes the function corresponding to the first light beam splitting unit through the half-reflecting and half-transmitting prism 22.
At the same time, it is also worth emphasizing: in this embodiment, the light beam processing device performs separation processing on the light to be acquired by the designated camera through the optical filter 23, so that the designated camera acquires the light containing the designated wavelength band; that is, the beam processing apparatus realizes the function corresponding to the second beam splitting unit through the filter 23.
Example two:
taking fig. 5 as an example, the light beam processing device includes a right-angle two-channel dichroic prism 24, and the right-angle two-channel dichroic prism 24 includes a third light emitting portion and a fourth light emitting portion, wherein the light beam processing device implements a light splitting process on the light projected from the light inlet portion through the right-angle two-channel dichroic prism 24, so that the light is projected from the third light emitting portion and the fourth light emitting portion to the cameras 21 corresponding to the respective light emitting portions; correspondingly, the image capturing device 20 includes a third camera 213 disposed corresponding to the third light-emitting portion, and a fourth camera 214 disposed corresponding to the fourth light-emitting portion, the third camera 213 generates a third fringe image based on the collected light, the fourth camera 214 generates a fourth fringe image based on the collected light, and the third fringe image and the fourth fringe image include fringes of at least two colors, and the fringes of the at least two colors are identifiable;
in addition, the light beam processing apparatus further performs separation processing on light rays to be acquired by the designated camera through the right-angle two-channel dichroic prism 24, so that the designated camera acquires light rays containing a designated wavelength band, where the acquiring of the light rays containing the designated wavelength band by the designated camera includes: the third camera 213 acquires light in a first filter wavelength band and/or the fourth camera 214 acquires light in a second filter wavelength band.
It should be noted that the following description takes as an example the case in which the light beam processing device separates, through the right-angle two-channel dichroic prism 24, the light to be acquired by the third camera 213, so that the third camera 213 acquires light in a first filtering wavelength band. In this case, the two colors of stripes included in the third stripe image are black stripes and white stripes, where each white stripe corresponds to the stripe in the color-coded stripes whose color matches the corresponding filtering color.
At this time, at least one of the at least two colors of stripes included in the fourth stripe image is the corresponding filtering color, so that the fourth stripe image can be used to identify the coding sequence of the stripes included in the third stripe image.
Specifically, the third camera is a black-and-white camera and the fourth camera is a color camera. Taking the case in which the projection device 10 projects red-green-blue color-coded stripes (i.e., color-coded stripes comprising red stripes, green stripes, and blue stripes), the projection device 10 projects the stripes onto the target object; after being modulated by the target object, they are transmitted to the image capture device, where the right-angle two-channel dichroic prism 24 decomposes them into a blue component and a red-green component. The blue component is collected by the black-and-white camera, which generates a third stripe image containing blue stripes; the red-green component is collected by the color camera, which generates a fourth stripe image containing red stripes and green stripes. The blue stripes in the third stripe image correspond to the stripes in the fourth stripe image, and the third and fourth stripe images combined correspond to the red-green-blue color-coded stripes. The fourth stripe image serves as the coding image: since it is collected by a color camera, its red and green stripes can be identified and determined, so the coding sequence of each stripe in the fourth stripe image can be determined. The third stripe image serves as the reconstruction image: based on the correspondence between the third and fourth stripe images, each stripe of the third stripe image can be identified and matched through the coding sequence of the fourth stripe image, thereby realizing three-dimensional reconstruction.
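The separation performed by the right-angle two-channel dichroic prism can be pictured in software terms: from one RGB stripe image, the blue channel is what the black-and-white camera records, and the red-plus-green remainder is what the color camera records. A minimal sketch, modeled on per-pixel RGB triples (all names are ours, not the patent's):

```python
# Illustrative sketch: the two-channel dichroic prism splits the modulated
# RGB stripe light into a blue component (black-and-white camera) and a
# red+green component (color camera).

def split_two_channel(rgb_row):
    """rgb_row: list of (r, g, b) pixels from one scanline of the stripe image."""
    blue_view = [b for (_, _, b) in rgb_row]               # seen by the B/W camera
    red_green_view = [(r, g, 0) for (r, g, _) in rgb_row]  # seen by the color camera
    return blue_view, red_green_view

# One scanline crossing a red, a green, and a blue stripe
row = [(255, 0, 0), (0, 255, 0), (0, 0, 255)]
blue_view, rg_view = split_two_channel(row)
print(blue_view)  # [0, 0, 255] -> only the blue stripe is bright
print(rg_view)    # [(255, 0, 0), (0, 255, 0), (0, 0, 0)]
```

The blue view is exactly the "third stripe image" role (blue stripes on a dark background), while the red-green view plays the "fourth stripe image" role.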
Of course, since the black-and-white camera in this embodiment acquires only monochromatic light, the third stripe image can also be identified and determined; the third stripe image can be combined with the fourth stripe image to determine the coding sequence of each stripe, with both the third and fourth stripe images serving as coding images. In addition, the optical filter 23 may or may not be provided in this embodiment, and it may be provided in cooperation with the right-angle two-channel dichroic prism 24.
It is worth emphasizing that: in this embodiment, the light beam processing device performs a light splitting process on the light projected from the light inlet portion through the right-angle two-channel dichroic prism 24, so that the light is projected from the third light outlet portion and the fourth light outlet portion to the cameras 21 corresponding to the respective light outlet portions; that is, the light beam processing apparatus realizes the function corresponding to the first light beam splitting unit through the right-angle two-channel dichroic prism 24.
For the same reason, it is also worth emphasizing: in this embodiment, the light beam processing apparatus further performs separation processing on the light rays obtained by the designated camera through the right-angle two-channel dichroic prism 24, so that the designated camera obtains the light rays containing the designated wavelength band; that is, the light beam processing apparatus realizes the function corresponding to the second light beam splitting unit through the right-angle two-channel dichroic prism 24.
Example three:
taking fig. 6 as an example, the light beam processing device includes a three-channel dichroic prism 25, and the three-channel dichroic prism 25 includes a fifth light-emitting portion, a sixth light-emitting portion, and a seventh light-emitting portion, wherein the light beam processing device implements a light splitting process on the light projected from the light-entering portion through the three-channel dichroic prism 25, so that the light is projected to the cameras 21 respectively corresponding to the light-emitting portions from the fifth light-emitting portion, the sixth light-emitting portion, and the seventh light-emitting portion;
correspondingly, the image capturing device 20 includes a fifth camera 215 disposed corresponding to the fifth light-emitting portion, a sixth camera 216 disposed corresponding to the sixth light-emitting portion, and a seventh camera 217 disposed corresponding to the seventh light-emitting portion, the fifth camera 215 generates a fifth stripe image based on the captured light, the sixth camera 216 generates a sixth stripe image based on the captured light, the seventh camera 217 generates a seventh stripe image based on the captured light, and the fifth stripe image, the sixth stripe image, and the seventh stripe image include stripes of at least two colors and the stripes of at least two colors are identifiable;
the light beam processing device separates light rays obtained by the designated camera through the three-channel dichroic prism 25, so that the designated camera can obtain light rays containing a designated waveband, wherein the step of obtaining light rays containing the designated waveband by the designated camera at least comprises: the fifth camera 215 acquires light in a third filtering wavelength band, and the sixth camera 216 acquires light in a fourth filtering wavelength band, where the third filtering wavelength band is different from the fourth filtering wavelength band.
At least one of the fifth camera, the sixth camera, and the seventh camera is a black-and-white camera. Specifically, the fifth camera may be a black-and-white camera with the sixth and seventh cameras being color cameras; or the fifth and sixth cameras may be black-and-white cameras with the seventh camera being a color camera. Preferably, the fifth camera 215, the sixth camera 216, and the seventh camera 217 are all black-and-white cameras.
It should be noted that since the light-sensing wavelength bands of the image capture device 20 of the present application correspond one-to-one to the stripe colors included in the color-coded stripes, when the fifth camera 215, the sixth camera 216, and the seventh camera 217 are all black-and-white cameras, the color-coded stripes include three stripe colors, at least two of which correspond to the third filtering wavelength band and the fourth filtering wavelength band.
For example: the color coding stripes consist of red stripes, blue stripes and green stripes; at this time, the filtering color corresponding to the first filtering surface may be red, and the filtering color corresponding to the second filtering surface may be blue; at this time, the acquired fifth stripe image is a black and white stripe image, wherein the white stripe corresponds to a red stripe in the color coding stripe; the sixth stripe image is a black and white stripe image, wherein the white stripe corresponds to a blue stripe in the color-coded stripes.
For example: the color-coded stripes consist of red stripes, blue stripes, and yellow stripes. In this case, the filtering color corresponding to the first filtering surface may be red and the filtering color corresponding to the second filtering surface may be green. The fifth stripe image is then a black-and-white stripe image in which the white stripes correspond to the red stripes and the yellow stripes in the color-coded stripes (in optics, yellow light is the combination of green light and red light); the sixth stripe image is a black-and-white stripe image in which the white stripes correspond to the yellow stripes in the color-coded stripes.
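The filtering relationships in these examples can be summarized in a few lines: which stripes of a color code appear white to a monochrome camera behind a single-band filter, assuming ideal additive color (yellow = red + green). This is a hedged sketch with illustrative names, not the patent's calibration:

```python
# Which spectral channels each stripe color contains, under ideal additive RGB.
STRIPE_CHANNELS = {
    "red":    {"red"},
    "green":  {"green"},
    "blue":   {"blue"},
    "yellow": {"red", "green"},  # yellow light combines red and green light
}

def monochrome_view(stripes, filter_color):
    """Return 'white'/'black' per stripe as seen by a B/W camera behind the filter."""
    return ["white" if filter_color in STRIPE_CHANNELS[s] else "black"
            for s in stripes]

code = ["red", "blue", "yellow"]
print(monochrome_view(code, "red"))    # ['white', 'black', 'white']
print(monochrome_view(code, "green"))  # ['black', 'black', 'white']
```

The red-filter view reproduces the fifth stripe image of the red/blue/yellow example (red and yellow stripes white), and the green-filter view reproduces the sixth (only yellow stripes white).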
In an optional example, the light beam processing apparatus further performs, through the three-channel dichroic prism 25, separation processing on light to be obtained by a given camera, so that the seventh camera 217 obtains light in a sixth filtering wavelength band, where the sixth filtering wavelength band is different from the third filtering wavelength band and the fourth filtering wavelength band.
For example, the color-coded stripes consist of red, blue, and green stripes; at this time, the filtering color corresponding to the first filtering surface may be red, the filtering color corresponding to the second filtering surface may be blue, and the filtering color corresponding to the third filtering surface may be green; at this time, the seventh stripe image is a black and white stripe image, where the white stripe corresponds to the green stripe in the color-coded stripes.
At this time, any one of the fifth fringe image, the sixth fringe image, and the seventh fringe image may be used as a reconstruction map to perform three-dimensional reconstruction of the target object. For example, the fifth fringe image serves as the reconstruction map, while the fifth, sixth, and seventh fringe images collectively serve as the coding map to determine each fringe sequence. Preferably, the fifth, sixth, and seventh fringe images are all used as reconstruction maps.
it is worth emphasizing that: in this embodiment, the light beam processing device implements, through the three-channel dichroic prism 25, a light splitting process on the light projected from the light inlet portion, so that the light is projected from the fifth light outlet portion, the sixth light outlet portion, and the seventh light outlet portion to the cameras 21 corresponding to the respective light outlet portions; that is, the light beam processing apparatus realizes the function corresponding to the first light beam splitting unit through the three-channel dichroic prism 25.
For the same reason, it is also worth emphasizing: in this embodiment, the light beam processing apparatus further implements separation processing on the light rays obtained by the designated camera through the three-channel dichroic prism 25, so that the designated camera obtains the light rays containing the designated wavelength band; that is, the light beam processing apparatus realizes the function corresponding to the second light beam splitting unit through the three-channel dichroic prism 25.
It should be noted that embodiments one, two, and three are exemplary illustrations provided so that a person skilled in the art can more clearly understand the technical solution of the present application, and the present application is not specifically limited thereto. Any other specific device that satisfies the functional description of the light beam processing device in the present application may likewise serve as an implementing technical solution of the present application.
Further, it should be noted that, for example, in embodiments two and three, after the light beam processing device realizes the function corresponding to the second light beam splitting unit through the right-angle two-channel dichroic prism 24 or the three-channel dichroic prism 25, it may continue to realize the function corresponding to the second light beam splitting unit once more through an optical filter.
In summary, compared with the prior art, the invention has the following beneficial effects:
1. the stripe extraction algorithm based on spatial coding achieves the technical purpose that the target object can be three-dimensionally reconstructed from only one frame of two-dimensional image, reducing the required frame rate of the camera 21 and the computational cost of the algorithm;
2. color is used as the spatial coding information, so the coded information is easy to identify, achieving the technical effect of improved identification accuracy;
3. based on its technical principle, the three-dimensional scanner eliminates the need for dynamic projection and can perform pattern projection by simple transmissive projection; furthermore, when the three-dimensional scanner performs pattern projection by transmissive projection, the hardware cost is greatly reduced;
4. in the case where the three-dimensional scanner uses laser as a light source to perform pattern projection processing, the brightness and the depth of field of the projection apparatus 10 can be improved, and the technical effects of low cost, high brightness, and high depth of field can be achieved.
That is, the three-dimensional scanner provided by the present application has the advantages of low hardware cost, a low real-time frame-rate requirement, an optical system with high brightness and large depth of field, and device miniaturization. Moreover, the three-dimensional scanner can directly perform dynamic, real-time three-dimensional scanning with color texture on reflective, translucent, or light-diffusing materials such as the teeth and gums in the oral cavity.
According to an embodiment of the present application, a three-dimensional scanning system is provided. The three-dimensional scanning system includes:
the three-dimensional scanner, used for projecting light onto a target object and, while the target object is illuminated, collecting the light modulated by the target object to obtain at least one stripe image, wherein the projected light includes preset light projected in the form of color-coded stripes, and the color-coded stripes are coded with at least two stripe colors;
and the image processor, connected to the three-dimensional scanner, used for acquiring at least one stripe image collected by the three-dimensional scanner, using the stripe image as a coding image to determine each stripe sequence, and using it as a reconstruction image to perform three-dimensional reconstruction of the target object.
It should be noted that: the three-dimensional scanner included in the three-dimensional scanning system is the three-dimensional scanner provided in the embodiments of the present application.
Optionally, when the three-dimensional scanner collects the light modulated by the target object through a plurality of cameras to obtain at least one stripe image, and at least one of the plurality of cameras is a black-and-white camera, the image processor is further configured to: use a stripe image obtained by at least one black-and-white camera as a reconstruction image to perform three-dimensional reconstruction of the target object; and use stripe images obtained by a plurality of black-and-white cameras as coding images to determine each stripe sequence, and/or use a stripe image obtained by at least one color camera as a coding image to determine each stripe sequence.
The three-dimensional scanning system provided by the embodiment of the application projects light onto a target object through a three-dimensional scanner and, while the target object is illuminated, collects the light modulated by the target object to obtain at least one stripe image, wherein the projected light includes preset light projected in the form of color-coded stripes, and the color-coded stripes are coded with at least two stripe colors. The image processor, connected to the three-dimensional scanner, acquires at least one stripe image collected by the three-dimensional scanner, uses the stripe image as a coding image to determine each stripe sequence, and uses it as a reconstruction image to perform three-dimensional reconstruction of the target object. This solves the technical problems in the related art that existing three-dimensional reconstruction methods require high hardware cost, which hinders the popularization and application of three-dimensional scanning devices.
It should be noted that: the three-dimensional scanning system mentioned in the embodiment of the application is based on a stripe extraction algorithm of spatial coding to acquire the three-dimensional appearance of the target object. Therefore, the three-dimensional scanning system can realize the three-dimensional reconstruction of the target object only by one frame of two-dimensional image at least, thereby greatly reducing the frame rate of the camera and the operation cost of the algorithm and facilitating the popularization and the use of the three-dimensional scanning system; specifically, the three-dimensional scanning system does not need to use a camera with a higher frame rate, so that the volume of the camera required in the three-dimensional scanning system can be reduced to a certain extent, and the three-dimensional scanning system is more suitable for acquiring the three-dimensional appearance of the object in the oral cavity.
And based on the technical characteristic that the three-dimensional scanning system can realize the three-dimensional reconstruction of the target object by only one frame of two-dimensional image at least, the acquisition time difference between a reconstruction image and a texture image is greatly shortened, the time required for carrying out the three-dimensional reconstruction on the target object to carry out projection and shooting is reduced, and the three-dimensional scanning system is also more suitable for acquiring the three-dimensional appearance of the object in the oral cavity (the three-dimensional scanning system is convenient for carrying out handheld scanning).
In addition, the three-dimensional scanning system provided by the embodiment of the application utilizes the color as the space coding information, so that the technical effects that the coding information is easy to identify and the identification accuracy is improved are also realized.
In addition, the three-dimensional scanning system mentioned in the embodiment of the present application is based on a stripe extraction algorithm of spatial coding to obtain the three-dimensional topography of the target object, so that a technical effect of canceling the projection requirement of dynamic projection is also achieved.
The embodiment of the present application further provides a three-dimensional scanning method, and it should be noted that the three-dimensional scanning method of the embodiment of the present application is applied to the three-dimensional scanner provided in the embodiment of the present application. The following describes a three-dimensional scanning method provided in an embodiment of the present application.
Fig. 7 is a flowchart of a three-dimensional scanning method according to an embodiment of the present application. As shown in fig. 7, the three-dimensional scanning method includes:
step S701, projecting preset light rays to a target object in a color coding stripe mode;
step S703, collecting light modulated by the target object, and acquiring at least one fringe image based on the light, wherein the acquired fringe image is used as an encoding image to determine each fringe sequence, and is used as a reconstruction image to perform three-dimensional reconstruction on the target object;
step S705, determining the sequence of each stripe in a plurality of stripe images based on the coding graph;
step S707, three-dimensional reconstruction is performed on the reconstructed map based on the sequence, and three-dimensional data of the target object is acquired.
According to the three-dimensional scanning method provided by the embodiment of the application, preset light is projected onto the target object in the form of color-coded stripes; the light modulated by the target object is collected, and at least one stripe image is acquired based on that light, the acquired stripe image being used as a coding image to determine each stripe sequence and as a reconstruction image to perform three-dimensional reconstruction of the target object; the sequence of each stripe in the stripe images is determined based on the coding image; and three-dimensional reconstruction is performed on the reconstruction image based on the sequence to obtain three-dimensional data of the target object. This solves the technical problems in the related art that existing three-dimensional reconstruction methods require high hardware cost, which hinders the popularization and application of three-dimensional scanning devices.
It should be noted that: the three-dimensional scanning method mentioned in the embodiment of the application is based on a stripe extraction algorithm of space coding to acquire the three-dimensional appearance of the target object. Therefore, the three-dimensional scanning method can realize the three-dimensional reconstruction of the target object only by one frame of two-dimensional image at least, thereby greatly reducing the frame rate of the camera and the operation cost of the algorithm and facilitating the popularization and the use of the three-dimensional scanning method; specifically, the three-dimensional scanning method does not need to use a camera with a higher frame rate, so that the volume of the camera required in the three-dimensional scanning method can be reduced to a certain extent, and the three-dimensional scanning method is more suitable for acquiring the three-dimensional appearance of the object in the oral cavity.
And based on the technical characteristic that the three-dimensional reconstruction of the target object can be realized by only one frame of two-dimensional image at least, the acquisition time difference between a reconstruction image and a texture image is greatly shortened, the time required for the three-dimensional reconstruction of the target object to be projected and shot is reduced, and the three-dimensional scanning method is also more suitable for acquiring the three-dimensional shape of the object in the oral cavity (the three-dimensional scanning method is convenient for handheld scanning).
In addition, the three-dimensional scanning method provided by the embodiment of the application utilizes the color as the space coding information, so that the technical effects that the coding information is easy to identify and the identification accuracy is improved are also realized.
In addition, the three-dimensional scanning method mentioned in the embodiment of the present application is based on a stripe extraction algorithm of spatial coding to obtain the three-dimensional topography of the target object, so that a technical effect of canceling the projection requirement of dynamic projection is also achieved.
Optionally, in the three-dimensional scanning method provided in the embodiment of the present application, the three-dimensional scanning method further includes: acquiring texture data of the target object; and acquiring the color three-dimensional data of the target object based on the three-dimensional data and the texture data of the target object.
Optionally, in the three-dimensional scanning method provided in the embodiment of the present application, the three-dimensional scanning method further includes: projecting illumination light onto a target object and acquiring texture data of the target object based on the illumination light; and acquiring the color three-dimensional data of the target object based on the three-dimensional data and the texture data of the target object.
The texture data is acquired by a single camera or synthesized from data acquired by a plurality of cameras.
Specifically, in step S703, the light modulated by the target object is collected, and at least two stripe images are acquired based on the same beam of light, at least one of which is acquired by a black-and-white camera. The acquired stripe images are used as coding images to determine each stripe sequence and as reconstruction images to perform three-dimensional reconstruction of the target object; preferably, the stripe image acquired by the black-and-white camera is used as the reconstruction image.
Specifically, step S705 determines the sequence of each stripe in the stripe images based on the coding image; the coding sequence is determined from the arrangement information and color information of each stripe in the coding image. For example, if three stripes arranged red, green, red are coded with red = (1,0) and green = (0,1), the coding sequence is (1,0), (0,1), (1,0); if five stripes arranged red, blue, green, blue, red are coded with red = (1,0,0), green = (0,1,0), and blue = (0,0,1), the coding sequence is (1,0,0), (0,0,1), (0,1,0), (0,0,1), (1,0,0);
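The decoding rule of step S705 can be checked in a few lines. The codewords follow the text; the stripe orders here are illustrative examples of our own:

```python
def coding_sequence(stripes, codewords):
    """Arrangement information + color information -> coding sequence."""
    return [codewords[color] for color in stripes]

# Two-color code: red = (1,0), green = (0,1)
two_color = {"red": (1, 0), "green": (0, 1)}
print(coding_sequence(["red", "green", "red"], two_color))
# [(1, 0), (0, 1), (1, 0)]

# Three-color code: red = (1,0,0), green = (0,1,0), blue = (0,0,1)
three_color = {"red": (1, 0, 0), "green": (0, 1, 0), "blue": (0, 0, 1)}
print(coding_sequence(["red", "blue", "green", "blue", "red"], three_color))
# [(1, 0, 0), (0, 0, 1), (0, 1, 0), (0, 0, 1), (1, 0, 0)]
```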
specifically, in step S707, stripe matching is performed on each stripe of the reconstructed image based on the coding sequence, for binocular reconstruction, two image acquisition devices are provided in combination with the present embodiment, stripe matching is performed on the reconstructed images of the two image acquisition devices, point cloud reconstruction is performed after matching, and three-dimensional data of the target object is obtained, for monocular reconstruction, one image acquisition device is provided in combination with the present embodiment, stripe matching is performed on the reconstructed image of the image acquisition device and the preset light of the projection device, and point cloud reconstruction is performed after matching, and three-dimensional data of the target object is obtained.
Fig. 8 is a flowchart of a three-dimensional scanning method according to an embodiment of the present application. As shown in fig. 8, the three-dimensional scanning method includes:
step S801, acquiring a first image and a second image, wherein the first image and the second image are fringe images acquired based on the same light beam;
step S803, determining the coding sequence of each stripe based on the first image;
and step S805, performing stripe matching on the stripes of the second image based on the coding sequence, and realizing three-dimensional reconstruction to acquire three-dimensional data of the target object.
The three-dimensional scanning method further comprises the following steps:
in step S807, texture data is acquired, and color three-dimensional data of the target object is acquired based on the three-dimensional data and the texture data.
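Steps S801 to S807 can be sketched end to end as below. The helper functions are placeholders standing in for the decoding, stripe-matching/reconstruction, and texture-mapping operations described in the text; they are not a real API:

```python
def decode_codes(first_image):
    # Placeholder: in a real system this decodes stripe colors into code words.
    return first_image

def match_and_reconstruct(second_image, codes):
    # Placeholder: pair each stripe of the reconstruction map with its code word.
    return list(zip(second_image, codes))

def apply_texture(points, texture):
    # Placeholder: attach a texture value to every reconstructed point.
    return [(point, texture) for point in points]

def scan(first_image, second_image, texture_data=None):
    """End-to-end sketch of steps S801-S807 (placeholder operations)."""
    codes = decode_codes(first_image)                    # S803: coding sequence
    points = match_and_reconstruct(second_image, codes)  # S805: 3-D data
    if texture_data is not None:                         # S807: color 3-D data
        return apply_texture(points, texture_data)
    return points
```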
Preferably, the first image and the second image are acquired alternately with the texture data.
This is explained below with several specific examples:
the projection device projects red, green and blue color coding stripes to a target object at a first moment, the red, green and blue color coding stripes are transmitted to the image processing device after being modulated by the target object, the red, green and blue color coding stripes are separated into two red, green and blue color coding stripes by a semi-reflecting and semi-transmitting prism, one red, green and blue color coding stripe is collected by a color camera, the color camera generates a corresponding red, green and blue color coding stripe image, the other red, green and blue color coding stripe is collected by a black and white camera through a blue filter, the black and white camera generates a corresponding blue stripe image, the illumination piece irradiates white light to the target object at a second moment, the white light is collected by the color camera after being reflected by the target object, the color camera generates a texture map, the coding sequence of each stripe is determined based on the red, green and blue color coding stripe image, the stripe matching, and acquiring true color three-dimensional data of the target object based on the three-dimensional data and the texture map.
At a first moment, the projection device projects red-green-blue color-coded stripes onto the target object. After modulation by the target object, the stripes are transmitted to the image acquisition device, where a half-reflecting, half-transmitting prism splits them into two beams; one beam is collected by the color camera, which generates a corresponding red-green-blue coded-stripe image. At a third moment, the projection device projects blue coded stripes onto the target object; after modulation by the target object, these stripes pass in turn through the half-reflecting, half-transmitting prism and a blue filter and are collected by the black-and-white camera, which generates a corresponding blue stripe image, the blue coded stripes corresponding to the blue stripes within the red-green-blue coded stripes. The color camera also generates a texture map. The coding sequence of each stripe is determined based on the red-green-blue coded-stripe image, stripe matching is performed on each stripe of the blue stripe image based on the coding sequence to realize three-dimensional reconstruction and obtain three-dimensional data of the target object, and true-color three-dimensional data of the target object is acquired based on the three-dimensional data and the texture map.
At a first moment, the projection device projects red-green-blue color-coded stripes onto the target object. After modulation by the target object, the stripes are transmitted to the image acquisition device, where a right-angle two-channel dichroic prism decomposes them into red-and-green stripes and blue stripes. The red-and-green stripes are collected by the color camera, which generates a corresponding red-and-green stripe image; the blue stripes are collected by the black-and-white camera, which generates a corresponding blue stripe image. At a second moment, the illumination member irradiates the target object with white light; after reflection by the target object, the white light is collected by both the color camera and the black-and-white camera, the color camera generating a texture map based on the red and green light and the black-and-white camera generating a texture map based on the blue light. The coding sequence of each stripe is determined based on the red-and-green stripe image, and stripe matching is performed on each stripe of the blue stripe image based on the coding sequence to realize three-dimensional reconstruction and obtain three-dimensional data of the target object. A white-light texture map is synthesized from the texture maps of the color camera and the black-and-white camera, and true-color three-dimensional data of the target object is acquired based on the three-dimensional data and the white-light texture map.
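The white-light texture synthesis at the end of this example can be sketched as a simple channel merge. This is a hypothetical sketch (names are not from the patent), assuming the two cameras are pixel-aligned by the prism and channel gains are already calibrated:

```python
import numpy as np

def synthesize_white_texture(color_texture, blue_texture):
    """Merge the color camera's texture (red/green channels valid) with the
    black-and-white camera's blue-band texture into one RGB texture map."""
    rgb = np.asarray(color_texture, dtype=np.uint8).copy()
    # The dichroic prism sent the blue band to the mono camera, so its image
    # replaces the (empty) blue channel of the color camera's texture.
    rgb[..., 2] = blue_texture
    return rgb
```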
At a first moment, the projection device projects red-green-blue color-coded stripes onto the target object. After modulation by the target object, the stripes are transmitted to the image acquisition device, where a three-channel dichroic prism decomposes them into red stripes, green stripes and blue stripes. The red stripes are collected by a first black-and-white camera, which generates a corresponding red stripe image; the green stripes are collected by a second black-and-white camera, which generates a corresponding green stripe image; and the blue stripes are collected by a third black-and-white camera, which generates a corresponding blue stripe image. At a second moment, the illumination member irradiates the target object with white light; after reflection by the target object, the white light is collected by the three black-and-white cameras, the first camera generating a texture map based on the red light, the second a texture map based on the green light, and the third a texture map based on the blue light. The coding sequence of each stripe is determined based on the combination of the red stripe image, the green stripe image and the blue stripe image, and stripe matching is performed on each stripe of the red, green and blue stripe images based on the coding sequence to realize three-dimensional reconstruction and obtain three-dimensional data of the target object. A white-light texture map is synthesized from the texture maps of the three black-and-white cameras, and true-color three-dimensional data of the target object is acquired based on the three-dimensional data and the white-light texture map.
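With the three-channel prism, recombining the three single-channel images restores a full color image, which can serve as the encoding map for the stripe frames and as the synthesized white-light texture map for the illumination frames. A sketch (hypothetical name, assuming the three cameras are pixel-aligned by the prism):

```python
import numpy as np

def combine_channels(red_img, green_img, blue_img):
    """Stack the three black-and-white cameras' single-channel images
    into one RGB image of shape (H, W, 3)."""
    return np.stack([red_img, green_img, blue_img], axis=-1)
```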
At a first moment, the projection device projects green-blue color-coded stripes onto the target object. After modulation by the target object, the stripes are transmitted to the image acquisition device, where a three-channel dichroic prism decomposes them into green stripes and blue stripes. The green stripes are collected by the second black-and-white camera, which generates a corresponding green stripe image; the blue stripes are collected by the third black-and-white camera, which generates a corresponding blue stripe image. At a second moment, the illumination member irradiates the target object with white light; after reflection by the target object, the white light is collected by the three black-and-white cameras, the first camera generating a texture map based on the red light, the second based on the green light, and the third based on the blue light. The coding sequence of each stripe is determined based on the combination of the green stripe image and the blue stripe image, and stripe matching is performed on each stripe of the green stripe image and the blue stripe image based on the coding sequence to realize three-dimensional reconstruction and obtain three-dimensional data of the target object. A white-light texture map is synthesized from the texture maps of the three black-and-white cameras, and true-color three-dimensional data of the target object is acquired based on the three-dimensional data and the white-light texture map.
It should be noted that the steps illustrated in the flowcharts of the figures may be performed in a computer system such as a set of computer-executable instructions and that, although a logical order is illustrated in the flowcharts, in some cases, the steps illustrated or described may be performed in an order different than presented herein.
An embodiment of the present invention provides a storage medium on which a program is stored, the program implementing the three-dimensional scanning method when executed by a processor.
The embodiment of the invention provides a processor, which is used for running a program, wherein the three-dimensional scanning method is executed when the program runs.
It should also be noted that the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises the element.
It should also be noted that: each functional unit in the embodiments of the present invention may be integrated into one physical unit, or each unit may exist alone physically, or two or more units are integrated into one unit.
As will be appreciated by one skilled in the art, embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The above are merely examples of the present application and are not intended to limit the present application. Various modifications and changes may occur to those skilled in the art. Any modification, equivalent replacement, improvement, etc. made within the spirit and principle of the present application should be included in the scope of the claims of the present application.
In addition, in the above embodiments of the present invention, the description of each embodiment has a respective emphasis, and for parts that are not described in detail in a certain embodiment, reference may be made to the related description of other embodiments.

Claims (18)

1. A three-dimensional scanner, comprising:
the projection device (10) is used for projecting light rays onto a target object, wherein the light rays comprise preset light rays projected in the form of color coding stripes, and the color coding stripes are formed by at least two color stripe codes;
an image acquisition device (20) for acquiring light modulated by the target object to acquire at least one fringe image in a case where the target object is projected with light by the projection device (10), wherein the acquired fringe image serves as an encoding map to determine each fringe sequence, and serves as a reconstruction map to three-dimensionally reconstruct the target object.
2. The three-dimensional scanner according to claim 1, wherein the image acquisition device (20) further comprises a plurality of cameras (21), the plurality of cameras (21) comprising at least one black-and-white camera, wherein the image acquisition device (20) acquires the light modulated by the target object through the plurality of cameras (21) to obtain a plurality of fringe images, the fringe image obtained by at least one black-and-white camera is used as a reconstruction map to three-dimensionally reconstruct the target object, and at least the fringe images obtained by a plurality of black-and-white cameras are used as encoding maps to determine each fringe sequence and/or at least the fringe image obtained by one color camera is used as an encoding map to determine each fringe sequence.
3. The three-dimensional scanner according to claim 2, wherein the image acquisition device (20) further comprises a light beam processing device, the light beam processing device comprising a light entrance portion and at least two light exit portions, wherein each camera (21) is arranged corresponding to a different light exit portion, and the image acquisition device (20) is configured to acquire the light modulated by the target object through the light beam processing device.
4. The three-dimensional scanner according to claim 3, wherein the light beam processing device further comprises at least one first light beam splitting unit, and the first light beam splitting unit is configured to split the light beams projected from the light inlet portion, so that the light beams are projected from the at least two light outlet portions to the cameras (21) corresponding to the light outlet portions respectively.
5. The three-dimensional scanner according to claim 4, wherein the light beam processing device further comprises at least one second light beam splitting unit, the second light beam splitting unit being configured to separate the light to be acquired by a designated camera, so that the designated camera acquires light containing a designated waveband, wherein the color-coded stripes comprise stripes of a color corresponding to the designated waveband.
6. The three-dimensional scanner according to claim 5, wherein the designated camera is the black and white camera.
7. The three-dimensional scanner of claim 5,
the light beam processing device comprises a right-angle two-channel dichroic prism (24), and the right-angle two-channel dichroic prism (24) comprises a third light-emitting part and a fourth light-emitting part, wherein the light beam processing device realizes light splitting processing on light rays projected from the light-entering part through the right-angle two-channel dichroic prism (24), so that the light rays are projected to cameras (21) which are correspondingly arranged on the light-emitting parts respectively from the third light-emitting part and the fourth light-emitting part;
the image acquisition device (20) comprises a third camera (213) arranged corresponding to the third light-emitting part and a fourth camera (214) arranged corresponding to the fourth light-emitting part, the third camera (213) generates a third fringe image based on the acquired light, the fourth camera (214) generates a fourth fringe image based on the acquired light, and the third fringe image and the fourth fringe image comprise fringes with at least two colors and the fringes with at least two colors can be identified;
the light beam processing device realizes, through the right-angle two-channel dichroic prism (24), separation of the light to be acquired by the designated camera, so that the designated camera acquires light containing the designated waveband, wherein the designated camera acquiring light containing the designated waveband comprises: the third camera (213) acquiring light of a first filter waveband and/or the fourth camera (214) acquiring light of a second filter waveband.
8. The three-dimensional scanner of claim 5,
the light beam processing device comprises a three-channel dichroic prism (25), the three-channel dichroic prism (25) comprises a fifth light-emitting part, a sixth light-emitting part and a seventh light-emitting part, and the light beam processing device realizes light splitting processing on light rays projected from the light-entering part through the three-channel dichroic prism (25), so that the light rays are projected to cameras (21) which are correspondingly arranged on the light-emitting parts respectively from the fifth light-emitting part, the sixth light-emitting part and the seventh light-emitting part;
the image acquisition device (20) comprises a fifth camera (215) arranged corresponding to the fifth light-emitting part, a sixth camera (216) arranged corresponding to the sixth light-emitting part, and a seventh camera (217) arranged corresponding to the seventh light-emitting part, wherein the fifth camera (215) generates a fifth stripe image based on the acquired light, the sixth camera (216) generates a sixth stripe image based on the acquired light, the seventh camera (217) generates a seventh stripe image based on the acquired light, and the fifth stripe image, the sixth stripe image and the seventh stripe image comprise stripes with at least two colors and the stripes with at least two colors are identifiable;
the light beam processing device realizes, through the three-channel dichroic prism (25), separation of the light to be acquired by the designated camera, so that the designated camera acquires light containing the designated waveband, wherein the designated camera acquiring light containing the designated waveband at least comprises: the fifth camera (215) acquiring light of a third filter waveband and the sixth camera (216) acquiring light of a fourth filter waveband, the third filter waveband being different from the fourth filter waveband.
9. The three-dimensional scanner of claim 5,
the light beam processing device comprises a semi-reflective and semi-transparent prism (22), the semi-reflective and semi-transparent prism (22) comprises a first light-emitting part and a second light-emitting part, and the light beam processing device realizes light splitting processing on light rays projected from the light-inlet part through the semi-reflective and semi-transparent prism (22), so that the light rays are projected to cameras (21) which are arranged corresponding to the light-emitting parts respectively from the first light-emitting part and the second light-emitting part;
the image acquisition device (20) comprises a first camera (211) arranged corresponding to the first light-emitting portion and a second camera (212) arranged corresponding to the second light-emitting portion, the first camera (211) generates a first stripe image based on acquired light, the second camera (212) generates a second stripe image based on acquired light, and the first stripe image and the second stripe image comprise stripes of at least two colors and the stripes of the at least two colors are identifiable.
10. The three-dimensional scanner according to any one of claims 1-9, wherein the light beam processing device further comprises an optical filter (23),
the light beam processing device separating, through the optical filter (23), the light acquired by the designated camera, so that the designated camera acquires light containing a fifth filter waveband, at least one of the plurality of cameras (21) being the designated camera.
11. The three-dimensional scanner according to any one of claims 1-9, further comprising an illuminator (30), wherein the image acquisition device (20) is further configured to acquire the illumination light reflected by the target object to acquire the texture data of the target object when the target object is illuminated by the illuminator (30).
12. The three-dimensional scanner according to claim 11, wherein the image acquisition device (20) identifies red, green and blue light.
13. A three-dimensional scanning system, comprising:
the three-dimensional scanner is used for projecting light onto a target object and, while the target object is projected with the light, collecting the light modulated by the target object to obtain at least one stripe image, wherein the projected light comprises preset light projected in the form of color-coded stripes, and the color-coded stripes are formed by at least two color stripe codes;
the image processor is connected with the three-dimensional scanner and is used for acquiring the at least one stripe image collected by the three-dimensional scanner, using the stripe image as an encoding map to determine each stripe sequence, and using it as a reconstruction map to three-dimensionally reconstruct the target object;
wherein the three-dimensional scanner is the three-dimensional scanner of any one of claims 1-12.
14. The three-dimensional scanning system of claim 13, wherein in the case where the three-dimensional scanner acquires light modulated by the target object through a plurality of cameras to obtain at least one fringe image, and at least one of the plurality of cameras comprises a black and white camera, the image processor is further configured to:
taking a stripe image obtained by at least one black-and-white camera as a reconstruction image to carry out three-dimensional reconstruction on the target object;
at least the stripe images obtained by a plurality of black-and-white cameras are used as encoding maps to determine each stripe sequence, and/or the stripe image obtained by at least one color camera is used as an encoding map to determine each stripe sequence.
15. A three-dimensional scanning method, which is applied to the three-dimensional scanner according to any one of claims 1 to 12, the three-dimensional scanning method comprising:
projecting preset light rays to a target object in a color coding stripe mode;
acquiring light modulated by the target object, and acquiring at least one fringe image based on the light, wherein the acquired fringe image is used as an encoding graph to determine each fringe sequence and is used as a reconstruction graph to perform three-dimensional reconstruction on the target object;
determining a sequence of stripes in the plurality of stripe images based on the coding pattern;
and performing three-dimensional reconstruction on the reconstruction map based on the sequence to acquire three-dimensional data of the target object.
16. The three-dimensional scanning method of claim 15, further comprising:
projecting illumination light onto a target object and acquiring texture data of the target object based on the illumination light;
and acquiring the color three-dimensional data of the target object based on the three-dimensional data and the texture data of the target object.
17. A three-dimensional scanning method, which is applied to the three-dimensional scanner according to any one of claims 1 to 12, the three-dimensional scanning method comprising:
acquiring a first image and a second image, wherein the first image and the second image are fringe images acquired based on the same light beam;
determining a coded sequence of stripes based on the first image;
and performing stripe matching on the stripes of the second image based on the coding sequence, and realizing three-dimensional reconstruction to obtain three-dimensional data of the target object.
18. The three-dimensional scanning method of claim 17, further comprising:
and acquiring texture data, and acquiring color three-dimensional data of the target object based on the three-dimensional data and the texture data.
CN201911018729.0A 2019-10-24 2019-10-24 Three-dimensional scanner and three-dimensional scanning method Active CN112710253B (en)

Priority Applications (8)

Application Number Priority Date Filing Date Title
CN201911018729.0A CN112710253B (en) 2019-10-24 2019-10-24 Three-dimensional scanner and three-dimensional scanning method
KR1020227017511A KR20220084402A (en) 2019-10-24 2020-10-26 3D Scanners and 3D Scanning Methods
US17/771,470 US12007224B2 (en) 2019-10-24 2020-10-26 Three-dimensional scanner and three-dimensional scanning method
JP2022524057A JP7298025B2 (en) 2019-10-24 2020-10-26 Three-dimensional scanner and three-dimensional scanning method
EP20878731.7A EP4050302A4 (en) 2019-10-24 2020-10-26 Three-dimensional scanner and three-dimensional scanning method
AU2020371142A AU2020371142B2 (en) 2019-10-24 2020-10-26 Three-dimensional scanner and three-dimensional scanning method
PCT/CN2020/123684 WO2021078300A1 (en) 2019-10-24 2020-10-26 Three-dimensional scanner and three-dimensional scanning method
CA3158933A CA3158933A1 (en) 2019-10-24 2020-10-26 Three-dimensional scanner and three-dimensional scanning method


Publications (2)

Publication Number Publication Date
CN112710253A true CN112710253A (en) 2021-04-27
CN112710253B CN112710253B (en) 2023-06-06

Family

ID=75540321



Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114268703A (en) * 2021-12-27 2022-04-01 安徽淘云科技股份有限公司 Imaging adjusting method and device during screen scanning, storage medium and equipment
CN114521982A (en) * 2022-02-21 2022-05-24 资阳联耀医疗器械有限责任公司 Intraoral scanner, intraoral scanning implementation method and storage medium
CN116982940A (en) * 2023-09-26 2023-11-03 北京朗视仪器股份有限公司 Oral cavity scanning system and method
CN116408575B (en) * 2021-12-31 2024-06-04 广东美的白色家电技术创新中心有限公司 Method, device and system for locally scanning and eliminating workpiece reflection interference

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1576779A (en) * 2003-06-30 2005-02-09 韦尔豪泽公司 Method and system for three-dimensionally imaging an apical dome of a plant
CN102494609A (en) * 2011-11-18 2012-06-13 李志扬 Three-dimensional photographing process based on laser probe array and device utilizing same
CN105407344A (en) * 2014-09-09 2016-03-16 深圳市绎立锐光科技开发有限公司 Stereo image projection device and stereoscopic display glasses
WO2017203756A1 (en) * 2016-05-26 2017-11-30 Ckd株式会社 Three-dimensional-measurement device
CN109283186A (en) * 2018-10-12 2019-01-29 成都精工华耀科技有限公司 A kind of double spectrum two-dimensionals of track visualization inspection and three-dimensional fusion imaging system
CN109489583A (en) * 2018-11-19 2019-03-19 先临三维科技股份有限公司 Projection arrangement, acquisition device and the 3 D scanning system with it
CN110381300A (en) * 2018-04-13 2019-10-25 豪威科技股份有限公司 There are four the imaging systems of imaging sensor for tool
TW201944771A (en) * 2018-04-09 2019-11-16 美商豪威科技股份有限公司 Imaging system having four image sensors




Similar Documents

Publication Publication Date Title
CN112710253B (en) Three-dimensional scanner and three-dimensional scanning method
US11528463B2 (en) Method and apparatus for colour imaging a three-dimensional structure
CN112985307B (en) Three-dimensional scanner, system and three-dimensional reconstruction method
CN109489583B (en) Projection device, acquisition device and three-dimensional scanning system with same
US10782126B2 (en) Three-dimensional scanning method containing multiple lasers with different wavelengths and scanner
CN112712583B (en) Three-dimensional scanner, three-dimensional scanning system, and three-dimensional scanning method
KR101824328B1 (en) 3-D scanner and the scanning method using the chromatic aberration
WO2021078300A1 (en) Three-dimensional scanner and three-dimensional scanning method
JP3818028B2 (en) 3D image capturing apparatus and 3D image capturing method
CN112930468B (en) Three-dimensional measuring device
US20230320825A1 (en) Method and intraoral scanner for detecting the topography of the surface of a translucent object, in particular a dental object
US20240004175A1 (en) Intraoral scanner with optical system for minimizing stray light
KR100902176B1 (en) 3d scanner using the polygon mirror
RU2543688C2 (en) Camera and optical system for obtaining 3d images (versions)
CN117664026A (en) Three-dimensional contour measurement device and method based on three-color laser
JPH063122A (en) Three-dimensional camera apparatus

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant