EP4050302B1 - Three-dimensional scanner and three-dimensional scanning method - Google Patents

Three-dimensional scanner and three-dimensional scanning method

Info

Publication number
EP4050302B1
Authority
EP
European Patent Office
Prior art keywords
light
stripe
image
camera
target object
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
EP20878731.7A
Other languages
English (en)
French (fr)
Other versions
EP4050302A4 (de)
EP4050302A1 (de)
EP4050302C0 (de)
Inventor
Xiaobo ZHAO
Chao Ma
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shining 3D Technology Co Ltd
Original Assignee
Shining 3D Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from CN201911018729.0A (CN112710253B)
Priority claimed from CN201911018772.7A (CN112712583B)
Application filed by Shining 3D Technology Co Ltd
Publication of EP4050302A1
Publication of EP4050302A4
Application granted
Publication of EP4050302B1
Publication of EP4050302C0

Classifications

    • G01B11/2518: Measuring contours or curvatures by projecting a pattern, e.g. one or more lines or moiré fringes, on the object; projection by scanning of the object
    • G01B11/2527: Projection by scanning of the object with phase change by in-plane movement of the pattern
    • G01B11/2545: Pattern projection with one projection direction and several detection directions, e.g. stereo
    • G01B11/2509: Pattern projection with color coding
    • G01B11/022: Measuring length, width or thickness by means of tv-camera scanning
    • G06T17/30: Three-dimensional [3D] modelling; polynomial surface description
    • G06T7/55: Image analysis; depth or shape recovery from multiple images
    • H04N13/243: Image signal generators using stereoscopic image cameras using three or more 2D image sensors
    • H04N13/25: Image signal generators using stereoscopic image cameras using two or more image sensors with different characteristics other than in their location or field of view, e.g. having different resolutions or colour pickup characteristics
    • H04N13/254: Image signal generators using stereoscopic image cameras in combination with electromagnetic radiation sources for illuminating objects

Definitions

  • the present application relates to the field of three-dimensional scanning, and in particular, to a three-dimensional scanner and a three-dimensional scanning method.
  • US 2014/049535 A1 relates to a method for composing an enhanced color sequence for projecting onto an object to measure the three-dimensional shape of said object.
  • US 2019/254529 A1 relates to a structured light scanning method for structured light scanning of an intra-oral scene.
  • CN 109489583A relates to a projection apparatus.
  • Existing three-dimensional scanners usually perform three-dimensional reconstruction processing in one of the following ways. Firstly, sinusoidal stripes are decoded and matched based on time coding, and then three-dimensional reconstruction and splicing fusion are performed to obtain a three-dimensional shape of an object. Secondly, a three-dimensional shape of an object is obtained by an algorithm that extracts a stripe center line and performs three-dimensional reconstruction and splicing fusion based on time coding. Thirdly, a three-dimensional shape of an object is obtained based on the principle of microscopic confocal three-dimensional imaging.
  • All of the above approaches have defects and are not suitable for the promotion and use of an intra-oral three-dimensional scanning device.
  • The specific defects are as follows. Firstly, it is difficult for a three-dimensional reconstruction method based on time coding to realize compact handheld scanning, so such a method cannot be used in the field of intra-oral three-dimensional scanning.
  • A three-dimensional reconstruction method based on time coding also needs to be supported by a high-frame-rate camera and a high-speed algorithm, so the production cost of three-dimensional scanning equipment is high, which is not conducive to promotion and use.
  • The hardware cost required for three-dimensional reconstruction based on the principle of microscopic confocal three-dimensional imaging is likewise high, which is not conducive to the promotion and use of the three-dimensional scanning equipment either.
  • the present application provides a three-dimensional scanner and a three-dimensional scanning method, which are intended to solve the technical problem that existing three-dimensional reconstruction methods in the related art are high in hardware cost and are thus not conducive to the promotion and use of a three-dimensional scanning device.
  • a three-dimensional scanner includes: an image projection device, configured to project light onto a target object, wherein the light includes predetermined light projected in the form of a color-coded stripe that is formed by coding stripes of at least two colors; and an image acquisition device, configured to acquire light modulated by the target object so as to obtain at least one stripe image in the case where light is projected onto the target object by the image projection device, wherein the obtained stripe image is taken as a coding image to determine respective stripe sequences and as a reconstruction image to perform three-dimensional reconstruction on the target object.
  • a three-dimensional scanning method includes: projecting predetermined light onto a target object in the form of a color-coded stripe; acquiring light modulated by the target object, and obtaining at least one stripe image based on the light, where the obtained stripe image is taken as a coding image to determine respective stripe sequences and as a reconstruction image to perform three-dimensional reconstruction on the target object; determining sequences of respective stripes in the plurality of stripe images based on the coding image; and performing three-dimensional reconstruction on the reconstruction image based on the sequences, and obtaining three-dimensional data of the target object.
  • a three-dimensional scanning method includes: obtaining a first image and a second image, where the first image and the second image are stripe images obtained based on a same beam; determining coding sequences of respective stripes based on the first image; and matching stripes of the second image based on the coding sequences to realize three-dimensional reconstruction so as to obtain three-dimensional data of a target object.
  • The three-dimensional scanning method is applied to the three-dimensional scanner described in any one of the above.
  • The three-dimensional scanner mentioned in the embodiments of the present application obtains a three-dimensional shape of a target object according to a stripe extraction algorithm based on spatial coding. Therefore, the three-dimensional scanner needs only one frame of two-dimensional image to realize three-dimensional reconstruction of a target object, thereby greatly reducing the required frame rate of a camera and the operation cost of the algorithm, and facilitating the promotion and use of the three-dimensional scanner. Specifically, since the three-dimensional scanner does not need a camera with a high frame rate, the volume of the camera required in the three-dimensional scanner can be reduced, thereby making the three-dimensional scanner more suitable for obtaining a three-dimensional shape of an intra-oral object.
  • Since the three-dimensional scanner realizes three-dimensional reconstruction of the target object with as little as one frame of two-dimensional image, the time difference between obtaining a reconstruction image and obtaining a texture image is greatly shortened, and the time required for projecting and photographing in the three-dimensional reconstruction of the target object is reduced; likewise, this makes the three-dimensional scanner more suitable for obtaining the three-dimensional shape of an intra-oral object (facilitating handheld scanning with the three-dimensional scanner).
  • Since the three-dimensional scanner mentioned in the embodiments of the present application obtains a three-dimensional shape of a target object according to a stripe extraction algorithm based on spatial coding, the technical effect of eliminating the requirements of dynamic projection is also achieved.
  • a three-dimensional scanner as described above is provided.
  • the three-dimensional scanner includes: an image projection device, configured to respectively project, in each predetermined period, a predetermined stripe pattern corresponding to the predetermined period onto a target object, wherein stripes of each predetermined stripe pattern are disposed according to arrangement of predetermined color-coded stripes, each predetermined stripe pattern includes stripes of at least one color in the predetermined color-coded stripes, and a plurality of predetermined stripe patterns include stripes of at least two colors in the predetermined color-coded stripes, and the stripes in the predetermined stripe patterns are arranged in the same way as stripes of the same color in the predetermined color-coded stripes; and an image acquisition device, configured to acquire light modulated by the target object so as to obtain a plurality of stripe images in the case where predetermined stripe patterns are projected onto the target object, wherein the obtained stripe images are taken as coding images to determine respective stripe sequences and as reconstruction images to perform three-dimensional reconstruction on the target object.
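  • As an illustrative sketch only (not part of the patent text), the per-period decomposition described above can be mimicked in a few lines; the stripe arrangement and color names are assumed for the example:

```python
# Hypothetical coded arrangement; stripe positions are preserved per pattern.
coded_stripes = ["red", "blue", "blue", "green", "red"]

# One predetermined stripe pattern per color: each pattern keeps only the
# stripes of that color, at the same positions as in the full coded stripe.
patterns = {
    color: [c if c == color else None for c in coded_stripes]
    for color in ("red", "green", "blue")
}
for color, pattern in patterns.items():
    print(color, pattern)
# Projecting the three patterns in successive predetermined periods
# reproduces the arrangement of the predetermined color-coded stripes.
```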
  • a three-dimensional scanning method as described above is provided.
  • the three-dimensional scanning method includes: obtaining a first image and a second image, wherein the first image and the second image are stripe images obtained based on a same light transmitting portion; determining coding sequences of respective stripes based on the first image; and matching stripes of the second image based on the coding sequences to realize three-dimensional reconstruction so as to obtain three-dimensional data of a target object.
  • the present application achieves the technical effects of eliminating the projection requirements of dynamic projection and realizing three-dimensional reconstruction of a target object only with a few two-dimensional images, and solves the technical problem that three-dimensional reconstruction methods in the related art are high in hardware cost and are thus not conducive to the promotion and use of a three-dimensional scanning device.
  • a three-dimensional scanner is provided.
  • the above color-coded stripe may be formed by coding a plurality of pure-color stripes and may also be formed by coding a plurality of non-pure-color stripes.
  • the color-coded stripe formed by coding a plurality of pure-color stripes such as red, green, blue, cyan, magenta, and yellow is preferable.
  • The R, G and B components of each color stripe in the color-coded stripe are preferably 0 or 255, and at most two components are 255 at the same time (see the sketch below).
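  • As a minimal sketch (not part of the patent text), enumerating the stated constraint directly yields exactly the six pure colors named above:

```python
from itertools import product

# Enumerate stripe colors under the stated constraint: each of the R, G
# and B components is 0 or 255, and at most two components are 255 at the
# same time (black, with no lit component, is excluded).
palette = [
    (r, g, b)
    for r, g, b in product((0, 255), repeat=3)
    if 1 <= (r + g + b) // 255 <= 2
]
print(len(palette), palette)
# 6 colors: blue, green, cyan, red, magenta and yellow.
```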
  • the above image projection device 10 includes: a light source emitter 12, a color grating sheet 13 and a first imaging lens 14.
  • the light source emitter 12 is configured to emit light of at least two different bands.
  • the color grating sheet 13 and the first imaging lens 14 are arranged on a transfer path of the light.
  • the light is transmitted through a MASK pattern on the color grating sheet 13.
  • the pattern is projected onto a target object through the first imaging lens 14.
  • Color categories contained in the MASK pattern on the color grating sheet 13 correspond one-to-one to band categories contained in the light transmitted therethrough.
  • Optionally, the above image projection device 10 further includes a beam coupling system 15 and a light bar 16.
  • the beam coupling system 15 and the light bar 16 are arranged on a transfer path of light. At least two beams of light of different bands, emitted from the light source emitter 12, are respectively projected onto the color grating sheet 13 through the beam coupling system 15 and the light bar 16.
  • the image projection device 10 further includes a phase modulation element 17 and a drive motor 18.
  • the phase modulation element is arranged on a transfer path of laser light. After the light source emitter 12 emits at least two beams of laser light of different bands, the phase modulation element located on the transfer path of the laser light performs real-time phase modulation on the laser light.
  • The phase modulation element is driven by the drive motor 18 to rotate at a certain speed around a rotation axis.
  • the phase modulation element may be located in front of the beam coupling system 15 or may also be located behind the beam coupling system 15.
  • the image projection device 10 includes: three laser emitters, two partial-reflection partial-transmission beam splitters, the phase modulation element 17 (and the drive motor 18 connected to the phase modulation element 17), the beam coupling system 15, the light bar 16, the color grating sheet 13, and the first imaging lens 14.
  • the above image projection device 10 may adopt a DLP projector.
  • Light modulated by the target object refers to the predetermined light modulated by the target object according to its own shape, so that the color-coded stripe corresponding to the predetermined light is changed correspondingly based on the shape of the target object.
  • the image acquisition device 20 acquires the changed color-coded stripe to generate at least one stripe image.
  • the image projection device 10 projects light onto a target object.
  • The light includes predetermined light projected in the form of a color-coded stripe that is composed of coded stripes of at least two colors.
  • the image acquisition device 20 acquires light modulated by the target object so as to obtain at least one stripe image in the case where light is projected onto the target object by the image projection device 10.
  • Photosensitive bands of the image acquisition device 20 correspond to stripe colors contained in the color-coded stripe.
  • the image acquisition device can obtain coded stripes of at least two colors in the color-coded stripes.
  • the image projection device is arranged in combination with the image acquisition device. Colors contained in the predetermined light of the image projection device may all be acquired by the image acquisition device.
  • the obtained stripe image is taken as a coding image to determine respective stripe sequences and as a reconstruction image to perform three-dimensional reconstruction on the target object.
  • The three-dimensional scanner mentioned in the embodiments of the present application obtains a three-dimensional shape of a target object according to a stripe extraction algorithm based on spatial coding. Therefore, the three-dimensional scanner needs only one frame of two-dimensional image to realize three-dimensional reconstruction of a target object, thereby greatly reducing the required frame rate of the camera 21 and the operation cost of the algorithm, and facilitating the promotion and use of the three-dimensional scanner. Specifically, since the three-dimensional scanner does not need a camera 21 with a high frame rate, the volume of the camera 21 required in the three-dimensional scanner can be reduced, thereby making the three-dimensional scanner more suitable for obtaining a three-dimensional shape of an intra-oral object.
  • Since the three-dimensional scanner provided by the embodiments of the present application uses color as spatial coding information, the technical effects of facilitating identification of coding information and improving identification accuracy are also achieved.
  • the image acquisition device 20 further includes a plurality of cameras 21.
  • the plurality of cameras 21 include at least one monochrome camera.
  • the image acquisition device 20 processes the light modulated by the target object through the plurality of cameras 21 to obtain a plurality of stripe images.
  • a stripe image obtained by the at least one monochrome camera is taken as a reconstruction image to perform three-dimensional reconstruction on the target object.
  • Stripe images obtained by at least a plurality of monochrome cameras are taken as coding images to determine respective stripe sequences, and/or, a stripe image obtained by at least one color camera is taken as a coding image to determine respective stripe sequences.
  • a pre-designed color-coded stripe image is projected onto a target object (e.g. a tooth or a gum) by the image projection device 10, while the image acquisition device 20 is controlled to rapidly acquire an image of the target object with a projected pattern.
  • the cameras 21 included in the image acquisition device 20 respectively acquire different stripe images.
  • camera A is a color camera and obtains a color stripe image
  • camera B is a monochrome camera and obtains a monochrome stripe image.
  • the color stripe image and the monochrome stripe image are transferred to a computer terminal.
  • the computer terminal takes the color stripe image as coding information and the monochrome stripe image as a reconstruction image so as to obtain a three-dimensional shape of the target object.
  • the image acquisition device 20 includes a plurality of cameras 21.
  • the plurality of cameras 21 include at least one monochrome camera, and a monochrome stripe image with high imaging resolution is taken as a reconstruction image to obtain a three-dimensional shape of the target object.
  • In an example where the camera 21 included in the image acquisition device 20 is a CCD camera, it is assumed that the color-coded stripe corresponding to the predetermined light is formed by coding stripes of two colors (such as red and blue).
  • the image acquisition device 20 obtains different stripe images through different CCD cameras.
  • a stripe image containing red and blue colors is obtained by a color CCD camera
  • a stripe image containing a blue color is obtained by a monochrome CCD camera (a blue filter is arranged in front of the monochrome CCD camera).
  • the stripe image obtained by the color CCD camera is used for identifying and matching sequence codes of respective blue stripes.
  • a three-dimensional reconstruction algorithm and a splicing fusion algorithm are performed according to the obtained sequence codes and the stripe image obtained by the monochrome CCD camera so as to construct a three-dimensional shape of the target object.
  • A light filter 22d of a specified color may be arranged in front of the monochrome CCD camera, which is not specifically limited by the embodiments of the present application.
  • With such a filter, the monochrome CCD camera obtains a stripe image of the specified color only. A stripe image containing only the specified color is more conducive to subsequently performing the three-dimensional reconstruction algorithm and the splicing fusion algorithm to construct a three-dimensional shape of the target object (a simplified decoding sketch follows).
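  • The following sketch is an assumption-laden illustration rather than the patent's algorithm: it reads the sequence codes of blue stripes from one row of a red/blue coding image. The `classify_stripes` helper, its threshold of 128, and the synthetic row are all hypothetical:

```python
import numpy as np

def classify_stripes(color_row):
    """Label each pixel of one image row as red 'R', blue 'B' or
    background '.', using simple channel thresholds (a deliberate
    simplification of real stripe segmentation)."""
    r, b = color_row[:, 0].astype(int), color_row[:, 2].astype(int)
    labels = np.full(len(color_row), ".", dtype="<U1")
    labels[(r > 128) & (b <= 128)] = "R"
    labels[(b > 128) & (r <= 128)] = "B"
    return labels

def stripe_runs(labels):
    """Collapse a per-pixel label row into (label, center_column) runs."""
    runs, start = [], 0
    for i in range(1, len(labels) + 1):
        if i == len(labels) or labels[i] != labels[start]:
            if labels[start] != ".":
                runs.append((labels[start], (start + i - 1) / 2.0))
            start = i
    return runs

# Synthetic coding-image row (color camera): red/blue stripes whose
# arrangement identifies each blue stripe by its sequence code.
color_row = np.zeros((16, 3), np.uint8)
color_row[0:3, 0] = 255    # red stripe
color_row[4:7, 2] = 255    # blue stripe -> sequence code 0
color_row[8:11, 2] = 255   # blue stripe -> sequence code 1
color_row[12:15, 0] = 255  # red stripe
runs = stripe_runs(classify_stripes(color_row))
blue_codes = {center: k for k, (lab, center) in
              enumerate(r for r in runs if r[0] == "B")}
print(blue_codes)  # blue stripe center column -> sequence code
```

  • Stripe centers extracted in the same way from the blue-filtered monochrome image could then be matched, center to center, against `blue_codes` to carry the sequence codes onto the reconstruction image.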
  • photosensitive bands configured by the image acquisition device 20 included in the three-dimensional scanner at least include a plurality of specified bands, and the plurality of specified bands correspond to stripe colors included in the color-coded stripe. That is, in an optional example, the image acquisition device 20 is provided with a color camera capable of acquiring a plurality of stripe colors in the color-coded stripes corresponding to the predetermined light in order to determine respective stripe sequences.
  • The specified band in the present application may be a single specified band or a plurality of specified bands.
  • the three-dimensional scanner may further include an illumination member 30.
  • the illumination member 30 is configured to illuminate the target object so as to acquire a texture image of the target object subsequently.
  • the illumination member 30 is preferably a white LED lamp, so as to realize true-color scanning, i.e. to obtain a three-dimensional model with the same color or basically the same color as the target object.
  • the illumination member 30 may be arranged on the outer periphery of the reflector 40.
  • the illumination member may also be arranged in other parts of the scanner, and is arranged in cooperation with the reflector 40. Illumination light is reflected to the target object through the reflector 40.
  • the illumination member 30 is located on a side of the first imaging lens 14 close to the light source emitter 12, and light projected by the illumination member and the light source emitter 12 may pass through the first imaging lens 14 and may be reflected onto the target object by the reflector 40.
  • the three-dimensional scanner includes a grip portion and an entrance portion arranged at a front end of the grip portion.
  • the image projection device 10 and the image acquisition device 20 are both installed on the grip portion.
  • the reflector 40 is installed on the entrance portion.
  • the illumination member 30 may be installed on the entrance portion or may also be installed on the grip portion.
  • the image acquisition device 20 may identify and determine red light, green light and blue light, so that the image acquisition device 20 may acquire a texture image of the target object under the illumination light.
  • the three-dimensional scanner may further include a timing control circuit.
  • the timing control circuit is connected to the image projection device 10, the illumination member 30 and the image acquisition device 20.
  • the timing control circuit is configured to control the image projection device 10 to project light onto the target object, and synchronously control the image acquisition device 20 to obtain a plurality of stripe images.
  • the timing control circuit is configured to control the illumination member 30 to illuminate the target object, and synchronously control the image acquisition device 20 to obtain a texture image.
  • the timing control circuit is configured to control the image projection device 10 and the illumination member 30 to alternately project light onto the target object.
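  • A rough sketch of the alternation implemented by the timing control circuit is shown below; `projector`, `lamp`, the camera objects and all of their methods are hypothetical placeholders, not a real driver API:

```python
def scan_cycle(projector, lamp, cameras):
    """One projection/illumination cycle per iteration, as a generator.

    Hypothetical driver objects: projector.project()/projector.off(),
    lamp.on()/lamp.off(), and camera.trigger() returning one frame.
    """
    while True:
        lamp.off()
        projector.project()                                 # coded stripes on
        stripe_images = [cam.trigger() for cam in cameras]  # synchronous grab
        projector.off()
        lamp.on()                                           # white light on
        texture_image = cameras[0].trigger()                # texture frame
        yield stripe_images, texture_image
```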
  • the image acquisition device 20 further includes a beam processing device.
  • the beam processing device includes a light input portion and at least two light output portions.
  • the respective cameras 21 correspond to different light output portions.
  • the image acquisition device 20 acquires the light modulated by the target object through the beam processing device.
  • the image acquisition device 20 is provided with the beam processing device so that the plurality of cameras 21 respectively obtain stripe patterns at completely consistent fields of view and angles. That is, the plurality of cameras 21 may receive coaxial light incident from the same second imaging lens 23. The coaxial light is projected onto the above plurality of cameras 21 respectively.
  • image light of the target object enters the light input portion of the beam processing device.
  • the beam processing device separates the image light of the target object so that the image light is emitted out from the at least two light output portions respectively to be projected onto the plurality of cameras 21.
  • stripe images acquired by the plurality of cameras 21 are all stripe images obtained in the same perspective and based on the same modulated color-coded stripe. Stripe sequences in the respective stripe images are correlated based on the same modulated color-coded stripe, thereby facilitating three-dimensional reconstruction of the stripe images by subsequent algorithms.
  • the beam processing device further includes at least one first beam separation unit configured to separate light projected from the light input portion so that the light is projected from the at least two light output portions to the cameras 21 corresponding to the light output portions respectively.
  • the first beam separation unit separates light of each color into light in two directions. For example, a beam of red and blue light is processed by the first beam separation unit to form two beams of red and blue light, which are emitted out in different directions respectively.
  • the beam processing device is provided with at least one first beam separation unit configured to separate light projected from the light input portion so that image light of the target object can be projected from the at least two light output portions respectively and the cameras 21 corresponding to the at least two light output portions can obtain stripe images in the same perspective.
  • the beam processing device further includes at least one second beam separation unit configured to separate light to be obtained by a specified camera so that the specified camera obtains light containing a specified band.
  • In one case, the second beam separation unit separates light of one partial band from the light, and the light of that band is emitted out in one direction. For example, a beam of red and blue light is processed by the second beam separation unit to form a beam of blue light, which is emitted out in one direction.
  • In another case, the second beam separation unit separates light of two partial bands from the light, and the light of the two specified bands is emitted out in different directions respectively.
  • For example, a beam of red and blue light is processed by the second beam separation unit to form a beam of red light and a beam of blue light, which are emitted out in different directions respectively.
  • the color-coded stripe includes a stripe of a color corresponding to the specified band.
  • the specified camera is the monochrome camera.
  • the three-dimensional scanner may further include: a heat dissipation system, a heating anti-fog system, a software algorithm system, etc.
  • the heat dissipation system is configured to prevent damage to the scanner caused by overheating inside the three-dimensional scanning device.
  • the heating anti-fog system is configured to prevent failure to obtain accurate stripe images caused by the fogging phenomenon of each optical instrument in the three-dimensional scanner.
  • the beam processing device includes a partial-reflection partial-transmission prism 22c, and the partial-reflection partial-transmission prism 22c includes a first light output portion and a second light output portion.
  • the beam processing device transmits and reflects light through the partial-reflection partial-transmission prism 22c, and thus separates light projected from the light input portion so that the light is respectively projected from the first light output portion and the second light output portion to the cameras 21 corresponding to the respective light output portions.
  • the image acquisition device 20 further includes a first camera 211 corresponding to the first light output portion, and a second camera 212 corresponding to the second light output portion.
  • the first camera 211 generates a first stripe image based on the acquired light.
  • the second camera 212 generates a second stripe image based on the acquired light.
  • the first stripe image and the second stripe image include identifiable stripes of at least two colors.
  • the beam processing device further includes a light filter 22d.
  • the beam processing device separates light to be obtained by a specified camera through the light filter 22d so that the specified camera obtains light containing a fifth filter band. At least one of the plurality of cameras is the specified camera.
  • the light filter 22d is arranged between the first light output portion and the first camera 211 so that the first camera 211 obtains light of a fifth filter band, and/or, arranged between the second light output portion and the second camera 212 so that the second camera 212 obtains light of a fifth filter band.
  • the color of at least one of stripes of at least two colors included in the second stripe image is the filter color corresponding to the light filter 22d, so that the second stripe image may identify coding sequences of the stripes included in the first stripe image.
  • After the modulated red/green/blue color-coded stripe enters the partial-reflection partial-transmission prism 22c, one copy of the red/green/blue color-coded stripe is transmitted while the other copy is reflected.
  • blue light therein is acquired by the monochrome camera, and the monochrome camera generates a first stripe image including blue stripes.
  • the other red/green/blue color-coded stripe is acquired by the color camera, and the color camera generates a second stripe image including red stripes, green stripes and blue stripes.
  • the respective stripes in the first stripe image correspond to the blue stripes in the second stripe image, and the second stripe image is taken as a coding image.
  • the red stripes, the green stripes and the blue stripes in the second stripe image are all identifiable and determinable, thereby determining coding sequences of the respective stripes in the second stripe image.
  • the first stripe image is taken as a reconstruction image.
  • The respective stripes of the first stripe image may be identified and matched by the coding sequences of the second stripe image to realize three-dimensional reconstruction based on a stripe correspondence between the first stripe image and the second stripe image.
  • the arrangement of the light filter 22d in front of the monochrome camera may also be eliminated.
  • the first stripe image obtained by the monochrome camera includes red stripes, green stripes and blue stripes.
  • Alternatively, a double-color light filter 22d is arranged in front of the monochrome camera so that light of two of the red, green and blue colors passes through and is acquired by the monochrome camera.
  • the light filter 22d may also be arranged in front of the color camera.
  • the color camera generates a second stripe image including red stripes.
  • the blue stripes in the first stripe image correspond to the blue stripes in the red/green/blue color-coded stripe
  • The red stripes in the second stripe image correspond to the red stripes in the red/green/blue color-coded stripe. Since a single-color light filter 22d is arranged in front of the monochrome camera so that only light of one color passes through, the stripes in the first stripe image acquired by the monochrome camera may also be identified and determined. The first stripe image and the second stripe image may be combined to determine coding sequences of the respective stripes. The first stripe image and the second stripe image are both taken as coding images, and the first stripe image is taken as a reconstruction image. Alternatively, a double-color light filter 22d is arranged in front of the color camera.
  • In an example where a red/green light filter 22d is arranged in front of the color camera, the color camera generates a second stripe image including red stripes and green stripes.
  • the first stripe image and the second stripe image are both taken as coding images or only the second stripe image is taken as a coding image, and the first stripe image is taken as a reconstruction image.
  • the image acquisition device 20 can only identify and determine two of red light, green light and blue light. In these embodiments, the image acquisition device 20 cannot completely obtain texture data of the target object under white light. In some embodiments, the image acquisition device 20 can identify and determine red light, green light and blue light, and may completely obtain texture data of the target object under white light, so as to obtain color three-dimensional data.
  • the beam processing device separates light projected from the light input portion by transmitting and reflecting the light through the partial-reflection partial-transmission prism 22c so that the light is respectively projected from the first light output portion and the second light output portion to the cameras corresponding to the respective light output portions. That is, the beam processing device realizes the function corresponding to the first beam separation unit through the partial-reflection partial-transmission prism 22c.
  • the beam processing device separates light to be obtained by a specified camera through the light filter 22d so that the specified camera obtains light containing a specified band. That is, the beam processing device realizes the function corresponding to the second beam separation unit through the light filter 22d.
  • the beam processing device includes a right-angled two-channel dichroic prism 22a, and the right-angled two-channel dichroic prism 22a includes a third light output portion and a fourth light output portion.
  • the beam processing device separates light projected from the light input portion through the right-angled two-channel dichroic prism 22a so that the light is respectively projected from the third light output portion and the fourth light output portion to cameras 21 corresponding to the respective light output portions.
  • the image acquisition device 20 includes a third camera 213 corresponding to the third light output portion, and a fourth camera 214 corresponding to the fourth light output portion.
  • the third camera 213 generates a third stripe image based on the acquired light.
  • the fourth camera 214 generates a fourth stripe image based on the acquired light.
  • the third stripe image and the fourth stripe image both include identifiable stripes of at least two colors.
  • the beam processing device also separates light to be obtained by a specified camera through the right-angled two-channel dichroic prism 22a so that the specified camera obtains light containing a specified band.
  • the operation of obtaining light containing a specified band by the specified camera includes: obtaining light of a first filter band by the third camera 213, and/or obtaining light of a second filter band by the fourth camera 214.
  • Stripes of two colors included in the third stripe image are black stripes and white stripes respectively.
  • The white stripes correspond to stripes in the color-coded stripe whose color is the filter color corresponding to the light filter 22d.
  • the color of at least one of stripes of at least two colors included in the fourth stripe image is the filter color corresponding to the light filter 22d, so that the fourth stripe image may identify coding sequences of the stripes included in the third stripe image.
  • the third camera is a monochrome camera
  • the fourth camera is a color camera.
  • the image projection device 10 projects a red/green/blue color-coded stripe (i.e. color-coded stripe including red stripes, green stripes and blue stripes)
  • The red/green/blue color-coded stripe is projected onto the target object by the image projection device 10, modulated by the target object, and then transferred to the beam processing device.
  • the red/green/blue color-coded stripe is decomposed by the right-angled two-channel dichroic prism 22a into a red/green coded stripe and a blue coded stripe.
  • the blue coded stripe is acquired by the monochrome camera, and the monochrome camera generates a third stripe image including blue stripes.
  • the red/green coded stripe is acquired by the color camera, and the color camera generates a fourth stripe image including red stripes and green stripes.
  • the blue stripes in the third stripe image correspond to the respective stripes in the fourth stripe image.
  • the third stripe image and the fourth stripe image are combined to correspond to the red/green/blue color-coded stripe, and the fourth stripe image is taken as a coding image.
  • the red stripes and the green stripes in the fourth stripe image are all identifiable and determinable, thereby determining coding sequences of the respective stripes in the fourth stripe image.
  • the third stripe image is taken as a reconstruction image.
  • The respective stripes of the third stripe image may be identified and matched by the coding sequences of the fourth stripe image to realize three-dimensional reconstruction based on a stripe correspondence between the third stripe image and the fourth stripe image.
  • the monochrome camera obtains only single-color light. Therefore, the third stripe image may also be identified and determined.
  • the third stripe image may be combined with the fourth stripe image to determine coding sequences of the respective stripes.
  • the third stripe image and the fourth stripe image are both taken as coding images.
  • The light filter 22d may or may not be arranged in the present embodiment.
  • the light filter 22d may be arranged in cooperation with the right-angled two-channel dichroic prism 22a.
  • the beam processing device separates light projected from the light input portion through the right-angled two-channel dichroic prism 22a so that the light is respectively projected from the third light output portion and the fourth light output portion to the cameras 21 corresponding to the respective light output portions. That is, the beam processing device realizes the function corresponding to the first beam separation unit through the right-angled two-channel dichroic prism 22a.
  • the beam processing device also separates light to be obtained by a specified camera through the right-angled two-channel dichroic prism 22a so that the specified camera obtains light containing a specified band. That is, the beam processing device realizes the function corresponding to the second beam separation unit through the right-angled two-channel dichroic prism 22a.
  • the beam processing device includes a three-channel dichroic prism 22b, and the three-channel dichroic prism 22b includes a fifth light output portion, a sixth light output portion, and a seventh light output portion.
  • the beam processing device separates light projected from the light input portion through the three-channel dichroic prism 22b so that the light is respectively projected from the fifth light output portion, the sixth light output portion, and the seventh light output portion to cameras 21 corresponding to the respective light output portions.
  • the image acquisition device 20 includes a fifth camera 215 corresponding to the fifth light output portion, a sixth camera 216 corresponding to the sixth light output portion, and a seventh camera 217 corresponding to the seventh light output portion.
  • the fifth camera 215 generates a fifth stripe image based on the acquired light.
  • the sixth camera 216 generates a sixth stripe image based on the acquired light.
  • the seventh camera 217 generates a seventh stripe image based on the acquired light.
  • the fifth stripe image, the sixth stripe image, and the seventh stripe image include identifiable stripes of at least two colors.
  • the beam processing device separates light to be obtained by a specified camera through the three-channel dichroic prism 22b so that the specified camera obtains light containing a specified band.
  • the operation of obtaining light containing a specified band by the specified camera at least includes: obtaining light of a third filter band by the fifth camera 215, and obtaining light of a fourth filter band by the sixth camera 216, the third filter band being different from the fourth filter band.
  • At least one of the fifth camera, the sixth camera and the seventh camera is a monochrome camera.
  • the fifth camera is a monochrome camera
  • the sixth camera and the seventh camera are color cameras.
  • the fifth camera and the sixth camera are monochrome cameras
  • the seventh camera is a color camera.
  • the above fifth camera 215, sixth camera 216 and seventh camera 217 are all monochrome cameras.
  • Since photosensitive bands of the image acquisition device 20 of the present application correspond one-to-one to stripe colors contained in a color-coded stripe, in the case where the fifth camera 215, the sixth camera 216 and the seventh camera 217 are all monochrome cameras, there are three stripe colors contained in the color-coded stripe. At least two stripe colors have a corresponding relationship with the third filter band and the fourth filter band.
  • the color-coded stripe is composed of red stripes, blue stripes and green stripes.
  • a filter color corresponding to a first filter face may be red
  • a filter color corresponding to a second filter face may be blue.
  • the obtained fifth stripe image is a monochrome stripe image.
  • White stripes correspond to the red stripes in the color-coded stripe.
  • the obtained sixth stripe image is a monochrome stripe image.
  • White stripes correspond to the blue stripes in the color-coded stripe.
  • the color-coded stripe is composed of red stripes, blue stripes and yellow stripes.
  • a filter color corresponding to a first filter face may be red
  • a filter color corresponding to a second filter face may be green.
  • the obtained fifth stripe image is a monochrome stripe image.
  • White stripes correspond to the red stripes and the yellow stripes in the color-coded stripe (in the field of optics, yellow light is formed by combining green light and red light).
  • the obtained sixth stripe image is a monochrome stripe image.
  • White stripes correspond to the yellow stripes in the color-coded stripe (in the field of optics, yellow light is formed by combining green light and red light).
  • the beam processing device also separates light to be obtained by a specified camera through the three-channel dichroic prism 22b so that the seventh camera 217 obtains light of a sixth filter band and the sixth filter band is different from the third filter band and the fourth filter band.
  • the color-coded stripe is composed of red stripes, blue stripes and green stripes.
  • a filter color corresponding to a first filter face may be red
  • a filter color corresponding to a second filter face may be blue
  • a filter color corresponding to a third filter face may be green.
  • the obtained seventh stripe image is a monochrome stripe image.
  • White stripes correspond to the green stripes in the color-coded stripe.
  • any one of the fifth stripe image, the sixth stripe image and the seventh stripe image may be taken as a reconstruction image to perform three-dimensional reconstruction on the target object.
  • the fifth stripe image is taken as a reconstruction image to perform three-dimensional reconstruction on the target object
  • the fifth stripe image, the sixth stripe image and the seventh stripe image are taken together as a coding image to determine the respective stripe sequences.
  • the fifth stripe image, the sixth stripe image and the seventh stripe image are all taken as reconstruction images.
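  • As a small illustrative sketch (with assumed binarized rows, not the patent's data), combining the three monochrome images recovers each stripe's color code, which is why the three images can jointly serve as the coding image:

```python
# Assumed binarized stripe rows from the three monochrome cameras
# (red-, blue- and green-filtered views of the same coded pattern):
# 1 marks a white stripe in that camera's image.
fifth   = [1, 0, 0, 0, 1]  # white where the coded stripe is red
sixth   = [0, 1, 1, 0, 0]  # white where the coded stripe is blue
seventh = [0, 0, 0, 1, 0]  # white where the coded stripe is green

colors = {(1, 0, 0): "red", (0, 1, 0): "blue", (0, 0, 1): "green"}
decoded = [colors[key] for key in zip(fifth, sixth, seventh)]
print(decoded)  # ['red', 'blue', 'blue', 'green', 'red']
```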
  • the beam processing device separates light projected from the light input portion through the three-channel dichroic prism 22b so that the light is respectively projected from the fifth light output portion, the sixth light output portion and the seventh light output portion to the cameras 21 corresponding to the respective light output portions. That is, the beam processing device realizes the function corresponding to the first beam separation unit through the three-channel dichroic prism 22b.
  • the beam processing device also separates light to be obtained by a specified camera through the three-channel dichroic prism 22b so that the specified camera obtains light containing a specified band. That is, the beam processing device realizes the function corresponding to the second beam separation unit through the three-channel dichroic prism 22b.
  • Embodiments I, II and III listed in the present application are all illustrative examples to enable a person skilled in the art to more clearly understand the technical solution of the present application.
  • The present application is not specifically limited thereto. If other specific devices can realize the functions defined for the beam processing device in the present application, such devices may also serve as an executable technical solution of the present application.
  • Embodiments I, II and III listed in the present application may all be combined with reference to each other to realize the functional definition description of the beam processing device in the present application.
  • In such combinations, the beam processing device may additionally realize the function corresponding to the second beam separation unit through the light filter.
  • the three-dimensional scanner provided by the present application has the advantages of low hardware cost, low real-time frame rate requirements, high brightness and large depth of field of an optical system, and device miniaturization. Further, the three-dimensional scanner can directly perform dynamic real-time three-dimensional scanning with color texture on materials characterized by light reflection, transmission and diffusion such as intra-oral teeth and gums.
  • In an embodiment of the present application, a three-dimensional scanning system includes a three-dimensional scanner and an image processor.
  • The three-dimensional scanner included in the three-dimensional scanning system is the above three-dimensional scanner provided by the embodiments of the present application.
  • the image processor is further configured to: take a stripe image obtained by the at least one monochrome camera as a reconstruction image to perform three-dimensional reconstruction on the target object; and take stripe images obtained by at least a plurality of monochrome cameras as coding images to determine respective stripe sequences, and/or, take a stripe image obtained by at least one color camera as a coding image to determine respective stripe sequences.
  • a three-dimensional scanner projects light onto a target object and acquires light modulated by the target object so as to obtain at least one stripe image in the case where light is projected onto the target object.
  • the projected light includes predetermined light projected in the form of a color-coded stripe that is formed by coding stripes of at least two colors.
  • An image processor is connected to the three-dimensional scanner, and is configured to obtain at least one stripe image obtained by the three-dimensional scanner, and take the stripe image as a coding image to determine respective stripe sequences and as a reconstruction image to perform three-dimensional reconstruction on the target object.
  • The three-dimensional scanning system mentioned in the embodiments of the present application obtains a three-dimensional shape of a target object according to a stripe extraction algorithm based on spatial coding. Therefore, the three-dimensional scanning system needs only one frame of two-dimensional image to realize three-dimensional reconstruction of a target object, thereby greatly reducing the required frame rate of a camera and the operation cost of the algorithm, and facilitating the promotion and use of the three-dimensional scanning system. Specifically, since the three-dimensional scanning system does not need a camera with a high frame rate, the volume of the camera required in the three-dimensional scanning system can be reduced, thereby making the three-dimensional scanning system more suitable for obtaining a three-dimensional shape of an intra-oral object.
  • Since the three-dimensional scanning system can realize three-dimensional reconstruction of the target object with as little as one frame of two-dimensional image, the time difference between obtaining a reconstruction image and obtaining a texture image is greatly shortened, and the time required for projecting and photographing in the three-dimensional reconstruction of the target object is reduced; likewise, this makes the three-dimensional scanning system more suitable for obtaining the three-dimensional shape of an intra-oral object (facilitating handheld scanning with the three-dimensional scanning system).
  • Since the three-dimensional scanning system provided by the embodiments of the present application uses color as spatial coding information, the technical effects of facilitating identification of coding information and improving identification accuracy are also achieved.
  • Since the three-dimensional scanning system mentioned in the embodiments of the present application obtains a three-dimensional shape of a target object according to a stripe extraction algorithm based on spatial coding, the technical effect of eliminating the requirements of dynamic projection is also achieved.
  • Embodiments of the present application also provide a three-dimensional scanning method. It should be noted that the three-dimensional scanning method in the embodiments of the present application is applied to the above three-dimensional scanner provided in the embodiments of the present application. The three-dimensional scanning method provided by the embodiments of the present application will be described below.
  • FIG. 7 is a flowchart of a three-dimensional scanning method according to an embodiment of the present application. As shown in FIG. 7, the three-dimensional scanning method includes the following steps.
  • In step S701, predetermined light is projected onto a target object in the form of a color-coded stripe.
  • In step S703, light modulated by the target object is acquired, and at least one stripe image is obtained based on the light.
  • The obtained stripe image is taken as a coding image to determine respective stripe sequences and as a reconstruction image to perform three-dimensional reconstruction on the target object.
  • In step S705, sequences of respective stripes in the plurality of stripe images are determined based on the coding image.
  • In step S707, three-dimensional reconstruction is performed on the reconstruction image based on the sequences, and three-dimensional data of the target object is obtained.
  • predetermined light is projected onto a target object in the form of a color-coded stripe.
  • Light modulated by the target object is acquired, and at least one stripe image is obtained based on the light.
  • the obtained stripe image is taken as a coding image to determine respective stripe sequences and as a reconstruction image to perform three-dimensional reconstruction on the target object.
  • Sequences of respective stripes in the plurality of stripe images are determined based on the coding image.
  • Three-dimensional reconstruction is performed on the reconstruction image based on the sequences, and three-dimensional data of the target object is obtained.
  • The three-dimensional scanning method mentioned in the embodiments of the present application obtains a three-dimensional shape of a target object according to a stripe extraction algorithm based on spatial coding. Therefore, the three-dimensional scanning method needs only one frame of two-dimensional image to realize three-dimensional reconstruction of a target object, thereby greatly reducing the required frame rate of a camera and the operation cost of the algorithm, and facilitating the promotion and use of the three-dimensional scanning method. Specifically, since the three-dimensional scanning method does not need a camera with a high frame rate, the volume of the camera required for the three-dimensional scanning method can be reduced, thereby making the three-dimensional scanning method more suitable for obtaining a three-dimensional shape of an intra-oral object.
  • Since the three-dimensional scanning method can realize three-dimensional reconstruction of the target object with as little as one frame of two-dimensional image, the time difference between obtaining a reconstruction image and obtaining a texture image is greatly shortened, and the time required for projecting and photographing in the three-dimensional reconstruction of the target object is reduced; likewise, this makes the three-dimensional scanning method more suitable for obtaining the three-dimensional shape of an intra-oral object (facilitating handheld scanning).
  • Since the three-dimensional scanning method provided by the embodiments of the present application uses color as spatial coding information, the technical effects of facilitating identification of coding information and improving identification accuracy are also achieved.
  • Since the three-dimensional scanning method mentioned in the embodiments of the present application obtains a three-dimensional shape of a target object according to a stripe extraction algorithm based on spatial coding, the technical effect of eliminating the requirements of dynamic projection is also achieved.
  • the three-dimensional scanning method further includes: obtaining texture data of the target object, and obtaining color three-dimensional data of the target object based on the three-dimensional data and the texture data of the target object.
  • the three-dimensional scanning method further includes: projecting illumination light onto a target object, and obtaining texture data of the target object based on the illumination light; and obtaining color three-dimensional data of the target object based on the three-dimensional data and the texture data of the target object.
  • Texture data is obtained by a single camera, or synthesized from data obtained by a plurality of cameras.
  • In step S703, light modulated by the target object is acquired, and at least two stripe images are obtained based on the same light. At least one of the stripe images is obtained by a monochrome camera.
  • the obtained stripe images are taken as a coding image to determine respective stripe sequences and as a reconstruction image to perform three-dimensional reconstruction on the target object.
  • the stripe image obtained by the monochrome camera is taken as the reconstruction image.
  • In step S705, sequences of respective stripes in a plurality of stripe images are determined based on the coding image, and a coding sequence is determined based on arrangement information and color information of the respective stripes in the coding image. For example, if four stripes arranged in red, green, green and red are coded and decoded by red (1, 0) and green (0, 1), the coding sequence thereof is (1, 0) (0, 1) (0, 1) (1, 0).
  • For another example, if five stripes arranged in red, blue, blue, green and red are coded by red (1, 0, 0), green (0, 1, 0) and blue (0, 0, 1), the coding sequence thereof is (1, 0, 0), (0, 0, 1), (0, 0, 1), (0, 1, 0), (1, 0, 0), as illustrated by the sketch below.
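For illustration only, the following minimal sketch shows how such a coding sequence could be derived from the ordered stripe colors of a coding image; the table mirrors the red/green/blue codes above, while the function and variable names are hypothetical.

```python
# Hypothetical sketch: map the left-to-right stripe colors of a coding image
# to code tuples. The table mirrors the red/green/blue example above.
COLOR_CODES = {
    "red":   (1, 0, 0),
    "green": (0, 1, 0),
    "blue":  (0, 0, 1),
}

def coding_sequence(stripe_colors):
    """Return the coding sequence for an ordered list of stripe colors."""
    return [COLOR_CODES[c] for c in stripe_colors]

# Five stripes arranged red, blue, blue, green, red:
print(coding_sequence(["red", "blue", "blue", "green", "red"]))
# -> [(1, 0, 0), (0, 0, 1), (0, 0, 1), (0, 1, 0), (1, 0, 0)]
```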
  • In step S707, the respective stripes of the reconstruction image are matched based on the coding sequences.
  • stripe matching is performed on reconstruction images of the two image acquisition devices, and point cloud reconstruction is performed after matching, so as to obtain three-dimensional data of a target object.
  • FIG. 8 is a flowchart of a three-dimensional scanning method according to an embodiment of the present application. As shown in FIG. 8 , the three-dimensional scanning method includes the following steps.
  • In step S801, a first image and a second image are obtained.
  • the first image and the second image are stripe images obtained based on a same beam.
  • In step S803, coding sequences of respective stripes are determined based on the first image.
  • In step S805, stripes of the second image are matched based on the coding sequences to realize three-dimensional reconstruction, so as to obtain three-dimensional data of a target object.
  • the three-dimensional scanning method further includes the following steps.
  • In step S807, texture data is obtained, and color three-dimensional data of the target object is obtained based on the three-dimensional data and the texture data.
  • the first image (second image) and the texture data are obtained alternately.
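Purely as an illustration of steps S803 and S805, the sketch below matches the stripes of the second image to the coding sequence decoded from the first image, assuming the second image contains only stripes of one known code (e.g. blue, as seen by a monochrome camera through a filter) and that left-to-right order is preserved by the common optical path. All names are hypothetical.

```python
# Hypothetical sketch of steps S803/S805: the k-th stripe of the second image
# corresponds to the k-th occurrence of its code in the coding sequence.
def match_stripes(coding_sequence, second_image_positions, code=(0, 0, 1)):
    """Pair reconstruction-image stripe positions with sequence indices."""
    indices = [i for i, c in enumerate(coding_sequence) if c == code]
    return list(zip(indices, second_image_positions))

codes = [(1, 0, 0), (0, 0, 1), (0, 0, 1), (0, 1, 0), (1, 0, 0)]
print(match_stripes(codes, [120, 260]))  # -> [(1, 120), (2, 260)]
```

Once each stripe of the second image carries a sequence index, triangulation can proceed on unambiguously identified stripes.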
  • An image projection device projects a red/green/blue color-coded stripe onto a target object at a first moment.
  • the red/green/blue color-coded stripe is modulated by the target object and then transferred to an image processing device.
  • the red/green/blue color-coded stripe is separated into two red/green/blue color-coded stripes through a partial-reflection partial-transmission prism.
  • One of the red/green/blue color-coded stripes is acquired by a color camera, and the color camera generates a corresponding red/green/blue color-coded stripe image.
  • the other red/green/blue color-coded stripe is acquired by a monochrome camera through a blue light filter, and the monochrome camera generates a corresponding blue stripe image.
  • An illumination member emits white light to the target object at a second moment.
  • the white light is reflected by the target object and then acquired by the color camera, and the color camera generates a texture image.
  • a coding sequence of each stripe is determined based on the red/green/blue color-coded stripe image.
  • the respective stripes of the blue stripe image are matched based on the coding sequence.
  • Three-dimensional reconstruction is realized to obtain three-dimensional data of the target object. True-color three-dimensional data of the target object is obtained based on the three-dimensional data and the texture image.
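As a toy illustration of the optical split described above (not the device's actual processing), the partial-reflection partial-transmission prism duplicates the scene and the blue light filter keeps only the blue band, so the monochrome camera's blue stripe image is ideally the blue channel of the same scene captured by the color camera. Array shapes below are illustrative assumptions.

```python
import numpy as np

# Toy model: the mono camera behind a blue filter sees the blue channel of
# the scene that the color camera captures through the same lens.
color_stripe_image = np.random.default_rng(0).random((480, 640, 3))
blue_stripe_image = color_stripe_image[:, :, 2]  # what the mono camera sees
```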
  • An image projection device projects a red/green/blue color-coded stripe to a target object at a first moment.
  • the red/green/blue color-coded stripe is modulated by the target object and then transferred to an image processing device.
  • the red/green/blue color-coded stripe is separated into two red/green/blue color-coded stripes through a partial-reflection partial-transmission prism.
  • One of the red/green/blue color-coded stripes is acquired by a color camera, and the color camera generates a corresponding red/green/blue color-coded stripe image.
  • the image projection device projects a blue coded stripe onto the target object at a third moment.
  • the blue coded stripe is modulated by the target object and then transferred to the image processing device.
  • the blue coded stripe sequentially passes through the partial-reflection partial-transmission prism and a blue light filter and is acquired by a monochrome camera, and the monochrome camera generates a corresponding blue stripe image.
  • the blue coded stripe corresponds to blue stripes in the red/green/blue color-coded stripe.
  • An illumination member emits white light to the target object at a second moment. The white light is reflected by the target object and then acquired by the color camera, and the color camera generates a texture image.
  • a coding sequence of each stripe is determined based on the red/green/blue color-coded stripe image.
  • the respective stripes of the blue stripe image are matched based on the coding sequence.
  • Three-dimensional reconstruction is realized to obtain three-dimensional data of the target object. True-color three-dimensional data of the target object is obtained based on the three-dimensional data and the texture image.
  • An image projection device projects a red/green/blue color-coded stripe onto a target object at a first moment.
  • the red/green/blue color-coded stripe is modulated by the target object and then transferred to an image processing device.
  • the red/green/blue color-coded stripe is decomposed into a red/green stripe and a blue stripe by a right-angled two-channel dichroic prism.
  • the red/green stripe is acquired by a color camera, and the color camera generates a corresponding red/green stripe image.
  • the blue stripe is acquired by a monochrome camera, and the monochrome camera generates a corresponding blue stripe image.
  • An illumination member emits white light to the target object at a second moment.
  • the white light is reflected by the target object and then acquired by the color camera and the monochrome camera, the color camera generates a texture image based on red light and green light, and the monochrome camera generates a texture image based on blue light.
  • a coding sequence of each stripe is determined based on the red/green stripe image.
  • the respective stripes of the blue stripe image are matched based on the coding sequence.
  • Three-dimensional reconstruction is realized to obtain three-dimensional data of the target object.
  • a texture image based on white light is synthesized based on the texture image of the color camera and the texture image of the monochrome camera.
  • True-color three-dimensional data of the target object is obtained based on the three-dimensional data and the texture image of the white light.
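A minimal sketch of this synthesis, assuming the two cameras are pixel-aligned through the common optical path of the dichroic prism; array shapes and variable names are illustrative assumptions, not the embodiment's actual data layout.

```python
import numpy as np

# Toy sketch: the color camera contributes the red and green channels of the
# white-light texture image, the monochrome camera contributes the blue one.
red_green_texture = np.zeros((480, 640, 2))  # from the color camera (R, G)
blue_texture = np.zeros((480, 640))          # from the monochrome camera (B)
white_texture = np.dstack([red_green_texture[:, :, 0],
                           red_green_texture[:, :, 1],
                           blue_texture])
```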
  • An image projection device projects a red/green/blue color-coded stripe onto a target object at a first moment.
  • the red/green/blue color-coded stripe is modulated by the target object and then transferred to an image processing device.
  • the red/green/blue color-coded stripe is decomposed into a red stripe, a green stripe and a blue stripe by a three-channel dichroic prism.
  • the red stripe is acquired by a first monochrome camera, and the first monochrome camera generates a corresponding red stripe image.
  • the green stripe is acquired by a second monochrome camera, and the second monochrome camera generates a corresponding green stripe image.
  • the blue stripe is acquired by a third monochrome camera, and the third monochrome camera generates a corresponding blue stripe image.
  • An illumination member emits white light to the target object at a second moment. The white light is reflected by the target object and then acquired by the three monochrome cameras.
  • the first monochrome camera generates a texture image based on red light
  • the second monochrome camera generates a texture image based on green light
  • the third monochrome camera generates a texture image based on blue light.
  • a coding sequence of each stripe is determined based on the combination of the red stripe image, the green stripe image and the blue stripe image. The respective stripes of the red stripe image, the green stripe image and the blue stripe image are matched based on the coding sequence.
  • Three-dimensional reconstruction is realized to obtain three-dimensional data of the target object.
  • a texture image based on white light is synthesized based on the texture images of the three monochrome cameras.
  • True-color three-dimensional data of the target object is obtained based on the three-dimensional data and the texture image of the white light.
  • An image projection device projects a green/blue color-coded stripe onto a target object at a first moment.
  • the green/blue color-coded stripe is modulated by the target object and then transferred to an image processing device.
  • the green/blue color-coded stripe is decomposed into a green stripe and a blue stripe by a three-channel dichroic prism.
  • the green stripe is acquired by a second monochrome camera, and the second monochrome camera generates a corresponding green stripe image.
  • the blue stripe is acquired by a third monochrome camera, and the third monochrome camera generates a corresponding blue stripe image.
  • An illumination member emits white light to the target object at a second moment. The white light is reflected by the target object and then acquired by the three monochrome cameras.
  • the first monochrome camera generates a texture image based on red light
  • the second monochrome camera generates a texture image based on green light
  • the third monochrome camera generates a texture image based on blue light.
  • a coding sequence of each stripe is determined based on the combination of the green stripe image and the blue stripe image.
  • the respective stripes of the green stripe image and the blue stripe image are matched based on the coding sequence.
  • Three-dimensional reconstruction is realized to obtain three-dimensional data of the target object.
  • a texture image based on white light is synthesized based on the texture images of the three monochrome cameras.
  • True-color three-dimensional data of the target object is obtained based on the three-dimensional data and the texture image of the white light.
  • Embodiments of the present invention provide a storage medium having, stored thereon, a program which, when executed by a processor, implements the three-dimensional scanning method.
  • Embodiments of the present invention provide a processor for running a program.
  • the program when run, performs the three-dimensional scanning method.
  • a three-dimensional scanner is provided.
  • FIG. 9 is a schematic diagram of a three-dimensional scanner according to an embodiment of the present application. As shown in FIG. 9 , the three-dimensional scanner includes an image projection device 10 and an image acquisition device 20.
  • the image projection device 10 is configured to respectively project, in each predetermined period, a predetermined stripe pattern corresponding to the predetermined period onto a target object. Stripes of each predetermined stripe pattern are disposed according to arrangement of predetermined color-coded stripes. Each predetermined stripe pattern includes stripes of at least one color in the predetermined color-coded stripes, and a plurality of predetermined stripe patterns include stripes of at least two colors in the predetermined color-coded stripes. The stripes in the predetermined stripe patterns are arranged in the same way as stripes of the same color in the predetermined color-coded stripes.
  • The operation of projecting, in each predetermined period, a predetermined stripe pattern corresponding to the predetermined period onto the target object may be implemented as follows: the image projection device 10 periodically projects predetermined stripe patterns.
  • the image projection device 10 projects a plurality of predetermined stripe patterns in each predetermined period.
  • the plurality of predetermined stripe patterns are projected at different time periods.
  • the image projection device 10 projects a first predetermined stripe pattern at a first time period and a second predetermined stripe pattern at a second time period.
  • the image acquisition device 20 acquires the first predetermined stripe pattern at the first time period and the second predetermined stripe pattern at the second time period.
  • the image acquisition device 20 repeats this process until the scanning of the target object is completed.
  • the image projection device 10 further includes a DLP projection portion 11.
  • the image projection device 10 respectively projects, in each predetermined period, a plurality of predetermined stripe patterns corresponding to the predetermined period onto the target object through the DLP projection portion 11.
  • the image projection device 10 may realize the function through the DLP projection portion 11.
  • the DLP projection portion 11 respectively projects, in each predetermined period, a plurality of predetermined stripe patterns corresponding to the predetermined period onto a target object.
  • Stripes of each predetermined stripe pattern are disposed according to arrangement of predetermined color-coded stripes.
  • Each predetermined stripe pattern includes stripes of at least one color in the predetermined color-coded stripes, and the plurality of predetermined stripe patterns include stripes of at least two colors in the predetermined color-coded stripes.
  • the stripes in the predetermined stripe patterns are arranged in the same way as stripes of the same color in the predetermined color-coded stripes.
  • the image projection device 10 further includes: a light emitting portion 12, configured to respectively emit, in each predetermined period, a plurality of beams of initial light corresponding to the predetermined period, where each beam of the initial light is composed of light of at least one stripe color, and the stripe color is the color of stripes in the predetermined color-coded stripes; and a light transmitting portion 13, arranged on a transfer path of the initial light, where, after each beam of the initial light is transmitted through the pattern of predetermined color-coded stripes on the light transmitting portion 13, the corresponding predetermined stripe pattern is generated and projected onto the target object, with stripes in the predetermined stripe patterns arranged in the same way as stripes of the same color in the predetermined color-coded stripes.
  • the predetermined color-coded stripe is a predetermined arrangement standard for respective color stripes.
  • predetermined stripe patterns complying with the predetermined arrangement standard for respective color stripes may be directly projected through the DLP projection portion 11.
  • the light transmitting portion 13 may be taken as a carrier of the predetermined arrangement standard for respective color stripes, i.e. the light transmitting portion 13 determines the predetermined arrangement standard for respective color stripes, and initial light passes through the light transmitting portion and then generates predetermined stripe patterns arranged according to the predetermined arrangement standard for respective color stripes.
  • the image projection device 10 may realize the function through the light emitting portion 12 and the light transmitting portion 13.
  • the three-dimensional scanner may form different predetermined stripe patterns by means of transmission projection and project the predetermined stripe patterns onto the target object, and stripes of each of the generated predetermined stripe patterns are disposed according to arrangement of predetermined color-coded stripes on the light transmitting portion 13.
  • Each predetermined stripe pattern includes stripes of at least one color in the predetermined color-coded stripes, and a plurality of predetermined stripe patterns include stripes of at least two colors in the predetermined color-coded stripes.
  • the stripes in the predetermined stripe patterns are arranged in the same way as stripes of the same color in the predetermined color-coded stripes.
  • the light emitting portion 12 further includes a plurality of light source units 121. Bands of light emitted by all the light source units 121 are different.
  • the light emitting portion 12 emits the initial light through the plurality of light source units 121.
  • the initial light may be single-band light emitted by only a single light source unit 121, or may be multi-band light emitted simultaneously by a plurality of light source units 121.
  • the light emitting portion 12 includes three light source units 121, and bands of light emitted by all the light source units 121 are different.
  • the first light source unit 121 emits light of a band of 605-700 nm, i.e. red light.
  • the second light source unit 121 emits light of a band of 435-480 nm, i.e. blue light.
  • the third light source unit 121 emits light of a band of 500-560 nm, i.e. green light.
  • the first light source unit 121 emits light of a band of 605-700 nm at a time period A of a predetermined period.
  • the second light source unit 121 emits light of a band of 435-480 nm at a time period B of the predetermined period.
  • the first light source unit 121 emits light of a band of 605-700 nm at a time period C of the predetermined period, the second light source unit 121 emits light of a band of 435-480 nm, and meanwhile, the third light source unit 121 emits light of a band of 500-560 nm.
  • the first light source unit 121 emits light of a band of 605-700 nm at a time period A of a predetermined period.
  • the second light source unit 121 emits light of a band of 435-480 nm at a time period B of the predetermined period.
  • the third light source unit 121 emits light of a band of 500-560 nm at a time period C of the predetermined period.
  • the above settings of the first light source unit 121, the second light source unit 121 and the third light source unit 121 are illustrative examples, and are not specific limitations on the band of light which can be emitted by the light source units 121.
  • the band of light which can be emitted by the light source units 121 may be arbitrarily selected. The present application does not specifically limit that.
  • the above settings of the light source units 121 operated in the predetermined periods A, B and C are illustrative examples, and are not specific limitations on the light source units 121 capable of emitting light in each predetermined period.
  • the light source units 121 which can be started in each predetermined period may be arbitrarily selected. The present application does not specifically limit that.
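As a purely illustrative example of one such selection, the sketch below mirrors the A/B/C listing above (wavelengths in nm); as noted, which sources fire in which time period is freely selectable, and all names are hypothetical.

```python
# Illustrative schedule for the A/B/C example above (wavelength bands in nm).
SCHEDULE = {
    "A": {"source_1": (605, 700)},  # red
    "B": {"source_2": (435, 480)},  # blue
    "C": {"source_3": (500, 560)},  # green
}

for period, sources in SCHEDULE.items():
    print(period, sources)
```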
  • the light source unit 121 may include at least one of an LED light source and a laser emitter.
  • the light source unit 121 may realize the function through the laser emitter and may realize the function through the LED light source.
  • Laser light has the advantages of directed light emission, extremely high brightness, extremely pure color, and good coherence.
  • the light emitting portion 12 further includes a plurality of LED light sources. Bands of light emitted by all the LED light sources are different. The light emitting portion 12 emits the initial light through the plurality of LED light sources.
  • the light emitting portion 12 further includes a plurality of laser emitters. Bands of light emitted by all the laser emitters are different. The light emitting portion 12 emits the initial light through the plurality of laser emitters.
  • the light emitting portion 12 further includes a light aggregation unit.
  • the light aggregation unit is arranged on a transfer path of light emitted from the plurality of light source units 121.
  • the light emitted from the plurality of light source units 121 is aggregated by the light aggregation unit and then projected to the light transmitting portion 13 on the same transfer path.
  • the initial light is a combination of light projected to the light transmitting portion 13 on the same transfer path after being aggregated by the light aggregation unit.
  • the light aggregation unit may realize the function through the partial-reflection partial-transmission prism 22c.
  • the light emitting portion 12 includes three light source units 121, and bands of light emitted by all the light source units 121 are different.
  • a first partial-reflection partial-transmission prism 22c is arranged on light paths of the first light source unit 121 and the second light source unit 121.
  • the first partial-reflection partial-transmission prism 22c is configured to aggregate light emitted by the first light source unit 121 and the second light source unit 121 so that the light is projected onto a second partial-reflection partial-transmission prism 22c.
  • the third light source unit 121 is arranged on a side of the second partial-reflection partial-transmission prism 22c away from the aggregated light.
  • the light emitted from the third light source unit 121 and the aggregated light are aggregated by the second partial-reflection partial-transmission prism 22c to generate a combination of light projected to the light transmitting portion 13 on the same transfer path.
  • the light transmitting portion 13 further includes a grating. Specifically, the light transmitting portion 13 generates a predetermined stripe pattern through the grating for projection onto the target object.
  • the different regions are arranged on the grating, and the different regions correspond to different bands, i.e. different regions may transmit light of different bands.
  • the different regions on the grating determine predetermined color-coded stripes.
  • the respective regions on the grating are arranged in the same way as respective stripes in the predetermined color-coded stripes, and the bands corresponding to the respective regions correspond to stripe colors corresponding to the stripes arranged in the same way.
  • the grating includes a first region for transmitting light of a first band and a second region for transmitting light of a second band.
  • the light of the first band passes through the grating and forms stripes of the first band arranged in the same way as the first region.
  • the light of the second band passes through the grating and forms stripes of the second band arranged in the same way as the second region.
  • the light emitting portion 12 emits different beams of initial light at different time periods of a predetermined period. At this moment, when a certain beam of initial light is projected onto the grating, light of various colors is transmitted through the respective regions to form a predetermined stripe pattern.
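The role of the grating can be illustrated with the following toy model; the region layout and band names are hypothetical assumptions. Each region transmits one band, so the visible stripe pattern is exactly the subset of regions whose band is present in the current beam of initial light.

```python
# Toy grating model: regions left to right, each transmitting one band.
GRATING_REGIONS = ["red", "blue", "green", "blue", "red"]

def predetermined_stripe_pattern(initial_light_bands):
    """Return (region_index, color) for regions lit by the initial light."""
    return [(i, band) for i, band in enumerate(GRATING_REGIONS)
            if band in initial_light_bands]

print(predetermined_stripe_pattern({"red"}))            # red-only light
print(predetermined_stripe_pattern({"green", "blue"}))  # green+blue light
```

Under this model, stripes formed by any single band are automatically arranged in the same way as the stripes of that color in the predetermined color-coded stripes.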
  • the light emitting portion 12 may further include a phase modulation unit.
  • the phase modulation unit is arranged on a transfer path of the initial light so that the initial light is projected to the light transmitting portion 13 after diffraction spots are removed by the phase modulation unit.
  • the phase modulation unit may include a phase modulation element and a beam coupling element.
  • the phase modulation element is arranged on the transfer path of the initial light, and the phase modulation element rotates around a predetermined axis.
  • the transfer path of the initial light is parallel to the predetermined axis of the phase modulation element.
  • the beam coupling element is arranged on the transfer path of the initial light for collimating and adjusting the initial light and reducing the divergence angle of the initial light.
  • the phase modulation element may be in any one of the following forms: a thin sheet made of a transparent optical material, a micro-optical element, or a random phase plate.
  • the phase modulation unit further includes a drive motor. The phase modulation element is driven by the drive motor to rotate at a certain speed around a rotation axis.
  • the beam coupling element may be composed of a collimating system and a converging lens, or an optical system having an equivalent function thereto.
  • the phase modulation element may be located in front of the beam coupling element or may also be located behind the beam coupling element.
  • the light emitting portion 12 may further include a solid medium element.
  • the solid medium element is arranged on the transfer path of the initial light. After being reflected and mixed repeatedly by the solid medium element, the initial light is projected to the light transmitting portion 13 in the form of uniform light field intensity.
  • the solid medium element may be in any one of the following forms: an elongated hexahedral prism, a cylindrical prism, and a pyramidal prism.
  • the solid medium element may be a hollow bar for repeatedly reflecting light in a space defined by a solid interface, or a solid bar for repeatedly reflecting light inside a solid transparent medium.
  • Input end and output end faces of the solid bar are each coated with an anti-reflection film, and an internal surface of the hollow bar is coated with a high reflection film.
  • an emergent end face of the solid medium element is parallel to an incident end face of the solid medium element.
  • the three-dimensional scanner further includes a timing control portion.
  • the timing control portion is connected to the image projection device 10 and the image acquisition device 20, and is configured to control the image projection device 10 to respectively emit, in each predetermined period, a predetermined stripe pattern corresponding to the predetermined period, and to control the image acquisition device 20 to respectively acquire light modulated by the target object in a plurality of predetermined periods so as to obtain a stripe image corresponding to each of the predetermined stripe patterns.
  • the three-dimensional scanner controls, through the timing control portion, the image projection device 10 to respectively emit, in each predetermined period, a predetermined stripe pattern corresponding to the predetermined period, and controls the image acquisition device 20 to respectively acquire light modulated by the target object in a plurality of predetermined periods so as to obtain a stripe image corresponding to each of the predetermined stripe patterns.
  • the three-dimensional scanner matches the processes of the image projection device 10 and the image acquisition device 20 through the timing control portion.
  • the three-dimensional scanner further includes a timing control portion.
  • the timing control portion is connected to the plurality of light source units 121 and the image acquisition device 20, and is configured to control the plurality of light source units 121 to respectively emit light in different predetermined periods so as to respectively generate, in each predetermined period, initial light corresponding to the predetermined period, and to control the image acquisition device 20 to respectively acquire light modulated by the target object in a plurality of predetermined periods so as to obtain a stripe image corresponding to each beam of the initial light.
  • the three-dimensional scanner controls, through the timing control portion, the plurality of light source units 121 to respectively emit light in different predetermined periods so as to generate a predetermined stripe pattern corresponding to each predetermined period and projected onto the target object, and to control the image acquisition device 20 to respectively acquire light modulated by the target object in a plurality of predetermined periods so as to obtain a stripe image corresponding to each beam of the initial light.
  • the three-dimensional scanner matches the processes of the plurality of light source units 121 and the image acquisition device 20 through the timing control portion.
  • the three-dimensional scanning device in the present application includes a first timing control portion or a second timing control portion.
  • the first timing control portion is connected to the image projection device 10 and the image acquisition device 20, and is configured to control the image projection device 10 to respectively emit, in each predetermined period, a predetermined stripe pattern corresponding to the predetermined period, and to control the image acquisition device 20 to respectively acquire light modulated by the target object in a plurality of predetermined periods so as to obtain a stripe image corresponding to each of the predetermined stripe patterns.
  • the second timing control portion is connected to the plurality of light source units 121 and the image acquisition device 20, and is configured to control the plurality of light source units 121 to respectively emit light in different predetermined periods so as to respectively generate, in each predetermined period, initial light corresponding to the predetermined period, and to control the image acquisition device 20 to respectively acquire light modulated by the target object in a plurality of predetermined periods so as to obtain a stripe image corresponding to each beam of the initial light.
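A hypothetical sketch of the alternating timing such a control portion might enforce is given below; the event names are illustrative assumptions, and the actual hardware triggering is not specified here.

```python
from itertools import cycle

# Hypothetical timing sketch: stripe projection and white-light illumination
# alternate, with the acquisition device triggered synchronously each period.
def timing_events(n_periods):
    phases = cycle([("project_stripe_pattern", "capture_stripe_image"),
                    ("illuminate_white", "capture_texture_image")])
    for period, (emit, capture) in zip(range(n_periods), phases):
        yield period, emit, capture

for event in timing_events(4):
    print(event)
```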
  • the three-dimensional scanner further includes an illumination member 30.
  • the image acquisition device 20 is further configured to acquire illumination light reflected by the target object to obtain texture data of the target object in the case where the target object is illuminated by the illumination member 30.
  • the image acquisition device 20 may identify and determine red light, blue light and green light, so that the image acquisition device 20 acquires, in the case where illumination light is projected onto a target object by the illumination member 30, a texture image of the target object, and generates a three-dimensional model consistent with (or substantially consistent with) the target object in color through the texture image and the three-dimensional data, i.e. realizing true-color scanning.
  • the above illumination member 30 may be an LED lamp emitting white light. If the image projection device 10 includes a DLP projection portion 11, it is sufficient to project illumination light through the DLP projection portion 11, i.e. the image projection device 10 and the illumination member 30 are an integrated device.
  • the timing control portion is further connected to the illumination member 30 for controlling the illumination member 30 to project illumination light to a target object and controlling the image acquisition device 20 to acquire a texture image of the target object in the case where illumination light is projected onto the target object by the illumination member 30.
  • the timing control portion is configured to control the image projection device 10 and the illumination member 30 to alternately project a predetermined stripe pattern and illumination light onto a target object.
  • the timing control portion is configured to control the image acquisition device 20 to synchronously acquire the predetermined stripe pattern with respect to the image projection device 10 and to control the image acquisition device 20 to synchronously acquire a texture image with respect to the illumination member 30.
  • the timing control portion is configured to control the plurality of light source units 121 and the illumination member 30 to alternately project a predetermined stripe pattern and illumination light onto a target object, and the timing control portion is configured to control the image acquisition device 20 to synchronously acquire the predetermined stripe pattern with respect to the image projection device 10 and to control the image acquisition device 20 to synchronously acquire a texture image with respect to the illumination member 30.
  • the three-dimensional scanner further includes a reflector 40.
  • the reflector 40 is configured to change a transfer path of light.
  • the reflector 40 is arranged on a transfer path of a predetermined stripe pattern. Specifically, the predetermined stripe pattern is reflected onto the target object by the reflector 40, and then is reflected to the image acquisition device 20 after being modulated by the target object. At this moment, the installation constraint of the image projection device 10 and the image acquisition device 20 can be reduced, and the size of space required for the image projection device 10 and the image acquisition device 20 can be reduced.
  • the reflector 40 is arranged on a transfer path of light emitted by the plurality of light source units 121.
  • the reflector 40 is configured to change the transfer path of the light emitted by the plurality of light source units 121 so as to reduce the installation constraint of the plurality of light source units 121 and reduce the size of space required for the plurality of light source units 121.
  • the three-dimensional scanner further includes an illumination member 30 and a reflector 40, and the reflector 40 is arranged on a transfer path of a predetermined stripe pattern, as shown in FIG. 3.
  • the illumination member 30 may be arranged on the outer periphery of the reflector 40, or may also be arranged in other parts of the scanner and arranged in cooperation with the reflector 40. Illumination light is reflected to the target object through the reflector 40.
  • the illumination member 30 is arranged on a side of the first imaging lens 14 close to the light source unit 121, and light projected by the illumination member and the light source unit 121 can pass through the first imaging lens 14 and can be reflected to the target object by the reflector 40.
  • the three-dimensional scanner includes a grip portion and an entrance portion arranged at a front end of the grip portion.
  • the image projection device 10 and the image acquisition device 20 are both installed on the grip portion.
  • the reflector 40 is installed on the entrance portion.
  • the illumination member 30 may be installed on the entrance portion or may also be installed on the grip portion.
  • the image acquisition device 20 is configured to acquire light modulated by the target object so as to obtain a plurality of stripe images in the case where predetermined stripe patterns are projected onto the target object.
  • the obtained stripe images are taken as coding images to determine respective stripe sequences and as reconstruction images to perform three-dimensional reconstruction on the target object, so as to generate three-dimensional data of the target object.
  • the projected predetermined stripe pattern will be mapped on the target object, and the predetermined stripe pattern will be deformed (i.e. modulated) based on the shape of the target object.
  • the image acquisition device 20 acquires the above deformed predetermined stripe pattern, and then obtains a stripe image.
  • the stripe image is used for determining respective stripe sequences and performing three-dimensional reconstruction on the target object.
  • the image acquisition device 20 further includes a plurality of cameras 21.
  • the plurality of cameras 21 include at least one monochrome camera 21.
  • the image acquisition device 20 acquires light modulated by the target object through the plurality of cameras 21 to obtain a plurality of stripe images.
  • a stripe image obtained by the at least one monochrome camera 21 is taken as a reconstruction image to perform three-dimensional reconstruction on the target object.
  • stripe images obtained by a plurality of monochrome cameras 21 are taken as coding images to determine respective stripe sequences, and/or a stripe image obtained by at least one color camera 21 is taken as a coding image to determine respective stripe sequences.
  • the image acquisition device 20 acquires light modulated by the target object through the plurality of cameras 21 so as to obtain a plurality of stripe images, and the above plurality of cameras 21 include at least one monochrome camera 21.
  • a stripe image obtained by the at least one monochrome camera 21 is taken as a reconstruction image to perform three-dimensional reconstruction on the target object.
  • the imaging resolution of the monochrome camera 21 is higher than that of the color camera 21. Therefore, the plurality of cameras 21 include at least one monochrome camera 21, and a stripe image generated by the monochrome camera 21 is used for three-dimensional reconstruction, thereby improving the accuracy of the three-dimensional reconstruction of the target object.
  • the operation of taking a stripe image obtained by the at least one monochrome camera 21 as a reconstruction image to perform three-dimensional reconstruction on the target object includes: taking a stripe image obtained by one monochrome camera 21 as a reconstruction image to perform three-dimensional reconstruction on the target object; taking stripe images obtained by a plurality of monochrome cameras 21 as reconstruction images to perform three-dimensional reconstruction on the target object; taking stripe images obtained by one monochrome camera 21 and at least one color camera 21 as reconstruction images to perform three-dimensional reconstruction on the target object; and taking stripe images obtained by a plurality of monochrome cameras 21 and at least one color camera 21 as reconstruction images to perform three-dimensional reconstruction on the target object.
  • the operation of taking stripe images obtained by a plurality of monochrome cameras 21 as coding images to determine respective stripe sequences and/or taking a stripe image obtained by at least one color camera 21 as a coding image to determine respective stripe sequences includes: taking stripe images obtained by a plurality of monochrome cameras 21 as coding images to determine respective stripe sequences; taking a stripe image obtained by at least one color camera 21 as a coding image to determine respective stripe sequences; and taking stripe images obtained by at least one color camera 21 and at least one monochrome camera 21 as coding images to determine respective stripe sequences.
  • the stripe information contained in the at least one stripe image taken as a coding image needs to be sufficient to determine the coding sequences of the respective stripes.
  • the coding image is composed of stripe images capable of determining the coding sequences of the respective stripes.
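Purely as an illustrative sketch of this division of roles (the camera layout and function name are hypothetical): monochrome captures serve as reconstruction images, while color captures, or failing that several monochrome captures together, serve as the coding image.

```python
# Illustrative role assignment: `cameras` maps name -> "mono" or "color".
def assign_roles(cameras):
    mono = [name for name, kind in cameras.items() if kind == "mono"]
    color = [name for name, kind in cameras.items() if kind == "color"]
    return {"reconstruction": mono, "coding": color if color else mono}

print(assign_roles({"cam_a": "mono", "cam_b": "color"}))
# -> {'reconstruction': ['cam_a'], 'coding': ['cam_b']}
```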
  • the camera 21 may be a CCD camera or a CMOS camera.
  • the present application does not specifically define the form of the camera, and a person skilled in the art would have been able to make corresponding replacements according to technical requirements.
  • The CCD camera is small in size and light in weight, is not affected by a magnetic field, and has anti-shock and anti-impact properties. Therefore, in the case where the three-dimensional scanner adopts a CCD camera to obtain a stripe image, the volume of the three-dimensional scanner can be reduced accordingly, so that the three-dimensional scanner is convenient for handheld use and can be applied in a small scanning space (e.g. the oral cavity).
  • a pre-designed predetermined stripe image A is projected onto a target object by the image projection device 10 at a time period a of a predetermined period
  • a pre-designed predetermined stripe image B is projected onto the target object at a time period b of the predetermined period
  • the image acquisition device 20 is controlled to rapidly acquire an image of the target object with a predetermined stripe image.
  • the cameras 21 included in the image acquisition device 20 respectively acquire different stripe images.
  • the camera 21 is a color camera 21 for obtaining a color stripe image in the case where a predetermined stripe pattern A is projected onto the target object, and the camera 21 is a monochrome camera 21 for obtaining a monochrome stripe image in the case where a predetermined stripe pattern B is projected onto the target object.
  • the color stripe image and the monochrome stripe image are transferred to a computer terminal.
  • the computer takes the color stripe image as coding information and the monochrome stripe image as a reconstruction image so as to obtain a three-dimensional shape of the target object.
  • the image acquisition device 20 further includes a second imaging lens 23.
  • The second imaging lens 23 corresponds to the light input portion of the beam processing device 22. Light acquired by the image acquisition device 20 passes through the second imaging lens 23, enters the light input portion of the beam processing device 22, and is emitted to the different light output portions of the beam processing device 22.
  • the image acquisition device 20 enables the plurality of cameras 21 to respectively perform imaging based on coaxial light incident from the same second imaging lens 23 by means of the arranged beam processing device 22, i.e. enables stripe patterns respectively obtained by the plurality of cameras 21 to have consistent fields of view and angles.
  • the light input portion of the beam processing device 22 is provided with a second imaging lens 23.
  • the beam processing device 22 includes a plurality of light output portions corresponding to the cameras 21 one by one.
  • the beam processing device 22 performs direction adjustment and/or band separation on light incident therein, so that the respective cameras 21 may respectively perform imaging based on light in the same incident direction and may perform imaging based on light of a specified band.
  • light of the target object enters the light input portion of the beam processing device 22.
  • the beam processing device 22 separates image light of the target object so that the image light is emitted out from the at least two light output portions respectively to be projected onto the plurality of cameras 21.
  • stripe images acquired by the plurality of cameras 21 are all stripe images obtained in the same perspective.
  • the beam processing device 22 separates the received light into light projected in a plurality of directions by the first beam separation unit. For example, a beam of red and blue light is processed by the first beam separation unit to form two beams of red and blue light, which are emitted out in different directions respectively.
  • the beam processing device 22 further includes at least one second beam separation unit configured to separate light to be obtained by a specified camera 21 so that the specified camera 21 obtains light of a specified band.
  • the specified band at least includes: a light band contained in at least one beam of initial light.
  • the beam processing device 22 may separate light of a partial band from the received light by the second beam separation unit. For example, a beam of red and blue light is processed by the second beam separation unit to form a beam of blue light.
  • The first beam separation unit and the second beam separation unit in the present application may be integrated in one physical unit, or each unit may be physically present separately.
  • the first beam separation unit may be a partial-reflection partial-transmission prism 22c.
  • the second beam separation unit may be a light filter 22d.
  • the first beam separation unit and the second beam separation unit may be integrated in a right-angled two-channel dichroic prism 22a.
  • the first beam separation unit and the second beam separation unit may be integrated in a three-channel dichroic prism 22b.
  • a pre-designed predetermined stripe image A is projected onto a target object by the image projection device 10 at a time period a of a predetermined period.
  • the predetermined stripe image A is formed by combining a blue stripe and a green stripe.
  • the second beam separation unit corresponding to the camera 21 separates the light to be obtained by the camera 21, so that the camera 21 can obtain green light and blue light.
  • The number of stripe colors in the reconstruction image is less than the number of stripe colors in the predetermined color-coded stripes, so that the spacing between adjacent stripes is not too small, which solves the problem that too-small spacing prevents accurate matching in the stripe matching process.
  • the reconstruction image is composed of only one color stripe.
  • the reconstruction image is obtained by a monochrome camera 21.
  • the reconstruction image is a monochrome stripe image generated only by blue light, and the blue light has higher anti-interference and higher stability than light of other colors.
  • the stripe extraction algorithm based on spatial coding achieves the technical effects of eliminating the projection requirements of dynamic projection and realizing three-dimensional reconstruction of a target object with only a few two-dimensional images, and solves the technical problem that three-dimensional reconstruction methods in the related art are high in hardware cost and are thus not conducive to the promotion and use of a three-dimensional scanning device.
  • the three-dimensional scanner also improves the accuracy of three-dimensional identification by using colors as spatial coding information.
  • the image acquisition device 20 includes a third camera 213 corresponding to the third light output portion, and a fourth camera 214 corresponding to the fourth light output portion.
  • the third camera 213 generates a third stripe image based on the acquired light.
  • the fourth camera 214 generates a fourth stripe image based on the acquired light.
  • the third stripe image and the fourth stripe image both include identifiable stripes of at least two colors.
  • the beam processing device 22 separates light to be obtained by a specified camera 21 through the right-angled two-channel dichroic prism 22a so that the specified camera 21 obtains light containing a specified band.
  • the operation of obtaining light containing a specified band by the specified camera 21 includes: obtaining light of a third specified band by the third camera 213, and obtaining light of a fourth specified band by the fourth camera 214.
  • the third camera 213 is a monochrome camera 21, and the fourth camera 214 is a color camera 21.
  • the light emitting portion 12 emits red light to the light transmitting portion 13 at a first time period. After the red light is transmitted through the predetermined pattern on the light transmitting portion 13, a first predetermined stripe pattern is generated. The first predetermined stripe pattern is projected onto the target object in the form of red coded stripes. Light is transferred to the image processing device after being modulated by the target object.
  • the right-angled two-channel dichroic prism 22a is a red/green/blue dichroic prism, so that red light is emitted from the third light output portion and green light and blue light are emitted from the fourth light output portion. At this moment, the red coded stripes are emitted from the third light output portion through the right-angled two-channel dichroic prism 22a and acquired by the monochrome camera 21.
  • the monochrome camera 21 generates a third stripe image containing red stripes.
  • the light emitting portion 12 emits green light and blue light to the light transmitting portion 13 at a second time period. After the green light and the blue light are transmitted by a predetermined pattern on the light transmitting portion 13, a second predetermined stripe pattern is generated. The second predetermined stripe pattern is projected onto the target object in the form of green/blue coded stripes. Light is transferred to the image processing device after being modulated by the target object. At this moment, the green/blue coded stripes are emitted from the fourth light output portion through the right-angled two-channel dichroic prism 22a and acquired by the color camera 21. The color camera 21 generates a fourth stripe image containing green stripes and blue stripes.
  • Since the third stripe image and the fourth stripe image both correspond to the same light transmitting portion 13, the respective stripes in the third stripe image and the fourth stripe image correspond to each other.
  • the stripes therein correspond to the predetermined color-coded stripes on the light transmitting portion 13.
  • The third stripe image is taken as a reconstruction image, and the fourth stripe image is taken as a coding image.
  • the fourth stripe image is acquired by the color camera 21, and green stripes and blue stripes in the fourth stripe image may both be identified and determined, thereby determining coding sequences of the respective stripes in the fourth stripe image.
  • the respective stripes of the third stripe image may be identified and matched by coding sequences of fourth stripes to realize three-dimensional reconstruction based on a stripe correspondence between the third stripe image and the fourth stripe image.
  • the monochrome camera 21 obtains only single-color light. Therefore, the third stripe image may also be identified and determined. The third stripe image may be combined with the fourth stripe image to determine coding sequences of the respective stripes. That is, the third stripe image and the fourth stripe image are both taken as coding images.
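The band routing performed by the right-angled two-channel dichroic prism in this example can be illustrated with the following toy model; the port/band table mirrors the red versus green/blue split described above, and all names are hypothetical.

```python
# Toy routing model: in this example the prism sends red out of the third
# light output portion and green/blue out of the fourth.
PORT_OF_BAND = {"red": "third", "green": "fourth", "blue": "fourth"}

def route(bands):
    """Group the incident bands by the output portion they exit from."""
    ports = {}
    for band in bands:
        ports.setdefault(PORT_OF_BAND[band], []).append(band)
    return ports

print(route(["red"]))            # first time period -> monochrome camera
print(route(["green", "blue"]))  # second time period -> color camera
```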
  • the light filter 22d may be arranged or the light filter 22d may not be arranged in the present embodiment.
  • the light filter 22d may be arranged in cooperation with the right-angled two-channel dichroic prism 22a.
  • the beam processing device 22 separates light projected from the light input portion through the right-angled two-channel dichroic prism 22a so that the light is respectively projected from the third light output portion and the fourth light output portion to cameras 21 corresponding to the respective light output portions. That is, the beam processing device 22 realizes the function corresponding to the first beam separation unit through the right-angled two-channel dichroic prism 22a.
  • the beam processing device 22 also separates light to be obtained by a specified camera 21 through the right-angled two-channel dichroic prism 22a so that the specified camera 21 obtains light containing a specified band. That is, the beam processing device 22 realizes the function corresponding to the second beam separation unit through the right-angled two-channel dichroic prism 22a.
  • the right-angled two-channel dichroic prism 22a integrates a first beam separation unit and a second beam separation unit so as to enable light of a specified band to be emitted from a specified direction.
  • the right-angled two-channel dichroic prism 22a enables red and green light to be emitted from the third light output portion and blue light to be emitted from the fourth light output portion.
  • After a beam containing red, green and blue light passes through the right-angled two-channel dichroic prism 22a, the red and green light is separated from the blue light; the red and green light is emitted through the third light output portion, and the blue light is emitted through the fourth light output portion.
  • the beam processing device 22 includes a three-channel dichroic prism 22b, and the three-channel dichroic prism 22b includes a fifth light output portion, a sixth light output portion, and a seventh light output portion.
  • the beam processing device 22 separates light projected from the light input portion through the three-channel dichroic prism 22b so that the light is respectively projected from the fifth light output portion, the sixth light output portion, and the seventh light output portion to cameras 21 corresponding to the respective light output portions.
  • the image acquisition device 20 includes a fifth camera 215 corresponding to the fifth light output portion, a sixth camera 216 corresponding to the sixth light output portion, and a seventh camera 217 corresponding to the seventh light output portion.
  • the fifth camera 215 generates a fifth stripe image based on the acquired light.
  • the sixth camera 216 generates a sixth stripe image based on the acquired light.
  • the seventh camera 217 generates a seventh stripe image based on the acquired light.
  • the fifth stripe image, the sixth stripe image, and the seventh stripe image all include identifiable stripes of at least two colors.
  • The requirement that the fifth stripe image, the sixth stripe image and the seventh stripe image include stripes of at least two colors is intended to enable stripes to be distinguished by color, not to limit which colors are used.
  • the beam processing device 22 separates light to be obtained by a specified camera 21 through the three-channel dichroic prism 22b so that the specified camera 21 obtains light containing a specified band.
  • the operation of obtaining light containing a specified band by the specified camera 21 at least includes: obtaining light of a fifth specified band by the fifth camera 215, and obtaining light of a sixth specified band by the sixth camera 216, the fifth specified band being different from the sixth specified band.
  • At least one of the fifth camera 215, the sixth camera 216 and the seventh camera 217 is a monochrome camera 21.
  • the fifth camera 215 is a monochrome camera 21 and the sixth camera 216 and the seventh camera 217 are color cameras 21.
  • the fifth camera 215 and the sixth camera 216 are monochrome cameras 21 and the seventh camera 217 is a color camera 21.
  • the fifth camera 215, the sixth camera 216 and the seventh camera 217 are all monochrome cameras 21.
  • the light emitting portion 12 emits red light to the light transmitting portion 13 at a third time period. After the red light is transmitted through the predetermined color-coded stripes on the light transmitting portion 13, a third predetermined stripe pattern is generated. The third predetermined stripe pattern is projected onto the target object in the form of red coded stripes. Light is transferred to the image processing device after being modulated by the target object.
  • the beam processing device is the three-channel dichroic prism 22b configured to separate red, green and blue colors, so that red light is emitted from the fifth light output portion, green light is emitted from the sixth light output portion, and blue light is emitted from the seventh light output portion.
  • the red coded stripes are decomposed by the three-channel dichroic prism 22b and acquired by the fifth camera 215 through the fifth light output portion.
  • the fifth camera 215 generates a fifth stripe image containing red stripes.
  • the light emitting portion 12 emits blue light to the light transmitting portion 13 at a fourth time period. After the blue light is transmitted through the predetermined pattern on the light transmitting portion 13, a fourth predetermined stripe pattern is generated. The fourth predetermined stripe pattern is projected onto the target object in the form of blue coded stripes. Light is transferred to the image acquisition device 20 after being modulated by the target object. At this moment, the blue coded stripes are decomposed by the three-channel dichroic prism 22b and acquired by the sixth camera 216 through the sixth light output portion. The sixth camera 216 generates a sixth stripe image containing blue stripes.
  • the light emitting portion 12 emits green light to the light transmitting portion 13 at a fifth time period. After the green light is transmitted through the predetermined pattern on the light transmitting portion 13, a fifth predetermined stripe pattern is generated. The fifth predetermined stripe pattern is projected onto the target object in the form of green coded stripes. Light is transferred to the image acquisition device 20 after being modulated by the target object. At this moment, the green coded stripes are decomposed by the three-channel dichroic prism 22b and acquired by the seventh camera 217 through the seventh light output portion. The seventh camera 217 generates a seventh stripe image containing green stripes.
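For illustration, the sequential routine described in the preceding three bullets can be sketched as follows; the schedule table and the emit/capture callables are hypothetical names, not part of the present application:

    # Minimal sketch (hypothetical helper names) of the sequential projection
    # described above: in each time period one color band is emitted, coded by
    # the light transmitting portion 13, and routed by the three-channel
    # dichroic prism 22b to the camera dedicated to that band.
    SCHEDULE = [
        ("third period",  "red",   "fifth camera 215",   "fifth stripe image"),
        ("fourth period", "blue",  "sixth camera 216",   "sixth stripe image"),
        ("fifth period",  "green", "seventh camera 217", "seventh stripe image"),
    ]

    def run_scan_cycle(emit, capture):
        """emit(color) drives the light emitting portion 12; capture(camera)
        reads the camera behind the prism output carrying that band."""
        images = {}
        for period, color, camera, image_name in SCHEDULE:
            emit(color)                           # coded stripes reach the target object
            images[image_name] = capture(camera)  # band-separated light reaches one camera
        return images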
  • the illumination member 30 projects illumination light onto the target object at a ninth time period.
  • the illumination light is transferred to the image acquisition device 20 after being reflected by the target object.
  • Red light in the illumination light is acquired by the fifth camera 215 to generate a fifth texture image.
  • Blue light is acquired by the sixth camera 216 to generate a sixth texture image.
  • Green light is acquired by the seventh camera 217 to generate a seventh texture image.
  • the fifth texture image, the sixth texture image and the seventh texture image are synthesized into a texture image of the target object.
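A minimal sketch of this synthesis step, assuming the three texture images are single-channel arrays of identical size (numpy and all names below are illustrative assumptions, not taken from the present application):

    import numpy as np

    def synthesize_texture(red_tex, green_tex, blue_tex):
        # The fifth camera supplies the red band, the seventh camera the green
        # band and the sixth camera the blue band of the illumination light;
        # stacking the bands yields the color texture image of the target object.
        assert red_tex.shape == green_tex.shape == blue_tex.shape
        return np.stack([red_tex, green_tex, blue_tex], axis=-1)  # H x W x 3 RGB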
  • any stripe image combination determined by the fifth stripe image, the sixth stripe image and the seventh stripe image may be taken as a reconstruction image
  • any stripe image combination determined by the fifth stripe image, the sixth stripe image and the seventh stripe image may be taken as a coding image
  • the fifth stripe image, the sixth stripe image and the seventh stripe image are taken as a coding image together to determine coding sequences of the respective stripes.
  • the fifth stripe image, the sixth stripe image and the seventh stripe image are taken as a reconstruction image together to realize three-dimensional reconstruction.
  • the light filter 22d may or may not be arranged in the present embodiment.
  • the light filter 22d may be arranged in cooperation with the three-channel dichroic prism 22b.
  • the beam processing device 22 separates light projected from the light input portion through the three-channel dichroic prism 22b so that the light is respectively projected from the fifth light output portion, the sixth light output portion and the seventh light output portion to cameras 21 corresponding to the respective light output portions. That is, the beam processing device 22 realizes the function corresponding to the first beam separation unit through the three-channel dichroic prism 22b.
  • the beam processing device 22 also separates light to be obtained by a specified camera 21 through the three-channel dichroic prism 22b so that the specified camera 21 obtains light containing a specified band. That is, the beam processing device 22 realizes the function corresponding to the second beam separation unit through the three-channel dichroic prism 22b.
  • the beam processing device 22 includes a partial-reflection partial-transmission prism 22c, and the partial-reflection partial-transmission prism 22c includes a first light output portion and a second light output portion.
  • the beam processing device 22 separates light projected from the light input portion through the partial-reflection partial-transmission prism 22c so that the light is respectively projected from the first light output portion and the second light output portion to cameras 21 corresponding to the respective light output portions.
  • the image acquisition device 20 includes a first camera 211 corresponding to the first light output portion, and a second camera 212 corresponding to the second light output portion.
  • the first camera 211 generates a first stripe image based on the acquired light.
  • the second camera 212 generates a second stripe image based on the acquired light.
  • the first stripe image and the second stripe image both include identifiable stripes of at least two colors.
  • the requirement that the first stripe image and the second stripe image both include stripes of at least two colors serves to distinguish two stripes by color; it is not a limitation on the colors themselves.
  • the beam processing device 22 further includes a light filter 22d.
  • the beam processing device 22 separates light acquired by a specified camera 21 through the light filter 22d so that the specified camera 21 obtains light containing a specified band, and at least one camera 21 in the plurality of cameras 21 is a specified camera 21.
  • the light filter 22d is arranged between the first light output portion and the first camera 211 so that the first camera 211 obtains light of a first specified band, and/or, arranged between the second light output portion and the second camera 212 so that the second camera 212 obtains light of a second specified band.
  • the first camera 211 is a monochrome camera 21
  • the second camera 212 is a color camera 21
  • the monochrome camera 21 corresponds to the light filter 22d.
  • the light is filtered by a red light filter 22d before being acquired by the monochrome camera 21; that is, the filter color of the light filter 22d arranged in front of a camera 21 corresponds to the color of the beam to be acquired by that camera 21.
  • the light emitting portion 12 emits red light and blue light to the light transmitting portion 13 at a seventh time period. After the red light and the blue light are transmitted through the predetermined pattern on the light transmitting portion 13, a seventh predetermined stripe pattern is generated. The seventh predetermined stripe pattern is projected onto the target object in the form of red/blue coded stripes. Light is transferred to the image acquisition device 20 after being modulated by the target object. At this moment, the red/blue coded stripes are split by the partial-reflection partial-transmission prism 22c into two beams, each containing the red and blue light. At least one beam is acquired by the color camera 21 to generate a second stripe image.
  • the first stripe image and the second stripe image both correspond to the same light transmitting portion 13.
  • respective stripes in the first stripe image and the second stripe image correspond to each other.
  • the first stripe image and the second stripe image correspond to the predetermined patterns on the light transmitting portion 13.
  • the first stripe image is taken as a reconstruction image
  • the second stripe image is taken as a coding image.
  • the second stripe image is acquired by the color camera 21, and red stripes and blue stripes in the second stripe image may both be identified and determined, thereby determining coding sequences of the respective stripes in the second stripe image.
  • the respective stripes of the first stripe image may be identified and matched by coding sequences of the second stripe image to realize three-dimensional reconstruction based on a stripe correspondence between the first stripe image and the second stripe image.
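One way to picture this two-camera decoding, offered as a sketch only (the stripe segmentation step and every name below are assumptions, not the exact algorithm of the present application):

    import numpy as np

    RED, BLUE = (1, 0), (0, 1)  # example binary codes for the two stripe colors

    def classify_stripe_codes(stripe_masks, color_image):
        # For each detected stripe region in the color camera's second stripe
        # image, compare the mean red and blue channel response to assign a code.
        codes = []
        for mask in stripe_masks:
            r = color_image[..., 0][mask].mean()
            b = color_image[..., 2][mask].mean()
            codes.append(RED if r > b else BLUE)
        return codes

    def label_reconstruction_stripes(mono_stripes, codes):
        # Both images come from the same light transmitting portion, so the
        # stripe order matches and the codes transfer one-to-one.
        return list(zip(mono_stripes, codes))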
  • the arrangement of the light filter 22d in front of the monochrome camera 21 is an optional example.
  • the present application does not specifically define whether a light filter 22d is arranged in front of a camera 21; it only requires that stripes of at least two colors in the stripe image obtained by each camera 21 can be identified and determined.
  • the light filter 22d is not arranged in front of the monochrome camera 21, and the first stripe image obtained by the monochrome camera 21 contains red stripes.
  • a blue light filter 22d is arranged in front of the color camera 21, and the second stripe image obtained by the color camera 21 contains blue stripes. Since the light emitting portion 12 emits red light at the sixth time period and emits red light and blue light at the seventh time period, in order to ensure that stripes of at least two colors in the stripe images obtained by the cameras 21 can be identified and determined, a red light filter 22d cannot be arranged in front of the color camera 21; otherwise the stripe images obtained by both the monochrome camera 21 and the color camera 21 would contain only red stripes. Alternatively, a two-color light filter 22d is arranged in front of the color camera 21, and the second stripe image obtained by the color camera 21 then contains red stripes and blue stripes.
  • the projection time interval between the predetermined stripe pattern of each period and the illumination light is very small, thus ensuring that the three-dimensional scanner remains stationary or substantially stationary over this interval and that the predetermined stripe patterns and the illumination light are projected onto (substantially) the same region of the target object.
  • the beam processing device 22 separates light projected from the light input portion by transmitting and reflecting the light through the partial-reflection partial-transmission prism 22c so that the light is respectively projected from the first light output portion and the second light output portion to cameras 21 corresponding to the respective light output portions. That is, the beam processing device 22 realizes the function corresponding to the first beam separation unit through the partial-reflection partial-transmission prism 22c.
  • the beam processing device 22 separates light to be obtained by a specified camera 21 through the light filter 22d so that the specified camera 21 obtains light containing a specified band. That is, the beam processing device 22 realizes the function corresponding to the second beam separation unit through the light filter 22d.
  • Embodiments I, II and III listed in the present application are all illustrative examples to enable a person skilled in the art to more clearly understand the technical solution of the present application.
  • the present application is not specifically limited in this respect; any other specific device that realizes the functional description of the beam processing device 22 in the present application may also serve as an executable technical solution of the present application.
  • Embodiments I, II and III listed in the present application may all be combined with reference to each other to realize the functional definition description of the beam processing device 22 in the present application.
  • when the embodiments are combined, the beam processing device 22 may additionally realize the function corresponding to the second beam separation unit through the light filter 22d.
  • the three-dimensional scanner provided by the present application has the advantages of low hardware cost, low real-time frame rate requirements, high brightness and large depth of field of an optical system, and device miniaturization. Further, the three-dimensional scanner can directly perform dynamic real-time three-dimensional scanning with color texture on materials characterized by light reflection, transmission and diffusion such as intra-oral teeth and gums.
  • a three-dimensional scanning system is also provided.
  • FIG. 13 is a schematic diagram of a three-dimensional scanning system according to an embodiment of the present application. As shown in FIG. 13 , the three-dimensional scanning system includes: a three-dimensional scanner 71 and an image processor 73.
  • the three-dimensional scanner 71 is configured to respectively project, in each predetermined period, a predetermined stripe pattern corresponding to the predetermined period to a target object, and acquire light modulated by the target object so as to obtain a plurality of stripe images in the case where predetermined stripe patterns are projected onto the target object.
  • Stripes of each predetermined stripe pattern are disposed according to arrangement of predetermined color-coded stripes.
  • Each predetermined stripe pattern includes stripes of at least one color in the predetermined color-coded stripes, and a plurality of predetermined stripe patterns include stripes of at least two colors in the predetermined color-coded stripes.
  • the stripes in the predetermined stripe patterns are arranged in the same way as stripes of the same color in the predetermined color-coded stripes.
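The relationship between the predetermined color-coded stripes and the per-period patterns can be made concrete with a small sketch; the list encoding and the color letters are assumptions for illustration only:

    # Hypothetical arrangement of the predetermined color-coded stripes.
    CODED_STRIPES = ["R", "G", "G", "B", "R", "B"]

    def pattern_for_period(emitted_colors):
        # A stripe slot is lit in a period only if its color is emitted in that
        # period; the index order (arrangement) is identical to the color-coded
        # stripes, as required above.
        return [c if c in emitted_colors else None for c in CODED_STRIPES]

    red_pattern      = pattern_for_period({"R"})       # stripes of one color
    red_blue_pattern = pattern_for_period({"R", "B"})  # stripes of two colors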
  • the image processor 73 is connected to the three-dimensional scanner 71, and configured to obtain a plurality of stripe images obtained by the three-dimensional scanner 71, and take the stripe images as coding images to determine respective stripe sequences and as reconstruction images to perform three-dimensional reconstruction on the target object.
  • the three-dimensional scanner 71 is any one three-dimensional scanner provided by the above embodiments.
  • a stripe extraction algorithm based on spatial coding achieves the technical effect that the three-dimensional scanner 71 may perform pattern projection by means of simple transmission projection and realize three-dimensional reconstruction of a target object with only a few two-dimensional images, and solves the technical problem that three-dimensional reconstruction methods in the related art are high in hardware cost and thus not conducive to the promotion and use of a three-dimensional scanning device.
  • the three-dimensional scanning system also improves the accuracy of three-dimensional identification by using colors as spatial coding information.
  • the image processor 73 is further configured to: take a stripe image obtained by the at least one monochrome camera 21 as a reconstruction image to perform three-dimensional reconstruction on the target object; and take stripe images obtained by at least a plurality of monochrome cameras 21 as coding images to determine respective stripe sequences, and/or, take a stripe image obtained by at least one color camera 21 as a coding image to determine respective stripe sequences.
  • a three-dimensional scanning method is also provided.
  • the three-dimensional scanning method in the embodiments of the present application is applied to the above three-dimensional scanner provided in the embodiments of the present application.
  • the three-dimensional scanning method provided by the embodiments of the present application will be described below.
  • FIG. 14 is a flowchart of a three-dimensional scanning method according to an embodiment of the present application. As shown in FIG. 14 , the three-dimensional scanning method includes the following steps.
  • step S1401: in each predetermined period, initial light corresponding to the predetermined period is respectively emitted.
  • Each beam of the initial light is composed of light of at least one color in the predetermined color-coded stripes, and after each beam of the initial light is transmitted through the patterns of the predetermined color-coded stripes on the light transmitting portion 13, the respective corresponding predetermined color stripes are generated and projected onto a target object.
  • step S1403: light modulated by the target object in the plurality of predetermined periods is respectively acquired, and a plurality of stripe images are obtained based on the above light.
  • the obtained stripe images are taken as coding images to determine respective stripe sequences and as reconstruction images to perform three-dimensional reconstruction on the target object.
  • step S1405: sequences of respective stripes in the plurality of stripe images are determined based on the coding image.
  • step S1407: three-dimensional reconstruction is performed on the reconstruction image based on the sequences, and three-dimensional data of the target object is obtained.
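Read together, steps S1401 to S1407 amount to the following loop; this is a structural sketch only, and every callable is a hypothetical placeholder rather than an interface defined by the present application:

    def scan(predetermined_periods, emit_initial_light, acquire, decode, reconstruct):
        stripe_images = []
        for period in predetermined_periods:
            emit_initial_light(period)                # S1401: per-period initial light
            stripe_images.append(acquire(period))     # S1403: modulated light -> stripe image
        sequences = decode(stripe_images)             # S1405: coding images -> stripe sequences
        return reconstruct(stripe_images, sequences)  # S1407: 3D data of the target object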
  • a stripe extraction algorithm based on spatial coding achieves the technical effect that the three-dimensional scanner may perform pattern projection by means of simple transmission projection and realize three-dimensional reconstruction of a target object with only a few two-dimensional images, and solves the technical problem that three-dimensional reconstruction methods in the related art are high in hardware cost and thus not conducive to the promotion and use of a three-dimensional scanning device.
  • the three-dimensional scanning method also achieves the technical effect of improving the accuracy of three-dimensional identification by using colors as spatial coding information.
  • the three-dimensional scanning method further includes: projecting illumination light onto the target object and obtaining texture data of the target object based on the illumination light; and obtaining color three-dimensional data of the target object based on the three-dimensional data and the texture data of the target object.
  • the texture data may be obtained by a single camera 21, or synthesized from data obtained by a plurality of cameras 21.
  • step S803: light modulated by the target object is acquired, and at least two stripe images are obtained based on the light. At least one of the stripe images is obtained by a monochrome camera 21. The stripe image obtained by the monochrome camera 21 is taken as a reconstruction image.
  • step S805: sequences of respective stripes in a plurality of stripe images are determined based on the coding image, and a coding sequence is determined based on arrangement information and color information of the respective stripes in the coding image. For example, if four stripes arranged in red, green, green and red are encoded with red as (1, 0) and green as (0, 1), the coding sequence is (1, 0), (0, 1), (0, 1), (1, 0).
  • similarly, with three colors coded as red (1, 0, 0), green (0, 1, 0) and blue (0, 0, 1), five stripes arranged in red, blue, blue, green and red have the coding sequence (1, 0, 0), (0, 0, 1), (0, 0, 1), (0, 1, 0), (1, 0, 0).
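Both numeric examples transcribe directly into code; the function name is an assumption, while the codes and sequences are exactly those given above:

    CODES_2 = {"red": (1, 0), "green": (0, 1)}
    CODES_3 = {"red": (1, 0, 0), "green": (0, 1, 0), "blue": (0, 0, 1)}

    def coding_sequence(stripe_colors, codes):
        # Map each stripe's color to its code, in stripe arrangement order.
        return [codes[c] for c in stripe_colors]

    assert coding_sequence(["red", "green", "green", "red"], CODES_2) == \
        [(1, 0), (0, 1), (0, 1), (1, 0)]
    assert coding_sequence(["red", "blue", "blue", "green", "red"], CODES_3) == \
        [(1, 0, 0), (0, 0, 1), (0, 0, 1), (0, 1, 0), (1, 0, 0)]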
  • step S807: the respective stripes of the reconstruction image are matched based on the coding sequences.
  • stripe matching is performed on the reconstruction images of the two image acquisition devices 20, and point cloud reconstruction is performed after matching, so as to obtain three-dimensional data of a target object; a minimal triangulation sketch follows.
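For the point cloud step, standard linear triangulation is one possible realization; the present application does not prescribe this method, and the 3x4 projection matrices P1 and P2 are assumed to come from prior calibration of the two image acquisition devices 20:

    import numpy as np

    def triangulate(P1, P2, uv1, uv2):
        # uv1, uv2: matched pixel coordinates of the same stripe point in the
        # two reconstruction images. Solve the homogeneous system A X = 0.
        u1, v1 = uv1
        u2, v2 = uv2
        A = np.stack([
            u1 * P1[2] - P1[0],
            v1 * P1[2] - P1[1],
            u2 * P2[2] - P2[0],
            v2 * P2[2] - P2[1],
        ])
        _, _, Vt = np.linalg.svd(A)
        X = Vt[-1]
        return X[:3] / X[3]  # homogeneous -> Euclidean 3D point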
  • the light emitting portion 12 and the light transmitting portion 13 project red/blue color-coded stripes to a target object at a first time period.
  • the red/blue color-coded stripes are modulated by the target object and then transferred to the image acquisition device 20.
  • Light of the red/blue color-coded stripes is separated into two beams of red/blue color-coded stripe light by the partial-reflection partial-transmission prism 22c.
  • One of the beams of light of red/blue color-coded stripes is acquired by a color camera 21, and the color camera 21 generates a corresponding red/blue color-coded stripe image.
  • the light emitting portion 12 and the light transmitting portion 13 project blue coded stripes to the target object at a second time period.
  • the blue coded stripes are modulated by the target object and then transferred to the image acquisition device 20.
  • Light of the blue coded stripes is separated into two beams of blue coded stripe light by the partial-reflection partial-transmission prism 22c.
  • One of the beams of light of blue coded stripes is acquired by a monochrome camera 21 through a blue light filter 22d, and the monochrome camera 21 generates a corresponding blue stripe image.
  • the illumination member 30 projects white light onto the target object at a third time period.
  • the white light is reflected by the target object and then acquired by the color camera 21.
  • the color camera 21 generates a texture image.
  • a coding sequence of each stripe is determined based on the red/blue color-coded stripe image.
  • the respective stripes of the blue stripe image are matched based on the coding sequence.
  • Three-dimensional reconstruction is realized to obtain three-dimensional data of the target object. True-color three-dimensional data of the target object is obtained based on the three-dimensional data and the texture image.
  • the light emitting portion 12 and the light transmitting portion 13 project red/green color-coded stripes to a target object at a first time period.
  • the red/green color-coded stripes are modulated by the target object and then transferred to the image acquisition device 20.
  • Light of the red/green color-coded stripes is decomposed into one beam of light of red/green color-coded stripes by the right-angled two-channel dichroic prism 22a.
  • the light of red/green color-coded stripes is acquired by a color camera 21, and the color camera 21 generates a corresponding red/green color-coded stripe image.
  • the light emitting portion 12 and the light transmitting portion 13 project blue coded stripes to the target object at a second time period.
  • the blue coded stripes are modulated by the target object and then transferred to the image acquisition device 20.
  • Light of the blue coded stripes is decomposed into one beam of light of blue coded stripes by the right-angled two-channel dichroic prism 22a.
  • the beam of light of blue coded stripes is acquired by a monochrome camera 21, and the monochrome camera 21 generates a corresponding blue stripe image.
  • the illumination member 30 emits white light to the target object at a third time period.
  • the white light is reflected by the target object and then acquired by the color camera 21 and the monochrome camera 21.
  • the color camera 21 generates a texture image based on red light and green light.
  • the monochrome camera 21 generates a texture image based on blue light.
  • a coding sequence of each stripe is determined based on the red/green color-coded stripe image.
  • the respective stripes of the blue stripe image are matched based on the coding sequence.
  • Three-dimensional reconstruction is realized to obtain three-dimensional data of the target object.
  • a texture image based on white light is synthesized from the texture image of the color camera 21 and the texture image of the monochrome camera 21.
  • True-color three-dimensional data of the target object is obtained based on the three-dimensional data and the texture image of the white light.
  • the light emitting portion 12 and the light transmitting portion 13 project red coded stripes to a target object at a first time period.
  • the red coded stripes are modulated by the target object and then transferred to the image acquisition device 20.
  • Light of the red coded stripes is decomposed into a beam of light of red coded stripes by the three-channel dichroic prism 22b.
  • the beam of light of red coded stripes is acquired by a first monochrome camera 21, and the first monochrome camera 21 generates a corresponding red coded stripe image.
  • the light emitting portion 12 and the light transmitting portion 13 project green coded stripes to the target object at a second time period.
  • the green coded stripes are modulated by the target object and then transferred to the image acquisition device 20.
  • Light of the green coded stripes is decomposed into a beam of light of green coded stripes by the three-channel dichroic prism 22b.
  • the beam of light of green coded stripes is acquired by a second monochrome camera 21, and the second monochrome camera 21 generates a corresponding green coded stripe image.
  • the light emitting portion 12 and the light transmitting portion 13 project blue coded stripes to the target object at a third time period.
  • the blue coded stripes are modulated by the target object and then transferred to the image acquisition device 20.
  • Light of the blue coded stripes is decomposed into a beam of light of blue coded stripes by the three-channel dichroic prism 22b.
  • the beam of light of blue coded stripes is acquired by a third monochrome camera 21, and the third monochrome camera 21 generates a corresponding blue coded stripe image.
  • the illumination member 30 emits white light to the target object at a fourth time period.
  • the white light is reflected by the target object and then acquired by the three monochrome cameras 21.
  • the first monochrome camera 21 generates a texture image based on red light
  • the second monochrome camera 21 generates a texture image based on green light
  • the third monochrome camera 21 generates a texture image based on blue light.
  • a coding sequence of each stripe is determined based on the combination of the red stripe image, the green stripe image and the blue stripe image (see the sketch after this workflow).
  • the respective stripes of the red stripe image, the green stripe image and the blue stripe image are matched based on the coding sequence.
  • Three-dimensional reconstruction is realized to obtain three-dimensional data of the target object.
  • a texture image based on white light is synthesized from the texture images of the three monochrome cameras 21.
  • True-color three-dimensional data of the target object is obtained based on the three-dimensional data and the texture image of the white light.
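As referenced in the coding-sequence bullet above, combining the three monochrome stripe images into per-stripe codes can be sketched as follows; the presence flags and the alignment of stripe slots across the three images are assumptions for illustration:

    def combine_codes(red_present, green_present, blue_present):
        # Index i refers to the same stripe slot in the red, green and blue
        # stripe images; a red stripe yields (1, 0, 0), a green stripe
        # (0, 1, 0) and a blue stripe (0, 0, 1).
        return [(int(r), int(g), int(b))
                for r, g, b in zip(red_present, green_present, blue_present)]

    # Example: stripe slots colored red, blue, green, red.
    assert combine_codes([1, 0, 0, 1], [0, 0, 1, 0], [0, 1, 0, 0]) == \
        [(1, 0, 0), (0, 0, 1), (0, 1, 0), (1, 0, 0)]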
  • Embodiments of the present invention provide a storage medium having, stored thereon, a program which, when executed by a processor, implements the three-dimensional scanning method.
  • Embodiments of the present invention provide a processor for running a program.
  • the program when run, performs the three-dimensional scanning method.
  • the embodiments of the present application may be provided as a method, a system, or a computer program product. Therefore, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment combining software and hardware. Moreover, the present application may take the form of a computer program product implemented on one or more computer-usable storage media (including, but not limited to, a disk memory, a CD-ROM, an optical memory, etc.) containing computer-usable program code.


Claims (14)

  1. A three-dimensional scanner, comprising:
    an image projection device (10) configured to project light onto a target object, wherein the light comprises predetermined light projected in the form of a color-coded stripe, the predetermined light being formed by coding at least two color stripes;
    an image acquisition device (20) configured to acquire light modulated by the target object so as to obtain at least one stripe image when light is projected onto the target object by the image projection device (10), wherein the obtained stripe image is used as a coding image to determine respective stripe sequences and as a reconstruction image to perform three-dimensional reconstruction of the target object;
    wherein the image acquisition device (20) further comprises a plurality of cameras (21),
    wherein the plurality of cameras (21) comprise at least a plurality of monochrome cameras, the image acquisition device (20) being configured to acquire the light modulated by the target object through the plurality of cameras (21) so as to obtain a plurality of stripe images, wherein the stripe image obtained by at least one monochrome camera is used as a reconstruction image to perform three-dimensional reconstruction of the target object, and the stripe images obtained by at least a plurality of monochrome cameras are used as coding images to determine respective stripe sequences;
    and/or wherein the plurality of cameras (21) comprise at least one monochrome camera and at least one color camera, wherein the stripe image obtained by the at least one monochrome camera is used as a reconstruction image to perform three-dimensional reconstruction of the target object, and the stripe image obtained by the at least one color camera is used as a coding image to determine respective stripe sequences,
    wherein the stripe image of a specified color is obtained by the monochrome camera.
  2. The three-dimensional scanner according to claim 1, wherein the image acquisition device (20) further comprises a beam processing device (22), the beam processing device (22) comprising a light input portion and at least two light output portions, wherein the respective cameras (21) correspond to different light output portions and the image acquisition device (20) acquires the light modulated by the target object through the beam processing device (22).
  3. The three-dimensional scanner according to claim 2, wherein the beam processing device (22) further comprises at least one first beam separation unit for separating the light projected from the light input portion, so that the light is projected from the at least two light output portions to the cameras (21) respectively corresponding to the light output portions;
    wherein the beam processing device (22) further comprises at least one second beam separation unit for separating the light obtained by a specified camera, so that the specified camera obtains light containing a specified band, the color-coded stripe comprising a stripe of a color corresponding to the specified band.
  4. The three-dimensional scanner according to claim 3, wherein the three-dimensional scanner is arranged in one of the following ways:
    the beam processing device (22) comprises a right-angled two-channel dichroic prism (22a), and the right-angled two-channel dichroic prism (22a) comprises a third light output portion and a fourth light output portion; the beam processing device (22) separates the light projected from the light input portion through the right-angled two-channel dichroic prism (22a), so that the light is respectively projected from the third light output portion and the fourth light output portion to the cameras (21) corresponding to the respective light output portions;
    the image acquisition device (20) comprises a third camera (213) corresponding to the third light output portion and a fourth camera (214) corresponding to the fourth light output portion, wherein the third camera (213) generates a third stripe image based on the acquired light, the fourth camera (214) generates a fourth stripe image based on the acquired light, and the third stripe image and the fourth stripe image comprise identifiable stripes of at least two colors;
    the beam processing device (22) separates the light obtained by a specified camera through the right-angled two-channel dichroic prism (22a), so that the specified camera obtains light containing a specified band, wherein the operation of obtaining light containing a specified band by the specified camera comprises: obtaining light of a first filter band by the third camera (213) and/or obtaining light of a second filter band by the fourth camera (214);
    the beam processing device (22) comprises a three-channel dichroic prism (22b), and the three-channel dichroic prism (22b) comprises a fifth light output portion, a sixth light output portion and a seventh light output portion; the beam processing device (22) separates the light projected from the light input portion through the three-channel dichroic prism (22b), so that the light is respectively projected from the fifth light output portion, the sixth light output portion and the seventh light output portion to the cameras (21) corresponding to the respective light output portions;
    the image acquisition device (20) comprises a fifth camera (215) corresponding to the fifth light output portion, a sixth camera (216) corresponding to the sixth light output portion and a seventh camera (217) corresponding to the seventh light output portion, wherein the fifth camera (215) generates a fifth stripe image based on the acquired light, the sixth camera (216) generates a sixth stripe image based on the acquired light, the seventh camera (217) generates a seventh stripe image based on the acquired light, and the fifth stripe image, the sixth stripe image and the seventh stripe image comprise identifiable stripes of at least two colors;
    the beam processing device (22) separates the light obtained by a specified camera through the three-channel dichroic prism (22b), so that the specified camera obtains light containing a specified band, wherein the operation of obtaining light containing a specified band by the specified camera comprises: obtaining light of a third filter band by the fifth camera (215) and obtaining light of a fourth filter band by the sixth camera (216), the third filter band being different from the fourth filter band;
    the beam processing device (22) comprises a partial-reflection partial-transmission prism (22c), and the partial-reflection partial-transmission prism (22c) comprises a first light output portion and a second light output portion; the beam processing device (22) separates the light projected from the light input portion through the partial-reflection partial-transmission prism (22c), so that the light is respectively projected from the first light output portion and the second light output portion to the cameras (21) corresponding to the respective light output portions;
    the image acquisition device (20) comprises a first camera (211) corresponding to the first light output portion and a second camera (212) corresponding to the second light output portion, wherein the first camera (211) generates a first stripe image based on the acquired light, the second camera (212) generates a second stripe image based on the acquired light, and the first stripe image and the second stripe image comprise identifiable stripes of at least two colors.
  5. The three-dimensional scanner according to claim 1, wherein
    the image projection device (10) is configured to respectively project, in each predetermined period, a predetermined stripe pattern corresponding to the predetermined period onto a target object, wherein stripes of each predetermined stripe pattern are arranged according to predetermined color-coded stripes, each predetermined stripe pattern comprises stripes of at least one color in the predetermined color-coded stripes, a plurality of predetermined stripe patterns comprise stripes of at least two colors in the predetermined color-coded stripes, and the stripes in the predetermined stripe patterns are arranged in the same way as stripes of the same color in the predetermined color-coded stripes;
    the image acquisition device (20) is configured to acquire light modulated by the target object so as to obtain a plurality of stripe images when predetermined stripe patterns are projected onto the target object, wherein the obtained stripe images are used as coding images to determine respective stripe sequences and as reconstruction images to perform three-dimensional reconstruction of the target object.
  6. The three-dimensional scanner according to claim 5, wherein the image projection device (10) further comprises:
    a light emitting portion (12) configured to respectively emit, in each predetermined period, initial light corresponding to the predetermined period, wherein each beam of the initial light is composed of light of at least one stripe color, the stripe color being the color of stripes in the predetermined color-coded stripes;
    a light transmitting portion (13) arranged on a transmission path of the initial light, wherein, after each beam of the initial light is transmitted through patterns of predetermined color-coded stripes on the light transmitting portion (13), respective predetermined color stripes are generated and projected onto a target object.
  7. The three-dimensional scanner according to claim 6, wherein the light emitting portion (12) further comprises a plurality of light source units (121), each of the light source units (121) emitting light of a different band; the light emitting portion (12) emits the initial light through the plurality of light source units (121).
  8. The three-dimensional scanner according to claim 5, further comprising: a timing control portion connected to the image projection device (10) and the image acquisition device (20) and configured to control the image projection device (10) to respectively emit, in each predetermined period, the predetermined stripe pattern corresponding to that predetermined period, and to control the image acquisition device (20) to respectively acquire, in a plurality of predetermined periods, light modulated by the target object so as to obtain a stripe image corresponding to each of the predetermined stripe patterns.
  9. A three-dimensional scanning system, comprising:
    a three-dimensional scanner configured to project light onto a target object and to acquire light modulated by the target object so as to obtain at least one stripe image when light is projected onto the target object, wherein the projected light comprises predetermined light projected in the form of a color-coded stripe formed by coding at least two color stripes,
    an image processor connected to the three-dimensional scanner and configured to obtain at least one stripe image obtained by the three-dimensional scanner and to use the stripe image as a coding image to determine respective stripe sequences and as a reconstruction image to perform three-dimensional reconstruction of the target object,
    wherein the three-dimensional scanner is the three-dimensional scanner according to claim 1.
  10. The three-dimensional scanning system according to claim 9, wherein, when the three-dimensional scanner acquires light modulated by the target object through a plurality of cameras so as to obtain at least one stripe image, and the plurality of cameras comprise at least one monochrome camera, the image processor is further configured to:
    use the stripe image obtained by the at least one monochrome camera as a reconstruction image to perform three-dimensional reconstruction of the target object;
    use the stripe images obtained by at least a plurality of monochrome cameras as coding images to determine respective stripe sequences, and/or use the stripe image obtained by at least one color camera as a coding image to determine respective stripe sequences.
  11. A three-dimensional scanning method applied to the three-dimensional scanner according to claim 1, the three-dimensional scanning method comprising:
    projecting predetermined light in the form of a color-coded stripe onto a target object;
    acquiring light modulated by the target object and obtaining at least one stripe image based on the light, wherein the obtained stripe image is used as a coding image to determine respective stripe sequences and as a reconstruction image to perform three-dimensional reconstruction of the target object;
    determining sequences of respective stripes in the plurality of stripe images based on the coding image;
    performing three-dimensional reconstruction of the reconstruction image based on the sequences and obtaining three-dimensional data of the target object;
    wherein the three-dimensional scanner comprises the image acquisition device (20), the image acquisition device (20) further comprising a plurality of cameras (21),
    wherein the plurality of cameras (21) comprise at least a plurality of monochrome cameras, wherein the image acquisition device (20) acquires the light modulated by the target object through the plurality of cameras (21) to obtain a plurality of stripe images, wherein the stripe image obtained by at least one monochrome camera is used as a reconstruction image to perform three-dimensional reconstruction of the target object, and the stripe images obtained by at least a plurality of monochrome cameras are used as coding images to determine respective stripe sequences;
    and/or wherein the plurality of cameras (21) comprise at least one monochrome camera and at least one color camera, wherein the stripe image obtained by the at least one monochrome camera is used as a reconstruction image to perform three-dimensional reconstruction of the target object, and the stripe image obtained by the at least one color camera is used as a coding image to determine respective stripe sequences;
    wherein the stripe image of a specified color is obtained by the monochrome camera.
  12. The three-dimensional scanning method according to claim 11, the three-dimensional scanning method comprising:
    obtaining a first image and a second image, wherein the first image and the second image are stripe images obtained based on the same beam;
    determining coding sequences of respective stripes based on the first image;
    matching the stripes of the second image based on the coding sequences so as to perform three-dimensional reconstruction and thereby obtain three-dimensional data of a target object.
  13. The three-dimensional scanning method according to claim 11, comprising:
    respectively emitting, in each predetermined period, initial light corresponding to the predetermined period, wherein each beam of the initial light is composed of light of at least one color in the predetermined color-coded stripes, and, after each beam of the initial light is transmitted through patterns of the predetermined color-coded stripes on the light transmitting portion, respective predetermined color stripes are generated and projected onto a target object;
    respectively acquiring light modulated by the target object in the plurality of predetermined periods and obtaining a plurality of stripe images based on the above light, wherein the obtained stripe images are used as coding images to determine respective stripe sequences and as reconstruction images to perform three-dimensional reconstruction of the target object;
    determining sequences of respective stripes in the plurality of stripe images based on the coding images;
    performing three-dimensional reconstruction of the reconstruction images based on the sequences and obtaining three-dimensional data of the target object.
  14. A three-dimensional scanning system, comprising:
    a three-dimensional scanner configured to respectively project, in each predetermined period, a predetermined stripe pattern corresponding to the predetermined period onto a target object and to acquire light modulated by the target object so as to obtain a plurality of stripe images when predetermined stripe patterns are projected onto the target object, wherein stripes of each predetermined stripe pattern are arranged according to predetermined color-coded stripes, each predetermined stripe pattern comprises stripes of at least one color in the predetermined color-coded stripes, a plurality of predetermined stripe patterns comprise stripes of at least two colors in the predetermined color-coded stripes, and the stripes in the predetermined stripe patterns are arranged in the same way as stripes of the same color in the predetermined color-coded stripes;
    an image processor connected to the three-dimensional scanner and configured to obtain a plurality of stripe images obtained by the three-dimensional scanner and to use the stripe images as coding images to determine respective stripe sequences and as reconstruction images to perform three-dimensional reconstruction of the target object,
    wherein the three-dimensional scanner is the three-dimensional scanner according to claim 5.
EP20878731.7A 2019-10-24 2020-10-26 Three-dimensional scanner and three-dimensional scanning method Active EP4050302B1 (de)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
CN201911018729.0A CN112710253B (zh) 2019-10-24 2019-10-24 Three-dimensional scanner and three-dimensional scanning method
CN201911018772.7A CN112712583B (zh) 2019-10-24 2019-10-24 Three-dimensional scanner, three-dimensional scanning system and three-dimensional scanning method
PCT/CN2020/123684 WO2021078300A1 (zh) 2019-10-24 2020-10-26 Three-dimensional scanner and three-dimensional scanning method

Publications (4)

Publication Number Publication Date
EP4050302A1 EP4050302A1 (de) 2022-08-31
EP4050302A4 EP4050302A4 (de) 2023-11-29
EP4050302B1 true EP4050302B1 (de) 2025-09-10
EP4050302C0 EP4050302C0 (de) 2025-09-10

Family

ID=75620409

Family Applications (1)

Application Number Title Priority Date Filing Date
EP20878731.7A Active EP4050302B1 (de) 2019-10-24 2020-10-26 Dreidimensionaler scanner und dreidimensionales scanverfahren

Country Status (8)

Country Link
US (1) US12007224B2 (de)
EP (1) EP4050302B1 (de)
JP (1) JP7298025B2 (de)
KR (1) KR102686393B1 (de)
AU (1) AU2020371142B2 (de)
CA (1) CA3158933A1 (de)
ES (1) ES3043063T3 (de)
WO (1) WO2021078300A1 (de)


Also Published As

Publication number Publication date
AU2020371142A1 (en) 2022-05-26
WO2021078300A1 (zh) 2021-04-29
JP7298025B2 (ja) 2023-06-26
JP2023500820A (ja) 2023-01-11
EP4050302A4 (de) 2023-11-29
US12007224B2 (en) 2024-06-11
US20220364853A1 (en) 2022-11-17
CA3158933A1 (en) 2021-04-29
ES3043063T3 (en) 2025-11-24
AU2020371142B2 (en) 2023-06-08
EP4050302A1 (de) 2022-08-31
EP4050302C0 (de) 2025-09-10
KR20220084402A (ko) 2022-06-21
KR102686393B1 (ko) 2024-07-19


Legal Events

Date Code Title Description
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE

PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE

17P Request for examination filed

Effective date: 20220524

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

DAV Request for validation of the european patent (deleted)
DAX Request for extension of the european patent (deleted)
A4 Supplementary search report drawn up and despatched

Effective date: 20231031

RIC1 Information provided on ipc code assigned before grant

Ipc: G06T 7/50 20170101ALI20231025BHEP

Ipc: A61B 5/103 20060101ALI20231025BHEP

Ipc: G01B 11/00 20060101ALI20231025BHEP

Ipc: G01B 11/25 20060101AFI20231025BHEP

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: EXAMINATION IS IN PROGRESS

17Q First examination report despatched

Effective date: 20240923

GRAP Despatch of communication of intention to grant a patent

Free format text: ORIGINAL CODE: EPIDOSNIGR1

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: GRANT OF PATENT IS INTENDED

INTG Intention to grant announced

Effective date: 20250402

GRAS Grant fee paid

Free format text: ORIGINAL CODE: EPIDOSNIGR3

GRAA (expected) grant

Free format text: ORIGINAL CODE: 0009210

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE PATENT HAS BEEN GRANTED

AK Designated contracting states

Kind code of ref document: B1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

REG Reference to a national code

Ref country code: GB

Ref legal event code: FG4D

REG Reference to a national code

Ref country code: CH

Ref legal event code: EP

REG Reference to a national code

Ref country code: DE

Ref legal event code: R096

Ref document number: 602020058714

Country of ref document: DE

REG Reference to a national code

Ref country code: IE

Ref legal event code: FG4D

U01 Request for unitary effect filed

Effective date: 20251003

U07 Unitary effect registered

Designated state(s): AT BE BG DE DK EE FI FR IT LT LU LV MT NL PT RO SE SI

Effective date: 20251014

REG Reference to a national code

Ref country code: ES

Ref legal event code: FG2A

Ref document number: 3043063

Country of ref document: ES

Kind code of ref document: T3

Effective date: 20251124

U20 Renewal fee for the european patent with unitary effect paid

Year of fee payment: 6

Effective date: 20251030

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: GB

Payment date: 20251029

Year of fee payment: 6

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: NO

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20251210

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: MC

Payment date: 20251029

Year of fee payment: 6

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: HR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20250910

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: GR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20251211

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: PL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20250910

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: RS

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20251210