WO2020038277A1 - Image acquisition and processing methods and devices for three-dimensional scanning, and three-dimensional scanning device - Google Patents

Image acquisition and processing methods and devices for three-dimensional scanning, and three-dimensional scanning device

Info

Publication number
WO2020038277A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
projection
optical system
illumination
light
Prior art date
Application number
PCT/CN2019/100799
Other languages
English (en)
French (fr)
Inventor
马超 (Ma Chao)
赵晓波 (Zhao Xiaobo)
陈晓军 (Chen Xiaojun)
Original Assignee
先临三维科技股份有限公司 (Shining 3D Tech Co., Ltd.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 先临三维科技股份有限公司 (Shining 3D Tech Co., Ltd.)
Priority to US 17/265,509 (US11887321B2)
Priority to EP 19851007.5A (EP3819872A4)
Publication of WO2020038277A1

Classifications

    • G01B 11/2518: Projection by scanning of the object (measuring contours or curvatures by projecting a pattern)
    • G01B 11/25: Measuring contours or curvatures by projecting a pattern, e.g. one or more lines or moiré fringes, on the object
    • G06T 17/00: Three-dimensional [3D] modelling, e.g. data description of 3D objects
    • G06T 7/521: Depth or shape recovery from laser ranging or from the projection of structured light
    • G06T 15/04: Texture mapping
    • G06T 15/50: Lighting effects
    • G06T 15/506: Illumination models
    • G06T 7/248: Analysis of motion using feature-based methods involving reference images or patches
    • G06T 7/33: Determination of transform parameters for the alignment of images (image registration) using feature-based methods
    • G06T 7/40: Analysis of texture
    • G06T 2207/10024: Color image
    • G06T 2207/10028: Range image; depth image; 3D point clouds
    • G06T 2207/10152: Varying illumination
    • G06T 2207/30036: Dental; teeth
    • A61C 9/0046: Data acquisition means or methods for taking digitized dental impressions
    • A61C 9/006: Optical means or methods projecting one or more stripes or patterns on the teeth
    • H04N 25/71: Charge-coupled device [CCD] sensors

Definitions

  • The present disclosure relates to the field of communications, and in particular to an image acquisition method, an image processing method, and devices for three-dimensional scanning, as well as a three-dimensional scanning device.
  • Dynamic three-dimensional scanners mostly rely on the principle of structured-light triangulation: a structured-light pattern must be projected onto the surface of the object for three-dimensional reconstruction, while the texture map is generally captured at a time and position immediately before or after the reconstruction image. There is therefore necessarily a time difference and a position difference between the reconstruction image and the texture image.
  • Embodiments of the present disclosure provide an image acquisition method, an image processing method, and devices for three-dimensional scanning, as well as a three-dimensional scanning device, so as to at least solve the problems in the related art that reducing the time difference and position difference between the reconstruction image and the texture image entails very high production cost, low texture-mapping accuracy, and long processing time.
  • According to one embodiment of the present disclosure, an image acquisition method for three-dimensional scanning is provided, including: synchronously acquiring a projection pattern of light of a first wavelength and an illumination image of light of a second wavelength projected onto a measured object; and acquiring a color texture image of the measured object.
  • Optionally, the projection pattern coincides in position with the illumination image; or, when different acquisition devices are used for the measured object, the positional relationship between the projection pattern and the illumination image is determined according to the positional relationship between the acquisition device of the projection pattern and the acquisition device of the illumination image.
  • the color texture image is obtained by triggering white light to be projected on the measured object.
  • the illumination image is an image used to reflect the texture of the measured object.
  • Optionally, the light of the first wavelength and the light of the second wavelength are lights whose wavelengths do not interfere with each other.
  • According to another embodiment of the present disclosure, an image processing method for three-dimensional scanning is provided, including: determining an image relationship between a color texture image and an illumination image; and, according to the image relationship, matching the color texture image to a projection pattern to perform texture mapping of the three-dimensionally reconstructed data; wherein the projection pattern is obtained by projecting light of a first wavelength onto the measured object, the illumination image is obtained by projecting light of a second wavelength onto the measured object in synchronization with the projection pattern, and the color texture image is obtained by projection onto the measured object.
  • Optionally, the image relationship between the color texture image and the illumination image is determined by at least one of the following: performing SIFT or SURF feature-point extraction and matching on the color texture image and the illumination image, or processing the color texture image and the illumination image with an optical-flow tracking algorithm.
  • the image relationship includes at least one of the following: a conversion relationship between the color texture image and the illumination image, and a mapping relationship between the color texture image and the illumination image.
  • Optionally, the projection pattern, the illumination image, and the color texture image are obtained by: inputting the projected image into one or more color CCD chips, and receiving the image processed by the one or more color CCD chips.
  • According to another embodiment of the present disclosure, an image acquisition device for three-dimensional scanning is provided, including: a first image acquisition module, which synchronously acquires a projection pattern of light of a first wavelength and an illumination image of light of a second wavelength projected onto a measured object; and a second image acquisition module, which acquires a color texture image of the measured object.
  • According to another embodiment of the present disclosure, an image processing apparatus for three-dimensional scanning is provided, including: an image processing module, which determines an image relationship between the color texture image and the illumination image; and a reconstruction module, which, according to the image relationship, matches the color texture image to the projection pattern to perform texture mapping of the three-dimensionally reconstructed data; wherein the projection pattern is obtained by projecting the light of the first wavelength onto the measured object, the illumination image is obtained by projecting the light of the second wavelength onto the measured object in synchronization with the projection pattern, and the color texture image is obtained by projection onto the measured object.
  • According to another embodiment of the present disclosure, a three-dimensional scanning device is provided, including: a timing control circuit, a projection optical system, an auxiliary optical system, an illumination optical system, and an image acquisition optical system, wherein the projection optical system, the auxiliary optical system, and the illumination optical system each include a projection device configured to project light; the image acquisition optical system is configured to acquire the images projected by the projection optical system, the auxiliary optical system, and the illumination optical system onto the object under test; and the timing control circuit is connected to the image acquisition optical system, the projection optical system, the auxiliary optical system, and the illumination optical system, and is configured to perform timing control of the image acquisition optical system, the projection optical system, the auxiliary optical system, and the illumination optical system.
  • Optionally, when the timing control circuit determines that a first time is reached, the projection optical system is triggered to project a pattern of light of the first wavelength onto the object to be measured, and the auxiliary optical system is triggered to simultaneously project light of the second wavelength onto the measured object; the projection pattern of the first-wavelength light and the illumination image of the second-wavelength light are acquired through the image acquisition optical system. When the timing control circuit determines that a second time is reached, the illumination optical system is triggered to project onto the measured object, and a color texture image is acquired through the image acquisition optical system.
  • Optionally, the three-dimensional scanning device is connected to an image processor, or the three-dimensional scanning device includes the image processor, wherein the image processor collects the projection pattern, the illumination image, and the color texture image, determines an image relationship between the color texture image and the illumination image, and, according to the image relationship, matches the color texture image to the projection pattern to perform texture mapping of the three-dimensionally reconstructed data.
  • Optionally, the projection device includes a projection frame and a projection lens connected to the projection frame, wherein the projection frame includes: a light source located at the end of the projection frame far from the projection lens; and a diaphragm located within the projection frame and disposed between the light source and the projection lens, wherein the light source is disposed coaxially with the projection lens.
  • Optionally, among the projection devices of the projection optical system, the auxiliary optical system, and the illumination optical system, at least the projection device of the projection optical system is provided with a projection pattern mask, and the projection pattern mask is located between the diaphragm and the projection lens.
  • Optionally, the projection frame further includes: a condenser lens located at the end of the projection frame away from the projection lens, configured to condense the light of the light source; and heat-insulating glass disposed between the light source and the diaphragm, configured to prevent the heat generated by the light source from diffusing.
  • Optionally, the light sources of the auxiliary optical system and/or the illumination optical system are LEDs.
  • the auxiliary optical system and the illumination optical system are the same optical system.
  • According to another embodiment of the present disclosure, a storage medium is provided, which stores a computer program, wherein the computer program is configured to execute the steps in any one of the foregoing method embodiments when run.
  • According to another embodiment of the present disclosure, an electronic device is provided, including a memory and a processor, wherein the memory stores a computer program, and the processor is configured to run the computer program to execute the steps in any one of the foregoing method embodiments.
  • Through the present disclosure, an auxiliary light source is added at the same time and at the same position, and the projection of the auxiliary light source is used to match the color texture map. This solves the texture-misalignment problem caused in the related art by the time difference and position difference between the three-dimensional reconstruction data and the color texture map, thereby reducing that time difference and position difference and achieving high texture-mapping accuracy, low production cost, and low time consumption.
  • FIG. 1 is a block diagram of a hardware structure of a mobile terminal of an image processing method according to an embodiment of the present disclosure.
  • FIG. 2 is a flowchart of an image acquisition method according to an embodiment of the present disclosure.
  • FIG. 3 is a flowchart of an image processing method according to an embodiment of the present disclosure.
  • FIG. 4 is a structural block diagram of an image acquisition device according to an embodiment of the present disclosure.
  • FIG. 5 is a structural block diagram of an image processing apparatus according to an embodiment of the present disclosure.
  • FIG. 6 is a schematic structural diagram of a three-dimensional scanning device according to an embodiment of the present disclosure.
  • FIG. 7 is a structural block diagram of another three-dimensional scanning device according to an embodiment of the present disclosure.
  • FIG. 8 is a structural block diagram of still another three-dimensional scanning device according to an embodiment of the present disclosure.
  • FIG. 9 is a structural schematic diagram of a three-dimensional scanning device according to an embodiment of the present disclosure.
  • FIG. 10 is a structural schematic diagram of a projection device according to an embodiment of the present disclosure.
  • FIG. 11 is a structural schematic diagram of another projection device according to an embodiment of the present disclosure.
  • FIG. 1 is a block diagram of a hardware structure of a mobile terminal in a three-dimensionally scanned image acquisition method according to an embodiment of the present disclosure.
  • The mobile terminal 10 may include one or more processors 102 (only one is shown in FIG. 1; the processor 102 may include, but is not limited to, a processing device such as a microprocessor (MCU) or a programmable logic device (FPGA)) and a memory 104 for storing data. Optionally, the mobile terminal may further include a transmission device 106 and an input/output device 108 for communication functions.
  • FIG. 1 is only schematic and does not limit the structure of the above mobile terminal.
  • the mobile terminal 10 may further include more or fewer components than those shown in FIG. 1, or have a different configuration from that shown in FIG. 1.
  • the memory 104 may be configured to store a computer program, for example, a software program and module of application software, such as a computer program corresponding to a three-dimensionally scanned image acquisition method in the embodiment of the present disclosure.
  • The processor 102 runs the computer program stored in the memory 104 to execute various functional applications and data processing, that is, to implement the method described above.
  • the memory 104 may include a high-speed random access memory, and may further include a non-volatile memory, such as one or more magnetic storage devices, a flash memory, or other non-volatile solid-state memory.
  • the memory 104 may further include a memory remotely disposed with respect to the processor 102, and these remote memories may be connected to the mobile terminal 10 through a network. Examples of the above network include, but are not limited to, the Internet, an intranet, a local area network, a mobile communication network, and combinations thereof.
  • the transmission device 106 is configured to receive or transmit data via a network.
  • a specific example of the above network may include a wireless network provided by a communication provider of the mobile terminal 10.
  • the transmission device 106 includes a network adapter (Network Interface Controller, NIC for short), which can be connected to other network equipment through a base station so as to communicate with the Internet.
  • the transmission device 106 may be a radio frequency (RF) module, which is used to communicate with the Internet in a wireless manner.
  • FIG. 2 is a flowchart of an image acquisition method for three-dimensional scanning according to an embodiment of the present disclosure. As shown in FIG. 2, the process includes the following steps:
  • Step S202: synchronously acquire the projection pattern of the light of the first wavelength and the illumination image of the light of the second wavelength that are projected onto the measured object.
  • synchronous acquisition refers to acquiring the projection pattern and the illumination image at the same time.
  • the illumination image is an image used to reflect the texture of the measured object.
  • the illumination image described in this embodiment can reflect the texture of the projection object under the second wavelength light, but cannot reflect the true color of the projection object.
  • the above-mentioned first wavelength light and second wavelength light are wavelength lights that do not interfere with each other.
  • the first wavelength light and the second wavelength light may be monochromatic light, for example, the first wavelength light is blue light, and the second wavelength light is red light.
  • Mixed light also falls within the protection scope of this embodiment, as long as it is ensured that the mixed light and the light of the other wavelength do not interfere with each other.
  • Optionally, acquiring the projection pattern and the illumination image of the second-wavelength light includes: distinguishing the acquired projection pattern from the illumination image through a first channel corresponding to the first-wavelength light and a second channel corresponding to the second-wavelength light.
  • For example, the first-wavelength light is blue light and the second-wavelength light is red light.
  • In this case, a mixed image containing both the projection pattern of the first-wavelength light and the illumination image of the second-wavelength light is captured, and the mixed image is input into a color channel for wavelengths of 622-760 nm (red) and a color channel for wavelengths of 435-450 nm (blue).
  • Since the projection pattern is obtained by projecting the pattern of the first-wavelength (blue) light onto the measured object, the projection pattern appears only after the 435-450 nm color channel and is filtered out by the 622-760 nm color channel.
  • Similarly, since the illumination image is obtained by projecting the second-wavelength (red) light onto the measured object, the illumination image appears only after the 622-760 nm color channel and is filtered out by the 435-450 nm color channel.
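As a concrete illustration, the channel separation described above can be sketched in a few lines of NumPy. This is a simplified model, not the patent's implementation: it assumes the camera delivers an H x W x 3 array in B, G, R order and that each color channel already isolates its wavelength band.

```python
import numpy as np

def split_mixed_image(mixed_bgr):
    """Separate one mixed capture into the two single-wavelength images.

    Assumes (for this sketch) an H x W x 3 frame in B, G, R order, where
    the blue channel (~435-450 nm) carries the structured-light projection
    pattern and the red channel (~622-760 nm) carries the illumination
    image; each channel filters out the other wavelength.
    """
    projection_pattern = mixed_bgr[:, :, 0]  # blue channel: pattern for 3D reconstruction
    illumination_image = mixed_bgr[:, :, 2]  # red channel: texture-tracking image
    return projection_pattern, illumination_image

# Synthetic frame: blue stripes (the projected pattern) over uniform red light.
frame = np.zeros((4, 6, 3), dtype=np.uint8)
frame[:, ::2, 0] = 255  # blue stripe pattern on even columns
frame[:, :, 2] = 128    # uniform red illumination
pattern, illum = split_mixed_image(frame)
```

Because both images come from the same exposure of the same sensor, they are pixel-aligned by construction, which is exactly the property this embodiment relies on.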
  • Optionally, synchronously acquiring the projection pattern and the illumination image projected onto the measured object includes: the projection pattern coinciding in position with the illumination image; or the positional relationship between the projection pattern and the illumination image being determined according to the positional relationship between the projection device of the projection pattern and the projection device of the illumination image.
  • When the same acquisition device is used, the two images coincide in position. When different acquisition devices are used, the positional relationship between the projection pattern and the illumination image necessarily differs; in this case, the positional relationship between the two acquisition devices needs to be determined in advance in order to determine the positional relationship between the projection pattern and the illumination image, and the projection pattern and the illumination image still need to be distinguished through the first channel corresponding to the first-wavelength light and the second channel corresponding to the second-wavelength light.
  • Step S204: acquire a color texture image of the measured object.
  • the color texture image is obtained by triggering white light to be projected on the measured object.
  • time for triggering step S204 is later than the time for step S202.
  • the projection pattern of the first wavelength light is used for three-dimensional reconstruction, and the illumination image of the second wavelength light is used to assist texture tracking.
  • FIG. 3 is a flowchart of a three-dimensionally scanned image processing method according to an embodiment of the present disclosure. As shown in FIG. 3, the process includes the following steps:
  • Step S302: determine an image relationship between the color texture image and the illumination image.
  • the illumination image is obtained by: inputting the projected image into one or more color CCD chips; and receiving the image processed by the one or more color CCD chips.
  • the color texture image is obtained by: inputting the projected color texture image into one or more color CCD chips; and receiving the color texture image processed by the one or more color CCD chips.
  • Optionally, the image relationship between the color texture image and the illumination image is determined by at least one of the following: performing SIFT or SURF feature-point extraction and matching on the color texture image and the illumination image, or processing the color texture image and the illumination image with an optical-flow tracking algorithm.
  • the image relationship includes at least: a conversion relationship between the color texture image and the illumination image, and a mapping relationship between the color texture image and the illumination image.
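As a minimal sketch of determining such an image relationship, the following NumPy code estimates a pure translation between two images by phase correlation. This is a deliberately simplified stand-in for the SIFT/SURF matching or optical-flow tracking named above, valid only when the relationship is (approximately) a circular translation.

```python
import numpy as np

def estimate_translation(img_a, img_b):
    """Estimate the (dy, dx) circular shift taking img_a to img_b.

    Phase correlation: the inverse FFT of the normalized cross-power
    spectrum of the two images peaks at their relative shift.
    """
    spectrum = np.conj(np.fft.fft2(img_a)) * np.fft.fft2(img_b)
    spectrum /= np.abs(spectrum) + 1e-12     # keep phase only
    correlation = np.fft.ifft2(spectrum).real
    dy, dx = np.unravel_index(np.argmax(correlation), correlation.shape)
    # Map wrap-around peaks to signed shifts.
    if dy > img_a.shape[0] // 2:
        dy -= img_a.shape[0]
    if dx > img_a.shape[1] // 2:
        dx -= img_a.shape[1]
    return int(dy), int(dx)

rng = np.random.default_rng(0)
illumination = rng.random((32, 32))                   # stand-in illumination image
texture = np.roll(illumination, (1, 2), axis=(0, 1))  # stand-in color texture image
shift = estimate_translation(illumination, texture)
```

A real scanner would use a richer model (feature-based homography or dense optical flow) since the two images are taken from slightly different poses; the point here is only that the conversion relationship is recoverable from the image content itself.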
  • Step S304: according to the image relationship, match the color texture image to the projection pattern to perform texture mapping of the three-dimensionally reconstructed data.
  • For example, in order to understand the condition of a patient's diseased teeth, a doctor often uses a dynamic three-dimensional scanner to scan the teeth and observe a full picture of the diseased teeth.
  • During scanning, a blue-light pattern and red light are simultaneously projected onto the patient's diseased tooth.
  • The blue channel and the red channel of the color camera capture, respectively, the projection pattern corresponding to the blue light and the tooth image corresponding to the red light. Because the two are captured at the same instant, it is guaranteed that there is no time difference or position difference between the projection pattern corresponding to the blue light and the tooth image corresponding to the red light.
  • Afterwards, the illumination optical system is used to project onto the diseased tooth, and a color texture map of the diseased tooth is obtained by shooting.
  • Since both the tooth image corresponding to the red light and the color texture map of the tooth reflect the characteristics of the tooth, SIFT or SURF feature-point extraction and matching, optical-flow tracking algorithms, or similar processing methods can be used to obtain the relationships between the corresponding parts of the teeth in the two images; for example, the mapping and conversion relationships between the position of the gum in the tooth image corresponding to the red light and the position, coordinates, angle, and color of the gum in the color texture map of the tooth.
  • Through these relationships, the three-dimensional reconstruction map of the teeth and the color texture map of the teeth can be matched in time and position, which solves the texture-misalignment problem caused by the time difference and position difference between the three-dimensional reconstruction data and the color texture map.
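To illustrate the final mapping step, the sketch below assigns a color to each reconstructed point by pushing its pixel coordinate in the red-light image through the estimated image relationship and sampling the color texture map. The affine parametrization (A, t) and nearest-neighbor sampling are illustrative assumptions, not the patent's specific method.

```python
import numpy as np

def texture_points(points_px, color_texture, A, t):
    """Color reconstructed points via the estimated image relationship.

    points_px: N x 2 array of (row, col) pixels of the reconstructed
               points in the red-light illumination image.
    A, t:      an assumed affine transform mapping illumination-image
               coordinates into color-texture-image coordinates.
    Returns an N x 3 array of sampled colors (nearest neighbor).
    """
    coords = points_px @ A.T + t
    rows = np.clip(np.rint(coords[:, 0]).astype(int), 0, color_texture.shape[0] - 1)
    cols = np.clip(np.rint(coords[:, 1]).astype(int), 0, color_texture.shape[1] - 1)
    return color_texture[rows, cols]

# Toy texture: every pixel a distinct color; relationship = shift right one column.
texture_map = np.arange(2 * 3 * 3, dtype=np.uint8).reshape(2, 3, 3)
colors = texture_points(np.array([[0, 0], [1, 1]]),
                        texture_map,
                        A=np.eye(2), t=np.array([0, 1]))
```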
  • The part of the technical solution of the present disclosure that is essential or contributes to the existing technology can be embodied in the form of a software product stored in a storage medium (such as ROM/RAM, a magnetic disk, or an optical disc) and including several instructions for causing a terminal device (which may be a mobile phone, a computer, a server, or a network device, etc.) to execute the methods described in the embodiments of the present disclosure.
  • a three-dimensionally scanned image acquisition device is also provided.
  • the device is used to implement the foregoing embodiments and preferred implementation manners, and the descriptions will not be repeated.
  • The term "module" may refer to a combination of software and/or hardware that implements a predetermined function.
  • The devices described in the following embodiments are preferably implemented in software, but implementation in hardware, or a combination of software and hardware, is also possible and conceived.
  • FIG. 4 is a structural block diagram of a three-dimensionally scanned image acquisition device according to an embodiment of the present disclosure. As shown in FIG. 4, the device includes:
  • a first image acquisition module 42 configured to synchronously acquire a projection pattern of light of a first wavelength and an illumination image of light of a second wavelength projected on a measured object;
  • a second image acquisition module 44 configured to acquire a projected color texture image of the measured object
  • The above modules can be implemented by software or hardware. In the latter case, they can be implemented in, but are not limited to, the following ways: the above modules are all located in the same processor, or the above modules are located in different processors in any combination.
  • a three-dimensionally scanned image processing device is also provided.
  • the device is used to implement the foregoing embodiments and preferred implementation manners, and the descriptions will not be repeated.
  • The term "module" may refer to a combination of software and/or hardware that implements a predetermined function.
  • The devices described in the following embodiments are preferably implemented in software, but implementation in hardware, or a combination of software and hardware, is also possible and conceived.
  • FIG. 5 is a structural block diagram of a three-dimensionally scanned image processing apparatus according to an embodiment of the present disclosure. As shown in FIG. 5, the apparatus includes:
  • An image processing module 52 configured to determine an image relationship between the color texture image and the illumination image
  • The reconstruction module 54 is configured to, according to the image relationship, match the color texture image to the projection pattern to perform texture mapping of the three-dimensionally reconstructed data.
  • The projection pattern is obtained by projecting light of a first wavelength onto the measured object, and the illumination image is obtained by projecting light of a second wavelength onto the measured object in synchronization with the projection pattern; the color texture image is obtained by projection onto the measured object.
  • a three-dimensional scanning device is also provided.
  • the device is used to implement the foregoing embodiments and preferred implementation manners, and the descriptions will not be repeated.
  • FIG. 6 is a structural block diagram of a three-dimensional scanning device according to an embodiment of the present disclosure. As shown in FIG. 6, the device includes:
  • the timing control circuit 61, the projection optical system 62, the auxiliary optical system 63, the illumination optical system 64, and the image acquisition optical system 65.
  • The projection optical system, the auxiliary optical system, and the illumination optical system each include a projection device configured to project light; the image acquisition optical system 65 is configured to acquire the images projected by the projection optical system 62, the auxiliary optical system 63, and the illumination optical system 64 onto the measured object.
  • The timing control circuit 61 is connected to the image acquisition optical system 65, the projection optical system 62, the auxiliary optical system 63, and the illumination optical system 64, and is configured to perform timing control of the image acquisition optical system 65, the projection optical system 62, the auxiliary optical system 63, and the illumination optical system 64.
  • When the timing control circuit 61 determines that the first moment is reached, it triggers the projection optical system 62 to project a pattern of light of the first wavelength onto the measured object, triggers the auxiliary optical system 63 to simultaneously project light of the second wavelength onto the measured object, and acquires the projection pattern of the first-wavelength light and the illumination image of the second-wavelength light through the image acquisition optical system 65; when the timing control circuit 61 determines that the second moment is reached, it triggers the illumination optical system 64 to project white light onto the measured object, and acquires a color texture image through the image acquisition optical system 65.
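The two-moment trigger sequence above can be sketched as a small state machine. This is a hypothetical illustration only, not the patent's implementation: all class and method names are invented, and real hardware triggering would go through device drivers rather than a log.

```python
# Illustrative sketch of the timing control: at the first moment the
# projection and auxiliary systems fire together (structured pattern plus
# second-wavelength light), so the two captured images share the same time
# and position; at the second moment only white light is projected for the
# colour texture image. Names are hypothetical.

class TimingController:
    def __init__(self):
        self.log = []

    def trigger_projection(self):      # projection optical system 62
        self.log.append("project_first_wavelength_pattern")

    def trigger_auxiliary(self):       # auxiliary optical system 63
        self.log.append("project_second_wavelength_light")

    def trigger_illumination(self):    # illumination optical system 64
        self.log.append("project_white_light")

    def capture(self, label):          # image acquisition optical system 65
        self.log.append("capture_" + label)

    def run_cycle(self):
        # First moment: pattern and auxiliary light projected together.
        self.trigger_projection()
        self.trigger_auxiliary()
        self.capture("pattern_and_illumination")
        # Second moment: white light only, for the colour texture image.
        self.trigger_illumination()
        self.capture("color_texture")
        return self.log
```

A single `run_cycle()` thus yields one synchronized pattern/illumination capture followed by one texture capture.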
  • Optionally, the auxiliary optical system 63 and the illumination optical system 64 are the same optical system, that is, they include the same projection device, and that projection device projects both the light of the second wavelength and the white light.
  • FIG. 7 is a structural block diagram of another three-dimensional scanning device according to an embodiment of the present disclosure. As shown in FIG. 7, the three-dimensional scanning device is connected to an image processor 72.
  • The image processor 72 collects the projection pattern, the illumination image, and the color texture image; determines an image relationship between the color texture image and the illumination image; and, according to the image relationship, matches the color texture image to the projection pattern to perform texture mapping of the three-dimensionally reconstructed data.
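Once the image relationship has been estimated, the matching step amounts to re-addressing the texture pixels in the coordinate frame of the projection pattern. The sketch below reduces the relationship to a pure translation purely for illustration; the function name and the translation-only model are assumptions, and a real system would apply a full homography or per-pixel flow field.

```python
# Minimal sketch of the matching step: resample the colour texture image
# into the frame of the projection pattern (which shares time and position
# with the illumination image) using an estimated (dx, dy) shift.

def map_texture_to_pattern(texture, dx, dy, height, width, fill=0):
    """Resample `texture` (list of rows) into the pattern frame by
    shifting it by (dx, dy); out-of-range pixels get `fill`."""
    out = []
    for y in range(height):
        row = []
        for x in range(width):
            sx, sy = x - dx, y - dy
            if 0 <= sy < len(texture) and 0 <= sx < len(texture[0]):
                row.append(texture[sy][sx])
            else:
                row.append(fill)
        out.append(row)
    return out
```

For example, a 2x2 texture shifted one pixel right lands in a wider pattern frame with a filled left column.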
  • FIG. 8 is a structural block diagram of still another three-dimensional scanning device according to an embodiment of the present disclosure.
  • The three-dimensional scanning device includes an image processor 82, which collects the projection pattern, the illumination image, and the color texture image; determines an image relationship between the color texture image and the illumination image; and, according to the image relationship, matches the color texture image to the projection pattern to perform texture mapping of the three-dimensionally reconstructed data.
  • FIG. 9 is a schematic structural diagram of a three-dimensional scanning device according to an embodiment of the present disclosure.
  • The Projector light path shown in FIG. 9 is the projection optical system 62, and the LED serves as the auxiliary optical system 63 that provides light of the second wavelength and as the illumination optical system 64 that provides white light.
  • The Camera is configured to perform the operations of the image acquisition optical system 65.
  • The Control board is configured to perform the timing control function of the timing control circuit 61.
  • The image processor 72 or the image processor 82 may be implemented by the PC terminal shown in FIG. 9.
  • FIG. 9 also provides a plurality of plane mirrors, which are arranged to project the light emitted in the LED or Projector light path onto the measured object.
  • Specifically, the LED is a three-color light source or three light sources of different wavelengths.
  • FIG. 9 only gives a schematic structural diagram of the three-dimensional scanning device by way of example; other structural diagrams based on the ideas of this embodiment also fall within its protection scope.
  • For example, to prevent interference, a plurality of plane mirrors and other optical devices may be arranged at different positions so that the emitted light is filtered by reflection and transmission.
  • The projection device of at least one of the projection optical system 62, the auxiliary optical system 63, and the illumination optical system 64 is a digital micromirror device, wherein the digital micromirror device includes a projection module using digital light processing (DLP) technology. In one embodiment, the Projector light path shown in FIG. 9 is a digital micromirror device, and the LED may also be replaced by a digital micromirror device.
  • FIG. 10 is a schematic structural diagram of a projection device according to an embodiment of the present disclosure. As shown in FIG. 10, it includes a projection frame 100 and a projection lens 102 connected to the projection frame 100.
  • the projection frame 100 includes:
  • a light source 1002, located at the end of the projection frame 100 away from the projection lens;
  • a condenser lens 1004, located at the end of the projection frame 100 away from the projection lens, and configured to condense the light from the light source 1002; and
  • a diaphragm 1006, located in the projection frame 100 and disposed between the light source 1002 and the projection lens 102.
  • Specifically, the diaphragm 1006 includes upper and lower sub-diaphragms connected to the body of the projection frame 100.
  • the light source 1002 is coaxially disposed with the projection lens 102.
  • FIG. 11 is a schematic structural diagram of another projection device according to an embodiment of the present disclosure.
  • Among the projection optical system, the auxiliary optical system, and the illumination optical system, at least the projection device of the projection optical system is provided with a projection pattern mask 1102.
  • In one embodiment, a projection pattern mask 1102 is provided in the projection device of the projection optical system, while the projection devices in the auxiliary optical system and the illumination optical system are not provided with a projection pattern mask 1102.
  • In another embodiment, the projection device in the auxiliary optical system is provided with a projection pattern mask 1102.
  • The projection pattern mask 1102 bears a simple pattern, such as a crosshair, which does not affect texture tracking and allows the conversion relationship between the illumination image and the color texture image to be obtained quickly and accurately, thereby improving the efficiency and accuracy of the texture mapping.
  • The projection pattern mask 1102 is located between the diaphragm 1006 and the projection lens 102 and serves as a template for the pattern to be projected.
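A simple mask pattern such as a crosshair lends itself to a very cheap alignment check between the illumination image and the color texture image. The sketch below locates the crosshair center in each image as the centroid of bright pixels and takes the difference as a coarse translation; the threshold value and the pure-translation model are simplifying assumptions, not part of the patent.

```python
# Hedged sketch: estimate a coarse conversion (translation) between the
# illumination image and the colour texture image from the centroid of
# the projected crosshair's bright pixels in each image.

def crosshair_center(image, threshold=128):
    xs, ys, n = 0, 0, 0
    for y, row in enumerate(image):
        for x, v in enumerate(row):
            if v >= threshold:
                xs += x
                ys += y
                n += 1
    if n == 0:
        return None  # no crosshair found
    return (xs / n, ys / n)

def estimate_offset(illumination, texture, threshold=128):
    cx1, cy1 = crosshair_center(illumination, threshold)
    cx2, cy2 = crosshair_center(texture, threshold)
    return (cx2 - cx1, cy2 - cy1)
```

With a symmetric crosshair fully inside both frames, the centroid coincides with the crosshair center, so the returned offset is the inter-image shift.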
  • The projection device further includes: a heat-insulating glass 1104, located between the light source 1002 and the diaphragm 1006, and configured to prevent the heat generated by the light source 1002 from diffusing to the projection pattern and the projection lens 102, so as to extend the service life of the device.
  • An embodiment of the present disclosure further provides a storage medium that stores a computer program therein, wherein the computer program is configured to execute the steps in any one of the foregoing method embodiments when running.
  • The foregoing storage medium may be configured to store a computer program for performing the following steps: S1, synchronously acquiring a projection pattern of light of a first wavelength and an illumination image of light of a second wavelength projected onto a measured object; S2, acquiring a color texture image projected onto the measured object; S3, determining an image relationship between the color texture image and the illumination image; S4, according to the image relationship, matching the color texture image to the projection pattern for three-dimensional reconstruction.
  • The foregoing storage medium may include, but is not limited to, various media that can store computer programs, such as a USB flash drive, a read-only memory (ROM), a random access memory (RAM), a removable hard disk, a magnetic disk, or an optical disc.
  • An embodiment of the present disclosure further provides an electronic device including a memory and a processor.
  • the memory stores a computer program
  • the processor is configured to run the computer program to perform the steps in any one of the foregoing method embodiments.
  • the electronic device may further include a transmission device and an input-output device, wherein the transmission device is connected to the processor, and the input-output device is connected to the processor.
  • The foregoing processor may be configured to execute the following steps through a computer program: S1, synchronously acquiring a projection pattern of light of a first wavelength and an illumination image of light of a second wavelength projected onto a measured object; S2, acquiring a color texture image projected onto the measured object; S3, determining an image relationship between the color texture image and the illumination image; S4, according to the image relationship, matching the color texture image to the projection pattern for three-dimensional reconstruction.
  • The modules or steps of the present disclosure described above may be implemented by a general-purpose computing device; they may be concentrated on a single computing device or distributed on a network composed of multiple computing devices. Optionally, they may be implemented with program code executable by a computing device, so that they may be stored in a storage device and executed by the computing device; in some cases, the steps shown or described may be performed in an order different from that given here, or they may be made into individual integrated circuit modules, or multiple modules or steps among them may be made into a single integrated circuit module. As such, the present disclosure is not limited to any particular combination of hardware and software.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Computer Graphics (AREA)
  • Health & Medical Sciences (AREA)
  • Optics & Photonics (AREA)
  • Dentistry (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Epidemiology (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Multimedia (AREA)
  • Geometry (AREA)
  • Software Systems (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The present disclosure provides three-dimensional scanning image acquisition and processing methods and apparatuses, and a three-dimensional scanning device. The image processing method includes: synchronously acquiring a projection pattern of light of a first wavelength and an illumination image of light of a second wavelength projected onto a measured object, and acquiring a color texture image projected onto the measured object. The present disclosure solves the problem in the related art of texture misalignment caused by the time difference and position difference between three-dimensional reconstruction data and the color texture image, thereby achieving high texture-mapping accuracy, low production cost, and short processing time.

Description

Three-dimensional scanning image acquisition and processing methods and apparatuses, and three-dimensional scanning device — Technical Field
The present disclosure relates to the field of communications, and in particular to three-dimensional scanning image acquisition and processing methods and apparatuses, and a three-dimensional scanning device.
Background
At present, many dynamic three-dimensional scanners are available on the market that can perform real-time three-dimensional scanning to obtain the surface topography of an object. Displaying the object's own texture on the surface of the three-dimensional data improves the display effect, so a texture picture of the current viewing angle needs to be acquired for real-time mapping while the three-dimensional data of that viewing angle is being scanned. Dynamic three-dimensional scanners mostly use the structured-light triangulation principle, which requires projecting a structured-light pattern onto the object surface for three-dimensional reconstruction, while the texture image is generally captured at the moment and position immediately before or after the reconstruction image; a time difference and a position difference between the reconstruction image and the texture image are therefore inevitable. Existing products on the market adopt various methods to address this problem, for example increasing the camera's capture frame rate to reduce the time and position differences, or estimating by interpolating between the texture images of the preceding and following frames. The applicant has found, however, that because these products rely on a high capture frame rate or on estimation from inter-frame texture differences, the actual production cost is high and the estimation carries a large error, so the accuracy of the texture mapping still suffers.
Summary
Embodiments of the present disclosure provide three-dimensional scanning image acquisition and processing methods and apparatuses, and a three-dimensional scanning device, so as to at least solve the problems in the related art of high production cost, low texture-mapping accuracy, and large time consumption when reducing the time difference and position difference between the reconstruction image and the texture image in three-dimensional scanning.
According to an embodiment of the present disclosure, a three-dimensional scanning image acquisition method is provided, including: synchronously acquiring a projection pattern of light of a first wavelength and an illumination image of light of a second wavelength projected onto a measured object; and acquiring a color texture image projected onto the measured object.
Optionally, when the measured object is projected using the same projection mode, the positions of the projection pattern and the illumination image coincide; or, when the measured object is projected using different projection modes, the positional relationship between the projection pattern and the illumination image is determined according to the positional relationship between the acquisition device of the projection pattern and the acquisition device of the illumination image.
Optionally, the color texture image is acquired by triggering projection of white light onto the measured object.
Optionally, the illumination image is an image used to reflect the texture of the measured object.
Optionally, the light of the first wavelength and the light of the second wavelength are lights whose wavelengths do not interfere with each other.
According to another embodiment of the present disclosure, a three-dimensional scanning image processing method is provided, including: determining an image relationship between a color texture image and an illumination image; and, according to the image relationship, matching the color texture image to a projection pattern to perform texture mapping of three-dimensionally reconstructed data; wherein the projection pattern is acquired by projecting light of a first wavelength onto a measured object, the illumination image is acquired by projecting light of a second wavelength onto the measured object synchronously with the projection pattern, and the color texture image is acquired by projection onto the measured object.
Optionally, the image relationship between the color texture image and the illumination image is determined in at least one of the following ways: performing SIFT or SURF feature-point extraction and matching on the color texture image and the illumination image, or processing the color texture image and the illumination image with an optical-flow tracking algorithm.
Optionally, the image relationship includes at least one of: a conversion relationship between the color texture image and the illumination image, and a mapping relationship between the color texture image and the illumination image.
Optionally, the projection pattern, the illumination image, and the color texture image are acquired as follows: inputting the projected images into one or more color CCD chips; and receiving the images processed by the one or more color CCD chips.
According to another embodiment of the present disclosure, a three-dimensional scanning image acquisition apparatus is provided, including: a first image acquisition module, which synchronously acquires a projection pattern of light of a first wavelength and an illumination image of light of a second wavelength projected onto a measured object; and a second image acquisition module, which acquires a color texture image projected onto the measured object.
According to another embodiment of the present disclosure, a three-dimensional scanning image processing apparatus is provided, including: an image processing module, which determines an image relationship between the color texture image and the illumination image; and a reconstruction module, which, according to the image relationship, matches the color texture image to the projection pattern to perform texture mapping of three-dimensionally reconstructed data; wherein the projection pattern is acquired by projecting light of a first wavelength onto a measured object, the illumination image is acquired by projecting light of a second wavelength onto the measured object synchronously with the projection pattern, and the color texture image is acquired by projection onto the measured object.
According to another embodiment of the present disclosure, a three-dimensional scanning device is provided, including: a timing control circuit, a projection optical system, an auxiliary optical system, an illumination optical system, and an image acquisition optical system, wherein the projection optical system, the auxiliary optical system, and the illumination optical system each include a projection device configured to project light; the image acquisition optical system is configured to acquire the images projected onto the measured object by the projection optical system, the auxiliary optical system, and the illumination optical system; and the timing control circuit is connected to the image acquisition optical system, the projection optical system, the auxiliary optical system, and the illumination optical system, and is configured to perform timing control of the image acquisition optical system, the projection optical system, the auxiliary optical system, and the illumination optical system.
Optionally, when the timing control circuit determines that a first moment is reached, it triggers the projection optical system to project a pattern of light of the first wavelength onto the measured object and triggers the auxiliary optical system to simultaneously project light of the second wavelength onto the measured object, and acquires the projection pattern of the first-wavelength light and the illumination image of the second-wavelength light through the image acquisition optical system; when the timing control circuit determines that a second moment is reached, it triggers the illumination optical system to project onto the measured object, and acquires the color texture image through the image acquisition optical system.
Optionally, the three-dimensional scanning device is connected to an image processor, or the three-dimensional scanning device includes the image processor, wherein the image processor collects the projection pattern, the illumination image, and the color texture image; determines the image relationship between the color texture image and the illumination image; and, according to the image relationship, matches the color texture image to the projection pattern to perform texture mapping of the three-dimensionally reconstructed data.
Optionally, the projection device includes: a projection frame and a projection lens connected to the projection frame, wherein the projection frame includes: a light source disposed at the end of the projection frame away from the projection lens; and a diaphragm located in the projection frame and disposed between the light source and the projection lens; wherein the light source is disposed coaxially with the projection lens.
Optionally, among the projection optical system, the auxiliary optical system, and the illumination optical system, at least the projection device of the projection optical system is provided with a projection pattern mask, and the projection pattern mask is located between the diaphragm and the projection lens.
Optionally, the projection frame further includes: a condenser lens disposed at the end of the projection frame away from the projection lens and configured to condense the light from the light source; and a heat-insulating glass disposed between the light source and the diaphragm and configured to prevent the heat generated by the light source from diffusing.
Optionally, the auxiliary optical system and/or the illumination optical system is an LED.
Optionally, the auxiliary optical system and the illumination optical system are the same optical system.
According to yet another embodiment of the present disclosure, a storage medium is further provided, in which a computer program is stored, wherein the computer program is configured to execute, when run, the steps in any one of the foregoing method embodiments.
According to yet another embodiment of the present disclosure, an electronic apparatus is further provided, including a memory and a processor, wherein a computer program is stored in the memory, and the processor is configured to run the computer program to execute the steps in any one of the foregoing method embodiments.
Through the present disclosure, an auxiliary light source is added at the same time and the same position during three-dimensional reconstruction, and the projection of this auxiliary light source is matched with the color texture image. This solves the problem in the related art of texture misalignment caused by the time difference and position difference between three-dimensional reconstruction data and the color texture image, reducing those differences while achieving high texture-mapping accuracy, low production cost, and short processing time.
Brief Description of the Drawings
The drawings described here are used to provide a further understanding of the present disclosure and form a part of this application; the illustrative embodiments of the present disclosure and their descriptions are used to explain the present disclosure and do not constitute an improper limitation of it. In the drawings:
FIG. 1 is a hardware structural block diagram of a mobile terminal for an image processing method according to an embodiment of the present disclosure;
FIG. 2 is a flowchart of an image acquisition method according to an embodiment of the present disclosure;
FIG. 3 is a flowchart of an image processing method according to an embodiment of the present disclosure;
FIG. 4 is a structural block diagram of an image acquisition apparatus according to an embodiment of the present disclosure;
FIG. 5 is a structural block diagram of an image processing apparatus according to an embodiment of the present disclosure;
FIG. 6 is a structural schematic diagram of a three-dimensional scanning device according to an embodiment of the present disclosure;
FIG. 7 is a structural block diagram of another three-dimensional scanning device according to an embodiment of the present disclosure;
FIG. 8 is a structural block diagram of still another three-dimensional scanning device according to an embodiment of the present disclosure;
FIG. 9 is a schematic structural diagram of a three-dimensional scanning device according to an embodiment of the present disclosure;
FIG. 10 is a schematic structural diagram of a projection device according to an embodiment of the present disclosure;
FIG. 11 is a schematic structural diagram of another projection device according to an embodiment of the present disclosure.
Detailed Description
The present disclosure is described in detail below with reference to the drawings and in combination with the embodiments. It should be noted that, where no conflict arises, the embodiments in this application and the features in the embodiments may be combined with each other.
It should be noted that the terms "first", "second", and the like in the specification, claims, and drawings of the present disclosure are used to distinguish similar objects and are not necessarily used to describe a specific order or sequence.
Embodiment 1
The method embodiment provided in Embodiment 1 of this application may be executed in a mobile terminal, a computer terminal, or a similar computing apparatus. Taking execution on a mobile terminal as an example, FIG. 1 is a hardware structural block diagram of a mobile terminal for a three-dimensional scanning image acquisition method according to an embodiment of the present disclosure. As shown in FIG. 1, the mobile terminal 10 may include one or more processors 102 (only one is shown in FIG. 1; the processor 102 may include but is not limited to a processing apparatus such as a microprocessor MCU or a programmable logic device FPGA) and a memory 104 for storing data. Optionally, the mobile terminal may further include a transmission device 106 for communication functions and an input-output device 108. A person of ordinary skill in the art will understand that the structure shown in FIG. 1 is merely illustrative and does not limit the structure of the mobile terminal. For example, the mobile terminal 10 may include more or fewer components than shown in FIG. 1, or have a configuration different from that shown in FIG. 1.
The memory 104 may be configured to store computer programs, for example software programs and modules of application software, such as the computer program corresponding to the three-dimensional scanning image acquisition method in the embodiments of the present disclosure. By running the computer programs stored in the memory 104, the processor 102 executes various functional applications and data processing, that is, implements the method described above. The memory 104 may include high-speed random access memory, and may also include non-volatile memory, such as one or more magnetic storage devices, flash memory, or other non-volatile solid-state memory. In some examples, the memory 104 may further include memory disposed remotely relative to the processor 102, and such remote memory may be connected to the mobile terminal 10 via a network. Examples of the network include but are not limited to the Internet, an intranet, a local area network, a mobile communication network, and combinations thereof.
The transmission device 106 is configured to receive or send data via a network. Specific examples of the network may include a wireless network provided by the communication provider of the mobile terminal 10. In one example, the transmission device 106 includes a network interface controller (NIC), which can connect to other network devices through a base station so as to communicate with the Internet. In another example, the transmission device 106 may be a radio frequency (RF) module, which is used to communicate with the Internet wirelessly.
This embodiment provides an image acquisition method that runs on the mobile terminal of FIG. 1. FIG. 2 is a flowchart of a three-dimensional scanning image acquisition method according to an embodiment of the present disclosure. As shown in FIG. 2, the flow includes the following steps:
Step S202: synchronously acquiring a projection pattern of light of a first wavelength and an illumination image of light of a second wavelength projected onto a measured object.
Optionally, synchronous acquisition means acquiring the projection pattern and the illumination image at the same moment.
Optionally, the illumination image is an image used to reflect the texture of the measured object.
It should be noted that the illumination image described in this embodiment can reflect the texture of the projected object under the light of the second wavelength, but cannot reflect the true color of the projected object.
Optionally, the above-mentioned light of the first wavelength and light of the second wavelength are lights that do not interfere with each other.
Specifically, the light of the first wavelength and the light of the second wavelength may be monochromatic light; for example, the light of the first wavelength is blue light and the light of the second wavelength is red light. In addition, mixed light also falls within the protection scope of this embodiment, provided at least that the mixed light and the other wavelength of light do not interfere with each other.
Optionally, acquiring the projection pattern and the illumination image of the second-wavelength light includes: distinguishing the acquired projection pattern and illumination image through a first channel corresponding to the light of the first wavelength and a second channel corresponding to the light of the second wavelength.
For example, take the case where the light of the first wavelength is blue and the light of the second wavelength is red. After the pattern of the first-wavelength light and the second-wavelength light are simultaneously projected onto the measured object, a mixed image containing the projection pattern of the first-wavelength light and the illumination image of the second-wavelength light is acquired. To separate the two images, the mixed image is input into a color channel of 760-622 nm and a color channel of 450-435 nm, respectively. Since the projection pattern is obtained by projecting the pattern of the first-wavelength light onto the measured object, it can only be acquired through the 450-435 nm color channel and is filtered out in the 760-622 nm channel. Likewise, since the illumination image is obtained by projecting the second-wavelength light onto the measured object, it can only be acquired through the 760-622 nm color channel and is filtered out in the 450-435 nm channel.
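The channel separation described above can be sketched very simply once the camera delivers an RGB image: the blue channel carries the first-wavelength projection pattern and the red channel the second-wavelength illumination image. This is an illustrative sketch only; a real pipeline would also handle Bayer demosaicing and channel crosstalk.

```python
# Sketch: split a mixed colour image (rows of (R, G, B) tuples) into the
# two simultaneously captured images -- the blue channel recovers the
# projection pattern, the red channel the illumination image.

def split_channels(mixed):
    """Return (projection_pattern, illumination_image) from an RGB image
    stored as a list of rows of (r, g, b) tuples."""
    pattern = [[px[2] for px in row] for row in mixed]       # blue channel
    illumination = [[px[0] for px in row] for row in mixed]  # red channel
    return pattern, illumination
```

Because both channels come from the same exposure, the two separated images share the same time and position by construction.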
Optionally, synchronously acquiring the projection pattern and the illumination image projected onto the measured object includes: the positions of the projection pattern and the illumination image coincide; or, the positional relationship between the projection pattern and the illumination image is determined according to the positional relationship between the projection device of the projection pattern and the projection device of the illumination image.
Specifically, if the same acquisition device is used to capture the projection pattern and the illumination image, the positional relationship between them is the same. If different acquisition devices are used, however, their positional relationship is necessarily different because the receiving angles of the acquisition devices differ. In this case, the positional relationship between the two acquisition devices needs to be determined in advance in order to determine the positional relationship between the projection pattern and the illumination image.
Specifically, if the positional relationship between the two is the same, the projection pattern and the illumination image need to be distinguished through the first channel corresponding to the light of the first wavelength and the second channel corresponding to the light of the second wavelength.
Step S204: acquiring a color texture image projected onto the measured object.
Optionally, the color texture image is acquired by triggering projection of white light onto the measured object.
It should be noted that step S204 is triggered later than step S202.
Optionally, the projection pattern of the first-wavelength light is used for three-dimensional reconstruction, while the illumination image of the second-wavelength light is used to assist texture tracking.
Embodiment 2
FIG. 3 is a flowchart of a three-dimensional scanning image processing method according to an embodiment of the present disclosure. As shown in FIG. 3, the flow includes the following steps:
Step S302: determining an image relationship between the color texture image and the illumination image.
Specifically, the projection pattern and the illumination image are acquired as follows: inputting the projected images into one or more color CCD chips; and receiving the images processed by the one or more color CCD chips.
Specifically, the color texture image is acquired as follows: inputting the projected color texture image into one or more color CCD chips; and receiving the color texture image processed by the one or more color CCD chips.
Specifically, the image relationship between the color texture image and the illumination image is determined in at least one of the following ways: performing SIFT or SURF feature-point extraction and matching on the color texture image and the illumination image, or processing the color texture image and the illumination image with an optical-flow tracking algorithm.
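The disclosure names SIFT/SURF feature matching or optical flow for estimating this relationship. As a self-contained stand-in (not the algorithms named above), the sketch below estimates a pure-translation relationship by brute-force search over a small offset range, minimizing the mean squared difference on the overlapping region. A real implementation would use a feature-based or optical-flow estimator instead.

```python
# Simplified stand-in for the image-relationship estimation: exhaustive
# search for the (dx, dy) shift that best aligns `moving` with `ref`,
# scoring by mean squared difference over the overlapping pixels.

def estimate_translation(ref, moving, max_shift=2):
    h, w = len(ref), len(ref[0])
    best = None
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            ssd, n = 0, 0
            for y in range(h):
                for x in range(w):
                    sy, sx = y + dy, x + dx
                    if 0 <= sy < h and 0 <= sx < w:
                        d = ref[y][x] - moving[sy][sx]
                        ssd += d * d
                        n += 1
            score = ssd / n
            if best is None or score < best[0]:
                best = (score, dx, dy)
    return best[1], best[2]
```

On textured images the score reaches its minimum at the true shift; on flat images the search is ambiguous, which is exactly why the patent projects a trackable pattern.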
Specifically, the image relationship includes at least: a conversion relationship between the color texture image and the illumination image, and a mapping relationship between the color texture image and the illumination image.
Step S304: according to the image relationship, matching the color texture image to the projection pattern to perform texture mapping of the three-dimensionally reconstructed data.
This embodiment also provides the following scenario to aid understanding of the technical solutions of the above embodiments.
In a dental clinic, in order to understand the severity of a patient's diseased tooth, the dentist often scans the teeth with a dynamic three-dimensional scanner so as to observe the full picture of the diseased tooth.
First, to obtain the three-dimensional reconstruction image of the teeth, a blue-light pattern and red light are simultaneously projected onto the patient's diseased tooth. During shooting, the blue and red channels of the color camera capture, respectively, the projection pattern corresponding to the blue light and the tooth image corresponding to the red light. Because they are captured simultaneously, the projection pattern corresponding to the blue light and the tooth image corresponding to the red light are guaranteed to have no time difference or position difference whatsoever.
Second, to obtain the color texture image of the teeth, the illumination optical system projects onto the diseased tooth, and by shooting, the color texture image of the diseased tooth can be acquired.
Furthermore, since both the tooth image corresponding to the red light and the color texture image of the teeth reflect the characteristics of the teeth, processing such as SIFT or SURF feature-point extraction and matching, or an optical-flow tracking algorithm, can obtain the relationships between the various parts of the teeth, for example the mapping and conversion relationships in coordinates, angle, and color between the position of the gums in the red-light tooth image and the position of the gums in the color texture image of the teeth.
Finally, because there is no time difference or position difference between the projection pattern corresponding to the blue light and the tooth image corresponding to the red light, matching the color texture image of the teeth to the blue-light projection pattern according to its correspondence with the red-light tooth image aligns the three-dimensional reconstruction of the teeth and their color texture image in both time and position, which solves the problem of texture misalignment caused by the time and position differences between the three-dimensional reconstruction data and the color texture image.
It should be noted that the above example is illustrative rather than exhaustive. Application in the field of dentistry is only one of the scenarios to be protected in this embodiment; other fields such as 3D printing, model building, and architectural design are also applicable.
Through the above steps, the problem in the related art of texture misalignment caused by the time and position differences between three-dimensional reconstruction data and the color texture image is solved, reducing those differences while achieving high texture-mapping accuracy, low production cost, and short processing time.
From the description of the above embodiments, a person skilled in the art can clearly understand that the method according to the above embodiments may be implemented by software plus a necessary general-purpose hardware platform, and of course also by hardware, but in many cases the former is the better implementation. Based on this understanding, the technical solution of the present disclosure, in essence or in the part contributing to the prior art, may be embodied in the form of a software product stored in a storage medium (such as ROM/RAM, a magnetic disk, or an optical disc) and including several instructions for causing a terminal device (which may be a mobile phone, a computer, a server, a network device, or the like) to execute the methods described in the embodiments of the present disclosure.
Embodiment 3
This embodiment also provides a three-dimensional scanning image acquisition apparatus, which is used to implement the above embodiments and preferred implementations; what has already been described will not be repeated. As used below, the term "module" may be a combination of software and/or hardware that implements a predetermined function. Although the apparatuses described in the following embodiments are preferably implemented in software, implementation in hardware, or a combination of software and hardware, is also possible and contemplated.
FIG. 4 is a structural block diagram of a three-dimensional scanning image acquisition apparatus according to an embodiment of the present disclosure. As shown in FIG. 4, the apparatus includes:
a first image acquisition module 42, configured to synchronously acquire a projection pattern of light of a first wavelength and an illumination image of light of a second wavelength projected onto a measured object; and
a second image acquisition module 44, configured to acquire a color texture image projected onto the measured object.
It should be noted that each of the above modules may be implemented by software or hardware. For the latter, this may be achieved in, but is not limited to, the following ways: the above modules are all located in the same processor; or, the above modules are located in different processors in any combination.
Embodiment 4
This embodiment also provides a three-dimensional scanning image processing apparatus, which is used to implement the above embodiments and preferred implementations; what has already been described will not be repeated. As used below, the term "module" may be a combination of software and/or hardware that implements a predetermined function. Although the apparatuses described in the following embodiments are preferably implemented in software, implementation in hardware, or a combination of software and hardware, is also possible and contemplated.
FIG. 5 is a structural block diagram of a three-dimensional scanning image processing apparatus according to an embodiment of the present disclosure. As shown in FIG. 5, the apparatus includes:
an image processing module 52, configured to determine an image relationship between the color texture image and the illumination image; and
a reconstruction module 54, configured to, according to the image relationship, match the color texture image to the projection pattern to perform texture mapping of the three-dimensionally reconstructed data.
The projection pattern is acquired by projecting light of a first wavelength onto a measured object; the illumination image is acquired by projecting light of a second wavelength onto the measured object synchronously with the projection pattern; and the color texture image is acquired by projection onto the measured object.
Embodiment 5
This embodiment also provides a three-dimensional scanning device, which is used to implement the above embodiments and preferred implementations; what has already been described will not be repeated.
FIG. 6 is a structural block diagram of a three-dimensional scanning device according to an embodiment of the present disclosure. As shown in FIG. 6, the device includes:
a timing control circuit 61, a projection optical system 62, an auxiliary optical system 63, an illumination optical system 64, and an image acquisition optical system 65.
The projection optical system, the auxiliary optical system, and the illumination optical system each include a projection device configured to project light; the image acquisition optical system 65 is configured to acquire the images projected onto the measured object by the projection optical system 62, the auxiliary optical system 63, and the illumination optical system 64.
The timing control circuit 61 is connected to the image acquisition optical system 65, the projection optical system 62, the auxiliary optical system 63, and the illumination optical system 64, and is configured to perform timing control of the image acquisition optical system 65, the projection optical system 62, the auxiliary optical system 63, and the illumination optical system 64.
Optionally, when the timing control circuit 61 determines that a first moment is reached, it triggers the projection optical system 62 to project a pattern of light of the first wavelength onto the measured object and triggers the auxiliary optical system 63 to simultaneously project light of the second wavelength onto the measured object, and acquires the projection pattern of the first-wavelength light and the illumination image of the second-wavelength light through the image acquisition optical system 65; when the timing control circuit 61 determines that a second moment is reached, it triggers the illumination optical system 64 to project white light onto the measured object, and acquires the color texture image through the image acquisition optical system 65.
Optionally, the auxiliary optical system 63 and the illumination optical system 64 are the same optical system, that is, they include the same projection device, and that projection device projects both the light of the second wavelength and the white light.
Optionally, building on the three-dimensional scanning device of FIG. 6, FIG. 7 is a structural block diagram of another three-dimensional scanning device according to an embodiment of the present disclosure. As shown in FIG. 7, the three-dimensional scanning device is connected to an image processor 72. The image processor 72 collects the projection pattern, the illumination image, and the color texture image; determines the image relationship between the color texture image and the illumination image; and, according to the image relationship, matches the color texture image to the projection pattern to perform texture mapping of the three-dimensionally reconstructed data.
Optionally, building on the three-dimensional scanning device of FIG. 6, FIG. 8 is a structural block diagram of still another three-dimensional scanning device according to an embodiment of the present disclosure. As shown in FIG. 8, the three-dimensional scanning device includes an image processor 82, which collects the projection pattern, the illumination image, and the color texture image; determines the image relationship between the color texture image and the illumination image; and, according to the image relationship, matches the color texture image to the projection pattern to perform texture mapping of the three-dimensionally reconstructed data.
FIG. 9 is a schematic structural diagram of a three-dimensional scanning device according to an embodiment of the present disclosure. As shown in FIG. 9, the Projector light path in FIG. 9 is the projection optical system 62, while the LED serves as the auxiliary optical system 63 providing light of the second wavelength and the illumination optical system 64 providing white light. The Camera is configured to perform the operations of the image acquisition optical system 65, and the Control board performs the timing control function of the timing control circuit 61. Finally, the image processor 72 or the image processor 82 may be implemented by the PC terminal shown in FIG. 9. In addition, FIG. 9 also provides a plurality of plane mirrors, which are arranged to project the light emitted in the LED or Projector light path onto the measured object. Specifically, the LED is a three-color light source or three light sources of different wavelengths.
It should also be noted that FIG. 9 only gives a schematic structural diagram of the three-dimensional scanning device by way of example; other structural diagrams based on the ideas of this embodiment also fall within its protection scope. For example, to prevent interference, plane mirrors and other optical devices may be arranged at multiple different positions so that the emitted light is filtered by reflection and transmission.
Optionally, the projection device of at least one of the projection optical system 62, the auxiliary optical system 63, and the illumination optical system 64 is a digital micromirror device, where the digital micromirror device includes a projection module using digital light processing (DLP) technology. In one implementation, the Projector light path shown in FIG. 9 is a digital micromirror device, and the LED may also be replaced by a digital micromirror device.
FIG. 10 is a schematic structural diagram of a projection device according to an embodiment of the present disclosure. As shown in FIG. 10, it includes a projection frame 100 and a projection lens 102 connected to the projection frame 100. The projection frame 100 includes:
a light source 1002, located at the end of the projection frame 100 away from the projection lens;
a condenser lens 1004, located at the end of the projection frame 100 away from the projection lens and configured to condense the light from the light source 1002; and
a diaphragm 1006, located in the projection frame 100 and disposed between the light source 1002 and the projection lens 102. Specifically, the diaphragm 1006 includes upper and lower sub-diaphragms connected to the body of the projection frame 100.
The light source 1002 is disposed coaxially with the projection lens 102.
FIG. 11 is a schematic structural diagram of another projection device according to an embodiment of the present disclosure. As shown in FIG. 11, in addition to the structure of FIG. 10, among the projection optical system, the auxiliary optical system, and the illumination optical system, at least the projection device of the projection optical system is provided with a projection pattern mask 1102. In one embodiment, the projection device of the projection optical system is provided with a projection pattern mask 1102 while the projection devices in the auxiliary optical system and the illumination optical system are not. In yet another embodiment, the projection device in the auxiliary optical system is provided with a projection pattern mask 1102 bearing a simple pattern, such as a crosshair, which does not affect texture tracking and allows the conversion relationship between the illumination image and the color texture image to be obtained quickly and accurately, thereby improving the efficiency and accuracy of the texture mapping.
The projection pattern mask 1102 is located between the diaphragm 1006 and the projection lens 102 and serves as a template for the pattern to be projected.
Optionally, the projection device further includes: a heat-insulating glass 1104, located between the light source 1002 and the diaphragm 1006 and configured to prevent the heat generated by the light source 1002 from diffusing to the projection pattern and the projection lens 102, so as to extend the service life of the device.
Embodiment 6
An embodiment of the present disclosure further provides a storage medium in which a computer program is stored, wherein the computer program is configured to execute, when run, the steps in any one of the foregoing method embodiments.
Optionally, in this embodiment, the above storage medium may be configured to store a computer program for performing the following steps:
S1: synchronously acquiring a projection pattern of light of a first wavelength and an illumination image of light of a second wavelength projected onto a measured object;
S2: acquiring a color texture image projected onto the measured object;
S3: determining an image relationship between the color texture image and the illumination image;
S4: according to the image relationship, matching the color texture image to the projection pattern for three-dimensional reconstruction.
Optionally, in this embodiment, the above storage medium may include but is not limited to various media that can store computer programs, such as a USB flash drive, a read-only memory (ROM), a random access memory (RAM), a removable hard disk, a magnetic disk, or an optical disc.
Embodiment 7
An embodiment of the present disclosure further provides an electronic apparatus, including a memory and a processor; a computer program is stored in the memory, and the processor is configured to run the computer program to execute the steps in any one of the foregoing method embodiments.
Optionally, the electronic apparatus may further include a transmission device and an input-output device, wherein the transmission device is connected to the processor and the input-output device is connected to the processor.
Optionally, in this embodiment, the processor may be configured to execute the following steps through a computer program:
S1: synchronously acquiring a projection pattern of light of a first wavelength and an illumination image of light of a second wavelength projected onto a measured object;
S2: acquiring a color texture image projected onto the measured object;
S3: determining an image relationship between the color texture image and the illumination image;
S4: according to the image relationship, matching the color texture image to the projection pattern for three-dimensional reconstruction.
Optionally, for specific examples in this embodiment, reference may be made to the examples described in the above embodiments and optional implementations; this embodiment will not repeat them here.
Obviously, a person skilled in the art should understand that the modules or steps of the present disclosure described above may be implemented by a general-purpose computing apparatus; they may be concentrated on a single computing apparatus or distributed on a network composed of multiple computing apparatuses. Optionally, they may be implemented with program code executable by a computing apparatus, so that they may be stored in a storage apparatus and executed by the computing apparatus; in some cases the steps shown or described may be performed in an order different from that given here, or they may be made into individual integrated circuit modules, or multiple modules or steps among them may be made into a single integrated circuit module. In this way, the present disclosure is not limited to any specific combination of hardware and software.
The above is only the preferred embodiments of the present disclosure and is not intended to limit it; for those skilled in the art, the present disclosure may have various modifications and changes. Any modification, equivalent replacement, improvement, and the like made within the principles of the present disclosure shall be included within its protection scope.

Claims (21)

  1. A three-dimensional scanning image acquisition method, comprising:
    synchronously acquiring a projection pattern of light of a first wavelength and an illumination image of light of a second wavelength projected onto a measured object; and
    acquiring a color texture image projected onto the measured object.
  2. The method according to claim 1, wherein synchronously acquiring the projection pattern and the illumination image projected onto the measured object comprises:
    the positions of the projection pattern and the illumination image coincide; or,
    the positional relationship between the projection pattern and the illumination image is determined according to the positional relationship between the acquisition device of the projection pattern and the acquisition device of the illumination image.
  3. The method according to claim 1 or 2, wherein the color texture image is acquired by triggering projection of white light onto the measured object.
  4. The method according to claim 1, wherein the illumination image is an image used to reflect the texture of the measured object.
  5. The method according to any one of claims 1-4, wherein the light of the first wavelength and the light of the second wavelength are lights whose wavelengths do not interfere with each other.
  6. A three-dimensional scanning image processing method, comprising:
    determining an image relationship between a color texture image and an illumination image; and
    according to the image relationship, matching the color texture image to a projection pattern to perform texture mapping of three-dimensionally reconstructed data;
    wherein the projection pattern is acquired by projecting light of a first wavelength onto a measured object, the illumination image is acquired by projecting light of a second wavelength onto the measured object synchronously with the projection pattern, and the color texture image is acquired by projection onto the measured object.
  7. The method according to claim 6, wherein the image relationship between the color texture image and the illumination image is determined in at least one of the following ways:
    performing SIFT or SURF feature-point extraction and matching on the color texture image and the illumination image, or processing the color texture image and the illumination image with an optical-flow tracking algorithm.
  8. The method according to claim 6, wherein the image relationship comprises at least one of: a conversion relationship between the color texture image and the illumination image, and a mapping relationship between the color texture image and the illumination image.
  9. The method according to claim 6, wherein the projection pattern, the illumination image, and the color texture image are acquired by:
    inputting the projected images into one or more color CCD chips; and
    receiving the images processed by the one or more color CCD chips.
  10. A three-dimensional scanning image acquisition apparatus, comprising:
    a first image acquisition module, configured to synchronously acquire a projection pattern of light of a first wavelength and an illumination image of light of a second wavelength projected onto a measured object; and
    a second image acquisition module, configured to acquire a color texture image projected onto the measured object.
  11. A three-dimensional scanning image processing apparatus, comprising:
    an image processing module, configured to determine an image relationship between the color texture image and the illumination image; and
    a reconstruction module, configured to, according to the image relationship, match the color texture image to the projection pattern to perform texture mapping of three-dimensionally reconstructed data;
    wherein the projection pattern is acquired by projecting light of a first wavelength onto a measured object, the illumination image is acquired by projecting light of a second wavelength onto the measured object synchronously with the projection pattern, and the color texture image is acquired by projection onto the measured object.
  12. A three-dimensional scanning device, comprising: a timing control circuit, a projection optical system, an auxiliary optical system, an illumination optical system, and an image acquisition optical system, wherein:
    the projection optical system, the auxiliary optical system, and the illumination optical system each comprise a projection device configured to project light;
    the image acquisition optical system is configured to acquire the images projected onto the measured object by the projection optical system, the auxiliary optical system, and the illumination optical system; and
    the timing control circuit is connected to the image acquisition optical system, the projection optical system, the auxiliary optical system, and the illumination optical system, and is configured to perform timing control of the image acquisition optical system, the projection optical system, the auxiliary optical system, and the illumination optical system.
  13. The three-dimensional scanning device according to claim 11, wherein the three-dimensional scanning device is configured to:
    when the timing control circuit determines that a first moment is reached, trigger the projection optical system to project a pattern of light of the first wavelength onto the measured object, trigger the auxiliary optical system to project light of the second wavelength onto the measured object, and acquire the projection pattern of the first-wavelength light and the illumination image of the second-wavelength light through the image acquisition optical system; and
    when the timing control circuit determines that a second moment is reached, trigger the illumination optical system to project onto the measured object, and acquire the color texture image through the image acquisition optical system.
  14. The three-dimensional scanning device according to claim 11, wherein the three-dimensional scanning device is connected to an image processor, or the three-dimensional scanning device comprises the image processor, wherein the image processor is configured to collect the projection pattern, the illumination image, and the color texture image; determine the image relationship between the color texture image and the illumination image; and, according to the image relationship, match the color texture image to the projection pattern to perform texture mapping of the data.
  15. [Corrected under Rule 26, 23.10.2019]
    The three-dimensional scanning device according to claim 12, wherein the projection device comprises: a projection frame and a projection lens connected to the projection frame, wherein the projection frame comprises:
    a light source disposed at the end of the projection frame away from the projection lens; and
    a diaphragm located in the projection frame and disposed between the light source and the projection lens; wherein the light source is disposed coaxially with the projection lens.
  16. The three-dimensional scanning device according to claim 15, wherein among the projection optical system, the auxiliary optical system, and the illumination optical system, at least the projection device of the projection optical system is provided with a projection pattern mask, and the projection pattern mask is located between the diaphragm and the projection lens.
  17. The three-dimensional scanning device according to claim 15, wherein the projection frame further comprises:
    a condenser lens disposed at the end of the projection frame away from the projection lens and configured to condense the light from the light source; and
    a heat-insulating glass disposed between the light source and the diaphragm and configured to prevent the heat generated by the light source from diffusing.
  18. The three-dimensional scanning device according to claim 12, wherein the auxiliary optical system and/or the illumination optical system is an LED.
  19. The three-dimensional scanning device according to any one of claims 12-18, wherein the auxiliary optical system and the illumination optical system are the same optical system.
  20. A storage medium in which a computer program is stored, wherein the computer program is configured to execute, when run, the method of any one of claims 1 to 4 and 5 to 9.
  21. An electronic apparatus, comprising a memory and a processor, wherein a computer program is stored in the memory, and the processor is configured to run the computer program to execute the method of any one of claims 1 to 4 and 5 to 9.
PCT/CN2019/100799 2018-08-21 2019-08-15 三维扫描的图像获取、处理方法、装置以及三维扫描设备 WO2020038277A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US17/265,509 US11887321B2 (en) 2018-08-21 2019-08-15 Three-dimensional scanning image acquisition and processing methods and apparatuses, and three-dimensional scanning device
EP19851007.5A EP3819872A4 (en) 2018-08-21 2019-08-15 IMAGE CAPTURING AND PROCESSING METHODS AND DEVICES FOR THREE-DIMENSIONAL SCANNING AND THREE-DIMENSIONAL SCANNING DEVICE

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201810954837.8 2018-08-21
CN201810954837.8A CN109584352B (zh) 2018-08-21 2018-08-21 三维扫描的图像获取、处理方法、装置以及三维扫描设备

Publications (1)

Publication Number Publication Date
WO2020038277A1 true WO2020038277A1 (zh) 2020-02-27

Family

ID=65919687

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2019/100799 WO2020038277A1 (zh) 2018-08-21 2019-08-15 三维扫描的图像获取、处理方法、装置以及三维扫描设备

Country Status (4)

Country Link
US (1) US11887321B2 (zh)
EP (1) EP3819872A4 (zh)
CN (1) CN109584352B (zh)
WO (1) WO2020038277A1 (zh)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114627249A (zh) * 2022-05-13 2022-06-14 杭州思看科技有限公司 三维扫描系统及三维扫描方法

Families Citing this family (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109584352B (zh) 2018-08-21 2021-01-12 先临三维科技股份有限公司 三维扫描的图像获取、处理方法、装置以及三维扫描设备
CN112712583A (zh) * 2019-10-24 2021-04-27 先临三维科技股份有限公司 三维扫描仪、三维扫描系统和三维扫描方法
CN111578867B (zh) * 2020-04-10 2021-08-24 清华大学深圳国际研究生院 基于多次错位压缩全息重构的三维成像方法及系统
CN114078103A (zh) * 2020-08-21 2022-02-22 先临三维科技股份有限公司 重建数据的方法及系统,扫描设备
CN112040091B (zh) * 2020-09-01 2023-07-21 先临三维科技股份有限公司 相机增益的调整方法和装置、扫描系统
CN112884898B (zh) * 2021-03-17 2022-06-07 杭州思看科技有限公司 用于测量纹理映射精度的参考装置
CN114445388A (zh) * 2022-01-28 2022-05-06 北京奇禹科技有限公司 一种基于多光谱的图像识别方法、装置及存储介质
CN114831756A (zh) * 2022-05-02 2022-08-02 先临三维科技股份有限公司 一种口内扫描处理方法、系统、电子设备及介质
CN115131507B (zh) * 2022-07-27 2023-06-16 北京百度网讯科技有限公司 图像处理方法、图像处理设备和元宇宙三维重建方法
CN116418967B (zh) * 2023-04-13 2023-10-13 青岛图海纬度科技有限公司 水下动态环境激光扫描的色彩还原方法和设备
CN117496092B (zh) * 2023-12-29 2024-04-19 先临三维科技股份有限公司 三维扫描重建方法、装置、设备及存储介质

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150256813A1 (en) * 2014-03-07 2015-09-10 Aquifi, Inc. System and method for 3d reconstruction using multiple multi-channel cameras
CN106934394A (zh) * 2017-03-09 2017-07-07 深圳奥比中光科技有限公司 双波长图像采集系统及方法
CN207198919U (zh) * 2017-06-05 2018-04-06 景致三维(江苏)股份有限公司 相机阵列三维重建装置及系统
CN207369209U (zh) * 2017-04-26 2018-05-15 成都通甲优博科技有限责任公司 一种基于阵列摄像组件的三维立体摄像装置
CN108053435A (zh) * 2017-11-29 2018-05-18 深圳奥比中光科技有限公司 基于手持移动设备的动态实时三维重建方法和系统
CN109584352A (zh) * 2018-08-21 2019-04-05 先临三维科技股份有限公司 三维扫描的图像获取、处理方法、装置以及三维扫描设备

Family Cites Families (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB8724527D0 (en) * 1987-10-20 1987-11-25 Cruickshank J S Projection apparatus
US20100328308A1 (en) * 2008-07-10 2010-12-30 C-True Ltd. Three Dimensional Mesh Modeling
CN103196393A (zh) * 2013-03-14 2013-07-10 南京楚通自动化科技有限公司 一种几何形状及表面色彩实时成像装置
WO2016123618A1 (en) * 2015-01-30 2016-08-04 Adcole Corporation Optical three dimensional scanners and methods of use thereof
CN106289092B (zh) * 2015-05-15 2020-10-27 高准国际科技有限公司 光学装置及其发光装置
CN105333838B (zh) * 2015-12-15 2018-07-17 宁波频泰光电科技有限公司 一种彩色3d测量系统
DE102016002398B4 (de) * 2016-02-26 2019-04-25 Gerd Häusler Optischer 3D-Sensor zur schnellen und dichten Formerfassung
US11529056B2 (en) * 2016-10-18 2022-12-20 Dentlytec G.P.L. Ltd. Crosstalk reduction for intra-oral scanning using patterned light
CN107105217B (zh) 2017-04-17 2018-11-30 深圳奥比中光科技有限公司 多模式深度计算处理器以及3d图像设备
WO2019002616A1 (en) * 2017-06-30 2019-01-03 Carestream Dental Technology Topco Limited SURFACE CARTOGRAPHY USING AN INTRA-MOBILE SCANNER HAVING PENETRATION CAPABILITIES
CN107202554B (zh) * 2017-07-06 2018-07-06 杭州思看科技有限公司 同时具备摄影测量和三维扫描功能的手持式大尺度三维测量扫描仪系统
CN107633165B (zh) * 2017-10-26 2021-11-19 奥比中光科技集团股份有限公司 3d人脸身份认证方法与装置

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150256813A1 (en) * 2014-03-07 2015-09-10 Aquifi, Inc. System and method for 3d reconstruction using multiple multi-channel cameras
CN106934394A (zh) * 2017-03-09 2017-07-07 深圳奥比中光科技有限公司 双波长图像采集系统及方法
CN207369209U (zh) * 2017-04-26 2018-05-15 成都通甲优博科技有限责任公司 一种基于阵列摄像组件的三维立体摄像装置
CN207198919U (zh) * 2017-06-05 2018-04-06 景致三维(江苏)股份有限公司 相机阵列三维重建装置及系统
CN108053435A (zh) * 2017-11-29 2018-05-18 深圳奥比中光科技有限公司 基于手持移动设备的动态实时三维重建方法和系统
CN109584352A (zh) * 2018-08-21 2019-04-05 先临三维科技股份有限公司 三维扫描的图像获取、处理方法、装置以及三维扫描设备

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP3819872A4 *

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114627249A (zh) * 2022-05-13 2022-06-14 杭州思看科技有限公司 三维扫描系统及三维扫描方法

Also Published As

Publication number Publication date
CN109584352A (zh) 2019-04-05
EP3819872A4 (en) 2021-08-25
US11887321B2 (en) 2024-01-30
US20210295545A1 (en) 2021-09-23
CN109584352B (zh) 2021-01-12
EP3819872A1 (en) 2021-05-12

Similar Documents

Publication Publication Date Title
WO2020038277A1 (zh) 三维扫描的图像获取、处理方法、装置以及三维扫描设备
CN112985307B (zh) 一种三维扫描仪、系统及三维重建方法
WO2019085392A1 (zh) 牙齿三维数据重建方法、装置和系统
JP3962588B2 (ja) 三次元画像処理方法、三次元画像処理装置、三次元画像処理システムおよび三次元画像処理プログラム
CN109155843A (zh) 图像投影系统和图像投影方法
CN110111262A (zh) 一种投影仪畸变校正方法、装置和投影仪
JP4043258B2 (ja) 3次元画像撮影装置
JP2011530911A (ja) 欠陥目を高精度で検知するインカメラに基づいた方法
CN107370951B (zh) 图像处理系统及方法
JP2014115109A (ja) 距離計測装置及び方法
CN112929617B (zh) 投影仪的控制方法、投影仪和显示系统
CN106408664A (zh) 一种基于三维扫描装置的三维模型曲面重建方法
US20220124247A1 (en) Panoramic photographing method and device, camera and mobile terminal
JP2003202216A (ja) 三次元画像処理方法、三次元画像処理装置、三次元画像処理システムおよび三次元画像処理プログラム
CN102944928B (zh) 一种三维内窥镜及其三维重建方法
WO2023213311A1 (zh) 胶囊内窥镜、摄像系统的测距方法和装置
JP2004289613A (ja) 3次元撮像装置、投光ユニット及び3次元再構成システム、並びに、3次元撮像方法及び投光ユニットの投光方法
CN112146565B (zh) 扫描仪和三维扫描系统
CN108267098B (zh) 三维扫描方法、装置、系统、存储介质和处理器
CN112565733A (zh) 基于多相机同步拍摄的三维成像方法、装置及拍摄系统
CN110545365B (zh) 与附近的其它行动装置协同闪光的行动装置及其方法
TW448340B (en) Single-lens instantaneous three-dimensional image taking apparatus
WO2023051323A1 (zh) 扫描仪和扫描方法
CN116206069A (zh) 三维扫描中的图像数据处理方法、装置和三维扫描仪
CN114264253A (zh) 高温物体三维轮廓非接触测量装置及其测量方法

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19851007

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE