WO2020047692A1 - 3-d intraoral scanner using light field imaging - Google Patents

3-D intraoral scanner using light field imaging

Info

Publication number
WO2020047692A1
Authority
WO
WIPO (PCT)
Prior art keywords
sensor
image
light
illumination
array
Application number
PCT/CN2018/103741
Other languages
French (fr)
Inventor
Yu Zhou
Guijian WANG
Qinran Chen
Dawei Sun
Longxiang HUANG
Yunian WU
Original Assignee
Carestream Dental Technology Shanghai Co., Ltd.
Application filed by Carestream Dental Technology Shanghai Co., Ltd.
Priority to PCT/CN2018/103741
Publication of WO2020047692A1

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61C DENTISTRY; APPARATUS OR METHODS FOR ORAL OR DENTAL HYGIENE
    • A61C 9/00 Impression cups, i.e. impression trays; Impression methods
    • A61C 9/004 Means or methods for taking digitized impressions
    • A61C 9/0046 Data acquisition means or methods
    • A61C 9/0053 Optical means or methods, e.g. scanning the teeth by a laser or light beam
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 1/00 Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B 1/00002 Operational features of endoscopes
    • A61B 1/00004 Operational features of endoscopes characterised by electronic signal processing
    • A61B 1/00009 Operational features of endoscopes characterised by electronic signal processing of image signals during a use of endoscope
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 1/00 Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B 1/00163 Optical arrangements
    • A61B 1/00193 Optical arrangements adapted for stereoscopic vision
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 1/00 Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B 1/00163 Optical arrangements
    • A61B 1/00194 Optical arrangements adapted for three-dimensional imaging
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 1/00 Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B 1/24 Instruments for performing medical examinations of the interior of cavities or tubes of the body for the mouth, i.e. stomatoscopes, e.g. with tongue depressors; Instruments for opening or keeping open the mouth
    • A61B 1/247 Instruments for the mouth with means for viewing areas outside the direct line of sight, e.g. dentists' mirrors
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61C DENTISTRY; APPARATUS OR METHODS FOR ORAL OR DENTAL HYGIENE
    • A61C 9/00 Impression cups, i.e. impression trays; Impression methods
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B 27/00 Optical systems or apparatus not provided for by any of the groups G02B 1/00 - G02B 26/00, G02B 30/00
    • G02B 27/0075 Optical systems or apparatus with means for altering, e.g. increasing, the depth of field or depth of focus
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B 27/00 Optical systems or apparatus not provided for by any of the groups G02B 1/00 - G02B 26/00, G02B 30/00
    • G02B 27/28 Optical systems or apparatus for polarising
    • G02B 27/283 Optical systems or apparatus for polarising used for beam splitting or combining
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/80 Camera processing pipelines; Components thereof
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/95 Computational photography systems, e.g. light-field imaging systems
    • H04N 23/957 Light-field or plenoptic cameras or camera modules
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/60 Control of cameras or camera modules
    • H04N 23/67 Focus control based on electronic image sensor signals

Abstract

A handheld optical apparatus for imaging a sample, the apparatus having an illumination source for directing illumination to the sample and camera optics disposed to convey reflected light from the sample to a sensor having an array of pixels. A microlens array is disposed in front of the sensor and configured to convey the reflected light to the sensor, wherein each microlens in the microlens array directs light to pixels in the array of pixels. A control logic processor in signal communication with the sensor is programmed to store sensor data and to render a processed image from the sensor to a display.

Description

3-D INTRAORAL SCANNER USING LIGHT FIELD IMAGING

FIELD OF THE INVENTION
The disclosure relates generally to apparatus for intraoral imaging and more particularly to handheld apparatus for intraoral scanning using light field imaging.
BACKGROUND OF THE INVENTION
Intraoral imaging has proven to have significant value as a tool with a diverse range of applications, including use as a diagnostic aid, use in color shade matching, and use for treatment and restoration planning, among other applications. In order to meet the challenges of the intraoral imaging environment, conventional intraoral imaging techniques have adapted traditional photography methods for acquiring detailed information on the surface structure and appearance of patient dentition. Among approaches that have been developed and commercialized with some success for intraoral imaging are structured light imaging using triangulation, active and passive stereo imaging, and confocal imaging.
There are significant limitations to the various techniques and approaches that have been applied to the problems of intraoral imaging. Constraints on camera and scanner size and form factor and the confined space requirements of the intraoral imaging environment make it challenging to accurately characterize intraoral surfaces. It can be difficult to focus with accuracy on individual surface features, to provide image content of broad areas of patient dentition at suitable resolution and focus, and to provide sufficient illumination for diagnostic purposes.
One notable difficulty relates to the challenges in achieving accurate depth measurement using captured images. Confocal imaging, using movable components in order to adjust the image plane, may allow some depth metrics, but adds cost and complexity to the intraoral imaging camera.
Thus, it can be seen that there is a need for improved optics that address the limitations of conventional scanning cameras for intraoral 3-D imaging.
SUMMARY OF THE INVENTION
It is an object of the present disclosure to advance the art of diagnostic imaging and to address the need for improved 3-D intraoral imaging and scanning.
These objects are given only by way of illustrative example, and such objects may be exemplary of one or more embodiments of the invention. Other desirable objectives and advantages inherently achieved by the disclosed methods may occur or become apparent to those skilled in the art. The invention is defined by the appended claims.
According to an aspect of the present disclosure, there is provided a handheld optical apparatus for imaging a sample, the apparatus comprising:
a) an illumination source for directing illumination to the sample;
b) camera optics disposed to convey reflected light from the sample to a sensor having an array of pixels;
c) a microlens array disposed in front of the sensor and configured to convey the reflected light to the sensor,
wherein each microlens in the microlens array directs light to a corresponding plurality of pixels in the array of pixels;
and
d) a control logic processor in signal communication with the sensor and programmed to store sensor data and to render a processed image from the sensor to a display.
BRIEF DESCRIPTION OF THE DRAWINGS
The foregoing and other objects, features, and advantages of the invention will be apparent from the following more particular description of the embodiments of the invention, as illustrated in the accompanying drawings.
The elements of the drawings are not necessarily to scale relative to each other. Some exaggeration may be necessary in order to emphasize basic structural relationships or principles of operation. Some conventional components that would be needed for implementation of the described embodiments, such as support components used for providing power, for packaging, and for mounting and protecting system optics, for example, are not shown in the drawings in order to simplify description.
FIG. 1A is a schematic diagram showing aspects of conventional camera lens operation for light handling.
FIG. 1B is a schematic diagram showing aspects of conventional camera lens operation as viewed with respect to a single 2-D plane.
FIG. 2 shows a ray space diagram for a conventional camera lens.
FIG. 3 shows a portion of the simplified ray-space diagram D for a plenoptic camera that uses light field imaging.
FIG. 4 shows a schematic diagram of a plenoptic intraoral camera that is used for light-field imaging according to an embodiment of the present disclosure.
FIGs. 5A and 5B are schematic diagrams that show how the image information allowing different focus is obtained.
FIG. 6 is a schematic diagram that shows what the plenoptic camera achieves from a single image capture, using the principles outlined in FIGs. 5A and 5B.
FIG. 7 is a logic flow diagram that shows a sequence for determining depth distance for an intraoral feature using plenoptic imaging.
DETAILED DESCRIPTION OF THE INVENTION
The following is a detailed description of the preferred embodiments, reference being made to the drawings in which the same reference numerals identify the same elements of structure in each of the several figures.
Where they are used in the context of the present disclosure, the terms “first”, “second”, and so on, do not necessarily denote any ordinal, sequential, or priority relation, but are simply used to more clearly distinguish one step, element, or set of elements from another, unless specified otherwise.
As used herein, the term “energizable” relates to a device or set of components that perform an indicated function upon receiving power and, optionally, upon receiving an enabling signal.
In the context of the present disclosure, the term "optics" is used generally to refer to lenses and other refractive, diffractive, and reflective components or apertures used for shaping and orienting a light beam. An individual component of this type is termed an optic.
In the context of the present disclosure, the term "scattered light" is used generally to include light that is reflected and backscattered from an object.
In the context of the present disclosure, the terms “viewer”, “operator”, and “user” are considered to be equivalent and refer to the viewing practitioner, technician, or other person who may operate a camera or scanner and may also view and manipulate an image, such as a dental image, on a display monitor. An “operator instruction” or “viewer instruction” is obtained from explicit commands entered by the viewer, such as by clicking a button on the camera or scanner or by using a computer mouse or by touch screen or keyboard entry.
In the context of the present disclosure, the phrase “in signal communication” indicates that two or more devices and/or components are capable of communicating with each other via signals that travel over some type of signal path. Signal communication may be wired or wireless. The signals may be communication, power, data, or energy signals. The signal paths may include physical, electrical, magnetic, electromagnetic, optical, wired, and/or wireless connections between the first device and/or component and second device and/or component. The signal paths may also include additional devices and/or components between the first device and/or component and second device and/or component.
In the context of the present disclosure, the term "camera" relates to a device that is enabled to acquire a reflectance, 2-D digital image from reflected visible or NIR light, such as structured light that is reflected from the surface of teeth and supporting structures.
The general term "scanner" relates to an optical system that projects a scanned light beam to the tooth surface and provides image content based on the resulting scattered or reflected light content.
The term “set”, as used herein, refers to a non-empty set, as the concept of a collection of elements or members of a set is widely understood in elementary mathematics. The terms “subset” or “partial subset”, unless otherwise explicitly stated, are used herein to refer to a non-empty subset, that is, to a subset of the larger set having one or more members. For a set S, a subset may comprise the complete set S. A “proper subset” of set S, however, is strictly contained in set S and excludes at least one member of set S. A “partition of a set” is a grouping of the set's elements into non-empty subsets so that every element is included in one and only one of the subsets. Two sets are “disjoint” when they have no element in common.
The term "subject" refers to the tooth or other portion of a patient that is being imaged and, in optical terms, can be considered equivalent to the "object" of the corresponding imaging system.
In the context of the present disclosure, a "light field" refers to the 4D function that defines the amount of light (e.g., radiance) traveling along each ray within a defined spatial region. In the embodiments discussed herein, the region of space is the interior of the recording camera or other optical device. In the imaging system, primary interest is directed to the rays of light flowing onto an imaging plane. The imaging plane corresponds to the focal plane defined by the photosensor array in a conventional digital camera. With respect to this imaging plane, "spatial resolution" refers to the sampling density within the 2D imaging plane itself and "directional resolution" refers to the sampling density in the 2D angular domain of rays incident on the imaging plane.
One shortcoming of conventional reflectance imaging relates to how light from an object is recorded on the image sensor array or film at the imaging plane. The sensor pixel, or the corresponding localized point on the photosensitive film at the image plane, provides accurate information on the total amount of light energy that is received at that point (pixel), but does not convey any information on the overall angular or directional distribution of the light received. The camera offers spatial resolution, but not directional resolution. That is, there is no information that characterizes the geometry of the light received at numerous angles from the different points of the camera lens. As a result, the acquired 2-D image shows only the spatially-resolved information about incoming light that can be obtained with a fixed focus and a defined depth of field. That is, the acquired image content collects 2-D information for each pixel (x, y) at the image plane, expressed as:
L(x, y)
Light field imaging overview
Light field imaging is a method that adapts the conventional camera with additional modifications, so that the modified camera can acquire the added dimensions of directional information that tells the geometric distribution of the light and provides directional resolution for the recorded image content. Also termed 4-D imaging, light field imaging characterizes the light field and light ray paths through the plane of the camera lens, with coordinates (u, v), and to the image plane of the camera, assigned coordinates (x, y). By modeling the ray-space that represents the light field inside the camera or scanner, light field imaging collects 4-D information on each light ray that leads from the lens to the image plane, as represented in the following expression:
L(x, y, u, v)
The distinction between conventional 2-D camera imaging and 4-D light field imaging is typically represented using a ray-space diagram that models the relationship of the lens plane (u, v) with the image plane (x, y). In order to understand what the ray-space diagram shows, it is useful to consider the paths of light entering the camera lens. As shown in the schematic diagram of FIG. 1A, the conventional camera lens 20 geometry focuses light from the object through the lens plane Q to image plane L. In order to use the ray-space representation, the 3-D spatial pattern of light on the lens is simplified to a 2-D pattern as shown in FIG. 1B. For ray-space mapping, rays are considered along a single plane K through the lens, with plane K extending along the u-axis direction and with corresponding rays extending in the x-axis direction at image plane I.
A ray-space diagram D for a conventional camera lens is shown in the simplified schematic diagram of FIG. 2. Here, light from a single point P1 in the object is focused at a single pixel P1’ on the sensor array detector or film. As the ray-space diagram shows, there is no information recorded that tells at what u-axis angle the light is incident on the pixel; only the accumulated amount of light from all angles is recorded for the pixel P1’ at the image plane L for a sensor array 10 at a position x.
By contrast, FIG. 3 shows a portion of the simplified ray-space diagram D for a plenoptic camera that uses light field imaging. Plenoptic or light field imaging records information on light direction within an imaging apparatus. A microlens array A, placed in front of the sensor, redirects light to the image sensor array 10 at plane L. For image quality at the sensor plane L, it is preferable to have the sensor in the focal plane of the microlens. Each column in the ray-space diagram represents all of the light that is conveyed through a single microlens M. Each “cell” shown within the column represents light that goes to a single pixel.
Compared against the mapping of FIG. 2, it can be appreciated that the x pixel resolution, shown in the direction of the horizontal axis, is reduced. This corresponds to some reduction in spatial resolution. However, the light received at sensor array 10 now has added information about the angle at which light traveled through the lens 20, providing directional resolution relative to the u axis for light distributed from the microlens. The result, recorded at representative pixels P1’, P2’, P3’, P4’, P5’, and P6’, allows some directional or angular resolution of the light with respect to the lens plane (u, v). Using appropriate pixels from each microlens M, a number of different focal planes can be obtained for the imaged light, as described in more detail subsequently.
In this way, 4D light field imaging stores angular or directional data, thus recording considerably more information about the imaging geometry than is stored by the conventional 2D camera, but at the expense of some pixel resolution. However, with finer resolution available from tiny microlenses, the slight sacrifice in spatial resolution can be more than compensated by the ability to identify the source direction of light within the light field and to manipulate the directional resolution data. These advantages include the ability to refocus a single acquired image along a number of focal planes, with corresponding advantages for depth resolution.
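As an illustration of how the 4D data is laid out, the sketch below (hypothetical Python/NumPy, not part of the disclosure) decodes a raw plenoptic sensor image into the light field L(x, y, u, v), assuming an idealized sensor in which each microlens covers a perfectly aligned s × s block of pixels:

```python
import numpy as np

def decode_light_field(raw: np.ndarray, s: int) -> np.ndarray:
    """Rearrange a raw plenoptic sensor image into a 4-D light field.

    raw : 2-D sensor image of shape (X*s, Y*s), where each aligned s x s
          block of pixels sits behind one microlens (an idealization).
    s   : pixels per microlens along each axis, i.e. the number of
          directional (u, v) samples.

    Returns L[x, y, u, v]: the spatial index (x, y) selects a microlens,
    and the directional index (u, v) selects the pixel behind it.
    """
    X, Y = raw.shape[0] // s, raw.shape[1] // s
    return raw[:X * s, :Y * s].reshape(X, s, Y, s).transpose(0, 2, 1, 3)

# Example: a sensor with a 10 x 10 pixel patch behind each microlens.
# L = decode_light_field(raw_image, s=10)   # L.shape == (X, Y, 10, 10)
```

A practical decoder would also handle microlens grid rotation, hexagonal packing, and vignetting calibration, all of which this idealized sketch omits.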
Plenoptic intraoral camera for 4D light field imaging
FIG. 4 shows a schematic diagram of a plenoptic intraoral camera 100 that is used for light-field imaging according to an embodiment of the present disclosure. Illumination directed toward the subject is generated by a light source 12 that directs light to a collimating lens 16 and optionally through a pattern mask 22, which can be provided for imparting a pattern to the illumination, such as using a spatial light modulator or other suitable device. The light source 12 can be a solid-state light source such as a light-emitting diode (LED), for example. The illumination can provide a uniform flat field of light or can provide patterned light, which is advantageous for improving the capability to sense focus. The spatial light modulator can be, for example, a liquid crystal device (LCD) or a digital micromirror array, such as a Digital Light Processor (DLP) from Texas Instruments, Inc., Dallas, TX.
The illumination is transmitted through a beam splitter BS, which can be a polarizing beam splitter PBS, and through a main lens 30 and quarter-wave plate (QWP) 32 or other polarization component. The polarized light is then reflected from a mirror or other reflective surface 36 onto the tooth 38 surface. Reflective surface 36 can be a coated surface, such as a dichroic mirror, for example.
Reflected light returned from the tooth initially tracks the illumination path, in the reverse direction. This returned light reflects from reflective surface 36 and is directed back through QWP 32 and through lens 30 to beam splitter BS. This returned light is reflected through microlens array A to sensor 10. Sensor 10 records the image data it receives through microlens array A. As described previously, the stored information includes data on light direction, useful in providing image content over a series of focal distances, effectively providing a “focal stack” for the image content.
The use of a QWP in the optical path allows the PBS to combine the illumination and image-bearing light onto a single path. Each transit through the QWP retards the beam by a quarter wave, so that after the round trip the light returning from the tooth 38 has its polarization state orthogonal to that of the illumination light from source 12. The PBS transmits light of one polarization state and reflects light of the orthogonal polarization state, directing the imaged light to sensor 10 for recording or display.
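The polarization round trip can be checked numerically with Jones calculus. The following sketch is illustrative only: it assumes an ideal, lossless QWP with its fast axis at 45 degrees and ignores the coordinate flip at the mirror. It confirms that two transits through the QWP convert horizontally polarized illumination into a vertically polarized return beam, which the PBS then routes toward the sensor rather than back toward the source.

```python
import numpy as np

def qwp(theta: float) -> np.ndarray:
    """Jones matrix of an ideal quarter-wave plate, fast axis at angle theta."""
    c, s = np.cos(theta), np.sin(theta)
    R = np.array([[c, -s], [s, c]])
    W = np.array([[1, 0], [0, 1j]])   # quarter-wave retardation on slow axis
    return R @ W @ R.T

H = np.array([1, 0])        # horizontally polarized illumination from the PBS
Q = qwp(np.pi / 4)          # QWP oriented at 45 degrees
back = Q @ (Q @ H)          # one transit toward the tooth, one transit back
print(np.round(np.abs(back), 6))   # [0. 1.]: returned light is vertical
```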
A control logic processor, such as a microprocessor, a dedicated processor, or other computer 40, is in signal communication with the sensor 10 for acquiring and storing image data content and for processing the stored image content for rendering an image on a display 42.
It should be noted that FIG. 4 shows one architecture that can be used for intraoral plenoptic imaging. Other designs for obtaining similar results can be used. The intraoral camera can be a 2D camera or a 3D camera that reconstructs 3D depth information from a focal stack. The 3D camera can acquire 3D images directly, such as acquiring a 3D mesh or point cloud image.
Sensor array 10 can be a CCD (Charge-Coupled Device) or CMOS (Complementary Metal-Oxide Semiconductor) sensor array, for example.
Focal stack generation and manipulation
The focal stack (a stack of images at different focal planes) can be manipulated in order to form an image in which features at different focal distances appear in focus at the same time, or an image at a selectable focal distance, either one generated from the image captured by the light field camera in one single exposure.
The image with each pixel (x’, y’) at an arbitrary focal plane can be modeled as follows:

E_{α·F}(x’, y’) = (1 / (α²F²)) ∬ L_F(u, v, u(1 − 1/α) + x’/α, v(1 − 1/α) + y’/α) du dv     (1)

wherein F is the lens-sensor distance, the value α is the depth of the virtual image plane relative to F, L_F(u, v, x, y) is the recorded light field (so that L_F with (u, v) held fixed is the sub-aperture image at the coordinate (u, v)), and E_{α·F} is the intensity image at the plane α·F.
FIGs. 5A and 5B show how the image information allowing different focus settings is obtained. With respect to FIG. 5A, the parameterization of the light field L_F(u, v, x, y) indicates the intensity of the ray formed by the coordinate (u, v) on the lens plane and (x, y) on the sensor plane.
FIG. 5B represents a sub-aperture image. The sub-aperture image can be considered as the contribution of a small aperture (u, v) on the main lens plane to the recorded image content. It is constructed by identifying the pixel behind each microlens M of array A which corresponds to the aperture (u, v) and reorganizing all of the identified pixels into a single image. Thus, the sub-aperture image can be characterized as a re-combination, or down-sampled version, of the raw image pixels stored by sensor 10. Dilation and shift processing are performed to form a re-focused image; the shift aligns image features from the different sub-aperture images. The digital re-focused image is then a dilated, shifted summation of the recorded sub-aperture images.
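Under the hypothetical L[x, y, u, v] array layout of the earlier decoding sketch, constructing a sub-aperture image reduces to a single slice of the 4-D array:

```python
import numpy as np

def sub_aperture_image(L: np.ndarray, u: int, v: int) -> np.ndarray:
    """Gather, from behind every microlens, the one pixel corresponding to
    aperture coordinate (u, v), forming a down-sampled view of the scene as
    seen through that small region of the main lens."""
    return L[:, :, u, v]   # shape (X, Y): one image per aperture coordinate
```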
Considering the equation (1) given previously: the sub-aperture image is denoted L_{u,v}(x, y) = L_F(u, v, x, y), taken with (u, v) held fixed; its dilated version is L_{u,v}(x’/α, y’/α); and the shift value is then given as u(1 − 1/α) along x’ (correspondingly, v(1 − 1/α) along y’).
Thus, equation (1) given above provides a summation of dilated and shifted sub-aperture images of the object.
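A minimal shift-and-add sketch of this summation is given below. It is a discrete approximation of equation (1) under the same hypothetical array layout as the earlier sketches, not the patent's implementation: each sub-aperture image is shifted by u(1 − 1/α), with aperture coordinates centered on the optical axis, and accumulated; the remaining uniform dilation by 1/α, which is a global rescaling of the output, is omitted. Sub-pixel shifts use scipy.ndimage.shift.

```python
import numpy as np
from scipy.ndimage import shift as nd_shift

def refocus(L: np.ndarray, alpha: float) -> np.ndarray:
    """Shift-and-add refocusing: a discrete sketch of equation (1)."""
    X, Y, S, _ = L.shape
    out = np.zeros((X, Y))
    for u in range(S):
        for v in range(S):
            # Shift value u(1 - 1/alpha), aperture coordinates centered;
            # sub-pixel shifts use linear interpolation.
            du = (u - (S - 1) / 2) * (1.0 - 1.0 / alpha)
            dv = (v - (S - 1) / 2) * (1.0 - 1.0 / alpha)
            out += nd_shift(L[:, :, u, v].astype(float), (du, dv),
                            order=1, mode="nearest")
    return out / (S * S)   # average over all sub-aperture images

# A focal stack is then simply a sweep over alpha values, e.g.:
# stack = np.stack([refocus(L, a) for a in np.linspace(0.8, 1.2, 21)])
```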
The schematic diagram of FIG. 6 shows, highly exaggerated as to scale, what the plenoptic camera 100 achieves from a single image capture, using the principles outlined in FIGs. 5A and 5B. The plenoptic imager provides image content capable of manipulation to effectively form a focal image stack S having multiple image planes L1, L2, L3 corresponding to multiple object planes O1, O2, O3. The different image planes effectively correspond to different sensor to lens distances. Advantages provided by this arrangement include:
(i) The ability to form a focused image of an object at a given depth by appropriate selection of a set of pixels from pixels associated with each microlens M. As shown in FIG. 6, this means that the image content for multiple focal planes is available from the single image capture.
(ii) The ability to form an image that has a sizable depth of field, so that features at different focal distances all appear to be in focus in a single refocused image.
(iii) Relaxation of the need for the operator to obtain precise focus when scanning intraoral features. Because data from a single image capture can be manipulated to provide a focused image from various distances within a range, the operator can scan more quickly, with less attention needed for maintaining an exact focal distance.
(iv) Depth data can be readily computed from the image content for a single image. Different surfaces can be detected as in focus from different distances, within a range. Feature depth can be estimated via ray tracing or using the Gaussian lens formula, for example.
A number of methods, well known to those skilled in the imaging arts, are available for determining focus distance from the image content. This includes both active and passive methods.
The logic flow diagram of FIG. 7 shows a sequence for determining depth distance from the object to the lens using a single image capture from the plenoptic intraoral camera. Refocusing by selection of the appropriate set of pixel data can digitally model the images captured at different image planes, without the need for adjusting the position of the image sensor. A light field that includes the feature of interest is acquired in an acquisition step S710. A focal stack generation step S720 generates a focal stack that corresponds to images obtained at different image distances; this corresponds to different sensor to lens distances, as described with reference to FIG. 6.
For each pixel of the acquired image, the actual focus plane is then determined. The pixel focus occurs at the point of highest contrast. Using the FIG. 7 procedure, a pixel contrast computation step S730 computes contrast for the pixel in each image of the focal stack. A focal plane determination step S740 then identifies the corresponding focal plane for the pixel according to the peak contrast value. Interpolation can be used to obtain suitable focal distance values, since the focal stack contains images only at a discrete number of focal positions. A distance computation step S750 computes the object to camera distance from the identified focal distance corresponding to the pixel. A display step S760 then displays the computed distance as a depth measurement for the feature or features of interest. The display can present a numerical value or can be color-coded or may have high contrast or other image treatment that is indicative of a distance value. The computed depth measurement can be stored. Depth can be indicated using color, shading, or some other display feature.
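Steps S720 through S750 can be sketched in code as follows. This is one illustrative reading of FIG. 7, with several choices the disclosure leaves open: local contrast is approximated by a smoothed Laplacian magnitude, the contrast peak is refined by parabolic interpolation over uniformly spaced α values, and the object distance is recovered from the image distance α·F with the Gaussian lens formula 1/f = 1/d_o + 1/d_i.

```python
import numpy as np
from scipy.ndimage import laplace, uniform_filter

def depth_from_focal_stack(stack, alphas, F, f):
    """Depth from focus over a focal stack (sketch of steps S720-S750).

    stack  : array (N, X, Y) of refocused images, one per value in alphas
             (e.g. built with the refocus() sketch above).
    alphas : N uniformly spaced, increasing alpha values.
    F      : lens-to-sensor distance; f : focal length of the main lens.
    Returns the per-pixel object-to-lens distance (units of F and f).
    """
    alphas = np.asarray(alphas, dtype=float)
    # S730: local contrast for each pixel in every image of the stack.
    contrast = np.array([uniform_filter(np.abs(laplace(im)), size=5)
                         for im in stack])
    # S740: focal plane of peak contrast, refined by fitting a parabola
    # through the peak and its two neighbors (sub-plane interpolation).
    k = np.clip(np.argmax(contrast, axis=0), 1, len(alphas) - 2)
    c0 = np.take_along_axis(contrast, (k - 1)[None], 0)[0]
    c1 = np.take_along_axis(contrast, k[None], 0)[0]
    c2 = np.take_along_axis(contrast, (k + 1)[None], 0)[0]
    denom = c0 - 2.0 * c1 + c2
    frac = np.where(np.abs(denom) > 1e-12, 0.5 * (c0 - c2) / denom, 0.0)
    alpha = alphas[k] + frac * (alphas[1] - alphas[0])
    # S750: image distance alpha*F -> object distance via 1/f = 1/do + 1/di.
    d_i = alpha * F
    return 1.0 / (1.0 / f - 1.0 / d_i)
```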
It can be appreciated that the use of patterned illumination can help to make it more straightforward to determine best pixel focus. For flat field illumination, tooth features can be used for determining best focus.
Consistent with an embodiment of the present invention, a computer program utilizes stored instructions that operate on image data accessed from an electronic memory. As can be appreciated by those skilled in the image processing arts, a computer program for operating the imaging system in an embodiment of the present disclosure can be utilized by a suitable, general-purpose computer system operating as a CPU as described herein, such as a personal computer or workstation. However, many other types of computer systems can be used to execute the computer program of the present invention, including an arrangement of networked processors, for example. The computer program for performing the method of the present invention may be stored in a computer readable storage medium. This medium may comprise, for example: magnetic storage media such as a magnetic disk (such as a hard drive or removable device) or magnetic tape; optical storage media such as an optical disc, optical tape, or machine readable optical encoding; solid state electronic storage devices such as random access memory (RAM) or read only memory (ROM); or any other physical device or medium employed to store a computer program. The computer program for performing the method of the present disclosure may also be stored on a computer readable storage medium that is connected to the image processor by way of the internet or other network or communication medium. Those skilled in the art will further readily recognize that the equivalent of such a computer program product may also be constructed in hardware.
It should be noted that the term “memory”, equivalent to “computer-accessible memory” in the context of the present disclosure, can refer to any type of temporary or more enduring data storage workspace used for storing and operating upon image data and accessible to a computer system, including a database, for example. The memory could be non-volatile, using, for example, a long-term storage medium such as magnetic or optical storage. Alternately, the memory could be of a more volatile nature, using an electronic circuit, such as random-access memory (RAM) that is used as a temporary buffer or workspace by a microprocessor or other control logic processor device. Display data, for example, is typically stored in a temporary storage buffer that is directly associated with a display device and is periodically refreshed as needed in order to provide displayed data. This temporary storage buffer is also considered to be a type of memory, as the term is used in the present disclosure. Memory is also used as the data workspace for executing and storing intermediate and final results of calculations and other processing. Computer-accessible memory can be volatile, non-volatile, or a hybrid combination of volatile and non-volatile types.
It will be understood that the computer program product of the present disclosure may make use of various image manipulation algorithms and processes that are well known. It will be further understood that the computer program product embodiment of the present disclosure may embody algorithms and processes not specifically shown or described herein that are useful for implementation. Such algorithms and processes may include conventional utilities that are within the ordinary skill of the image processing arts. Additional aspects of such algorithms and systems, and hardware and/or software for producing and otherwise processing the images or co-operating with the computer program product of the present disclosure, are not specifically shown or described herein and may be selected from such algorithms, systems, hardware, components and elements known in the art.
The invention has been described in detail, and may have been described with particular reference to a suitable or presently preferred embodiment, but it will be understood that variations and modifications can be effected within the spirit and scope of the invention. The presently disclosed embodiments are therefore considered in all respects to be illustrative and not restrictive. The scope of the invention is indicated by the appended claims, and all changes that come within the meaning and range of equivalents thereof are intended to be embraced therein.

Claims (12)

  1. A handheld optical apparatus for imaging a sample, the apparatus comprising:
    a) an illumination source for directing illumination to the sample;
    b) camera optics disposed to convey reflected light from the sample to a sensor having an array of pixels;
    c) a microlens array disposed in front of the sensor and configured to convey the reflected light to the sensor,
    wherein each microlens in the microlens array directs light to a corresponding plurality of pixels in the array of pixels;
    and
    d) a control logic processor in signal communication with the sensor and programmed to store sensor data and to render a processed image from the sensor to a display.
  2. The optical apparatus of claim 1 further comprising a spatial light modulator for imparting a pattern to the illumination.
  3. The optical apparatus of claim 1 further comprising a polarization beam splitter and a polarizer in the path of the illumination and the reflected light.
  4. The optical apparatus of claim 1 further comprising a polarization beam splitter and a quarter wave plate in the path of the illumination and the reflected illumination.
  5. The optical apparatus of claim 1 wherein the sensor is at the focal plane of the microlens array.
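
By way of illustration only (not part of the claimed apparatus): the correspondence in claim 1 between each microlens and its plurality of pixels means that the raw sensor frame can be regrouped into sub-aperture images, one per pixel offset under the microlenses. The following minimal Python/numpy sketch assumes an idealized rectangular layout; the dimensions U, V, S, T and the reshape-based indexing are assumptions made here for exposition, not values from this disclosure.

    import numpy as np

    # Illustrative dimensions (assumptions, not values from this disclosure)
    U, V = 8, 8      # pixels behind each microlens (angular samples)
    S, T = 300, 200  # microlenses across the sensor (spatial samples)

    # Simulated raw frame: microlens (s, t) covers the U x V pixel block
    # with rows s*U .. s*U + U - 1 and columns t*V .. t*V + V - 1.
    raw = np.random.rand(S * U, T * V)

    # Regroup into a 4-D light field L[u, v, s, t]; fixing (u, v) selects
    # one sub-aperture image: the same pixel offset under every microlens.
    light_field = raw.reshape(S, U, T, V).transpose(1, 3, 0, 2)

    central_view = light_field[U // 2, V // 2]  # one sub-aperture image
    print(central_view.shape)  # (300, 200)

This 4-D indexing is also the starting point for the refocusing and depth methods claimed below.
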
  6. A method for forming a 2D image of an intraoral feature, the method comprising:
    a) acquiring image content of the intraoral feature as image pixels from a plenoptic camera having a microlens array and a sensor array;
    b) identifying a plurality of sub-aperture images at the sensor array of the plenoptic camera;
    c) performing shift processing on each of the identified plurality of sub-aperture images;
    d) summing the identified plurality of sub-aperture images to form refocused image content;
    and
    e) rendering the refocused image content onto a display.
  7. The method of claim 6 wherein the rendered refocused image has more than one focus plane.
  8. The method of claim 6 wherein acquiring the image content comprises using a beam splitter in the optical path for directing illumination to the intraoral feature and for directing the plurality of sub-aperture images to the sensor array.
  9. The method of claim 8 wherein directing illumination to the intraoral feature directs the illumination through a quarter wave plate.
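
The shift-and-sum processing of steps b) through d) of claim 6 can be illustrated with a minimal sketch in the same assumed Python/numpy setting as above; this is an illustration rather than the claimed implementation, and the integer np.roll shifts stand in for the sub-pixel interpolation a practical system would use.

    import numpy as np

    def refocus(light_field, alpha):
        """Shift each sub-aperture image in proportion to its angular
        offset from the central view, then sum the shifted views to
        form refocused image content (claim 6, steps b-d).

        light_field: (U, V, S, T) array; (u, v) indexes the sub-aperture
        view, (s, t) the spatial pixel. alpha: relative refocus amount;
        alpha = 0 reproduces the nominal focus plane.
        """
        U, V, S, T = light_field.shape
        u0, v0 = (U - 1) / 2.0, (V - 1) / 2.0
        out = np.zeros((S, T))
        for u in range(U):
            for v in range(V):
                # Integer shifts keep the sketch short; a practical system
                # would interpolate for sub-pixel accuracy.
                du = int(round(alpha * (u - u0)))
                dv = int(round(alpha * (v - v0)))
                out += np.roll(light_field[u, v], shift=(du, dv), axis=(0, 1))
        return out / (U * V)  # average of the shifted sub-aperture images

Because alpha selects the synthetic focus plane, evaluating the same captured data at several alpha values is one way a rendered refocused image can present more than one focus plane, as in claim 7.
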
  10. A method for depth measurement of an intraoral feature, the method comprising:
    a) acquiring image content of the intraoral feature as a set of image pixels from a plenoptic camera;
    b) generating a focal stack of images of the feature obtained at different focal distances;
    c) computing contrast for each pixel from the set using the focal stack images;
    d) finding the in-focus plane for each pixel from the set according to the computed contrast;
    e) computing a depth distance for each pixel of the intraoral feature based on the in-focus plane;
    and
    f) displaying the computed depth distances for the intraoral feature.
  11. The method of claim 10 wherein computing contrast comprises interpolating between focus values from the focal stack.
  12. The method of claim 10 wherein displaying the computed depth distances comprises representing the distance using a color.
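
The depth-from-focus computation of claim 10, together with the contrast interpolation of claim 11, can likewise be sketched in Python. This is an illustrative, assumption-laden sketch, not the claimed implementation: the Laplacian focus measure, the scipy dependency, and uniformly spaced focal slices are all choices made here for exposition.

    import numpy as np
    from scipy.ndimage import laplace

    def depth_from_focal_stack(stack, depths):
        """stack: (N, H, W) images refocused at N focal distances
        (claim 10, step b); depths: (N,) focal distance of each slice.
        Returns an (H, W) map of per-pixel depth estimates."""
        # Step c: per-pixel contrast; Laplacian magnitude is one common measure.
        contrast = np.abs(np.stack([laplace(img) for img in stack]))
        # Step d: in-focus plane = slice of maximum contrast at each pixel.
        best = np.argmax(contrast, axis=0)
        # Claim 11 refinement: parabolic interpolation of the contrast peak
        # across neighboring slices (assumes uniform slice spacing).
        i = np.clip(best, 1, len(depths) - 2)
        c0 = np.take_along_axis(contrast, (i - 1)[None], axis=0)[0]
        c1 = np.take_along_axis(contrast, i[None], axis=0)[0]
        c2 = np.take_along_axis(contrast, (i + 1)[None], axis=0)[0]
        denom = c0 - 2.0 * c1 + c2
        offset = np.where(np.abs(denom) > 1e-12, 0.5 * (c0 - c2) / denom, 0.0)
        # Step e: depth distance from the (interpolated) in-focus plane.
        return depths[i] + offset * (depths[1] - depths[0])

For display per claim 12, the returned depth map would typically be passed through a color lookup table so that each distance is represented by a color.
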

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/CN2018/103741 WO2020047692A1 (en) 2018-09-03 2018-09-03 3-d intraoral scanner using light field imaging

Publications (1)

Publication Number Publication Date
WO2020047692A1 true WO2020047692A1 (en) 2020-03-12

Family

ID=69721991

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2018/103741 WO2020047692A1 (en) 2018-09-03 2018-09-03 3-d intraoral scanner using light field imaging

Country Status (1)

Country Link
WO (1) WO2020047692A1 (en)

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2016092452A1 * 2014-12-09 2016-06-16 BASF SE Optical detector
CN105125162A (en) * 2015-09-17 2015-12-09 苏州佳世达光电有限公司 Oral cavity scanner
CN105467607A (en) * 2015-12-08 2016-04-06 苏州佳世达光电有限公司 A scanning device
CN106803892A * 2017-03-13 2017-06-06 中国科学院光电技术研究所 Light field high-resolution imaging method based on light field measurement
CN107995424A (en) * 2017-12-06 2018-05-04 太原科技大学 Light field total focus image generating method based on depth map
CN108107003A (en) * 2017-12-15 2018-06-01 哈尔滨工业大学 Fast illuminated light field-polarization imager and imaging method based on microlens array

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117008351A (en) * 2023-10-07 2023-11-07 北京朗视仪器股份有限公司 Mouth scanning optical path system and scanner

Similar Documents

Publication Publication Date Title
US10888401B2 (en) Viewfinder with real-time tracking for intraoral scanning
EP3019117B1 (en) Video-based auto-capture for dental surface imaging apparatus
US8134719B2 (en) 3-D imaging using telecentric defocus
JP6007178B2 (en) 3D imaging system
JP5583761B2 (en) 3D surface detection method and apparatus using dynamic reference frame
US9679360B2 (en) High-resolution light-field imaging
KR20210024469A (en) Intraoral 3D scanner using multiple small cameras and multiple small pattern projectors
JP6305053B2 (en) Image processing apparatus, imaging apparatus, image processing method, and program
JP2019532451A (en) Apparatus and method for obtaining distance information from viewpoint
US20140253686A1 (en) Color 3-d image capture with monochrome image sensor
TW201033938A (en) Reference image techniques for three-dimensional sensing
US10463243B2 (en) Structured light generation for intraoral 3D camera using 1D MEMS scanning
EP2076870A2 (en) 3d photogrammetry using projected patterns
Ettl et al. Flying triangulation—an optical 3D sensor for the motion-robust acquisition of complex objects
JP7409443B2 (en) Imaging device
JP6234401B2 (en) Image processing apparatus, imaging apparatus, image processing method, and program
US11497392B2 (en) Extended depth of field intraoral imaging apparatus
WO2020047692A1 (en) 3-d intraoral scanner using light field imaging
Liu et al. An integrated calibration technique for variable-boresight three-dimensional imaging system
US8092024B2 (en) Eye measurement apparatus and methods of using same
KR101857977B1 (en) Image apparatus for combining plenoptic camera and depth camera, and image processing method
JP6824833B2 (en) Distance data generation system, distance data generation method and program
RU2790049C1 (en) Method for anisotropic recording of the light field and apparatus for implementation thereof
Zhang et al. Simulation-based investigation on optical 3D surface measurement with composite spectral patterns

Legal Events

Date Code Title Description
121 EP: the EPO has been informed by WIPO that EP was designated in this application
    Ref document number: 18932680; Country of ref document: EP; Kind code of ref document: A1
NENP Non-entry into the national phase
    Ref country code: DE
122 EP: PCT application non-entry in European phase
    Ref document number: 18932680; Country of ref document: EP; Kind code of ref document: A1