WO2022236304A1 - System and method for converting three-dimensional sensors for object and limb scanning - Google Patents

System and method for converting three-dimensional sensors for object and limb scanning

Info

Publication number
WO2022236304A1
Authority
WO
WIPO (PCT)
Prior art keywords
electronic device
modification component
hardware modification
optical element
sensor
Prior art date
Application number
PCT/US2022/072138
Other languages
French (fr)
Inventor
Paulo E. XAVIER DA SILVEIRA
Ravi Shah
Jonathan Dana EDWARDS
Original Assignee
Occipital, Inc.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Occipital, Inc. filed Critical Occipital, Inc.
Publication of WO2022236304A1 publication Critical patent/WO2022236304A1/en

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88Lidar systems specially adapted for specific applications
    • G01S17/89Lidar systems specially adapted for specific applications for mapping or imaging
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/481Constructional features, e.g. arrangements of optical elements
    • G01S7/4814Constructional features, e.g. arrangements of optical elements of transmitters alone
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/481Constructional features, e.g. arrangements of optical elements
    • G01S7/4816Constructional features, e.g. arrangements of optical elements of receivers alone

Definitions

  • in the example of FIG. 6, the hardware modification component 602 includes a dual or multi aperture correcting optical element 604, such as the correcting optical element 402 of FIG. 4.
  • the hardware modification component 602 includes one aperture for the emitter and one aperture for the detector. Each aperture is aligned to its respective sensor aperture.
  • the hardware modification component 602 may also include the ultra-wide angle lens 506, the telephoto lens 508, the wide angle lens 510, and the lidar lens 512. Each of the lenses 506-512 may be clear or act to adjust a FOV or other characteristics associated with the corresponding sensor, emitter, or combination thereof.
  • the hardware modification component 602 may also include one or more coupling elements to couple the hardware modification component 602 to a hand-held electronic device 614, as discussed above.
  • the integrated hardware modification components 502 and 602 are advantageous over conventional systems in that the physical attachments are all contained within a single physical component. This allows for the alignment of the optical element(s) 504 or 604 with respect to the emitter and photodetector by the alignment of one or more physical reference point(s) present in the electronic device 514, such as the corner(s) of the sensor plateau, or the edge(s) of the device 514.
  • the hardware modification components are configured to narrow a FOV of a sensor system to improve the accuracy and reduce the amount of time required to generate an object scan or 3D object data of an object.
  • the hardware modification component may also be used in reverse, to expand the FOV of a device that is used for object scanning so that it may be used with other applications such as environment or room scanning. In that case, concave lenses (i.e., diverging lenses) may be used in place of convex lenses, including their Fresnel and diffractive equivalent forms.
  • FIG. 7 is a flow diagram illustrating an example process associated with generating a three-dimensional scene according to some implementations.
  • the processes are illustrated as a collection of blocks in a logical flow diagram, which represent a sequence of operations, some or all of which can be implemented in hardware, software or a combination thereof.
  • the blocks represent computer-executable instructions stored on one or more computer-readable media that, when executed by one or more processors, perform the recited operations.
  • computer-executable instructions include routines, programs, objects, components, encryption, deciphering, compressing, recording, data structures and the like that perform particular functions or implement particular abstract data types.
  • FIG. 7 illustrates an example flow diagram showing a process 700 for determining object data via an electronic device equipped with a hardware modification component according to some implementations.
  • the sensor may be used to provide a higher level of detail with respect to the scanned environment.
  • the device may host or otherwise download calibration and/or depth map determining instructions that may assist the electronic device in determining object data while the hardware modification component is attached or equipped.
  • the electronic device may determine whether a hardware modification component has been equipped. For example, the device may detect a coupling of the hardware modification component to the exterior of the device. In other cases, the device may detect the coupling via a failure of the original depth mapping between emitter and/or the sensor (e.g., the photodetector). In still other cases, a user may select an option on a user interface indicating the hardware modification component is in use, such as an on/off toggle or the like.
  • the electronic device may determine a new depth map associated with the sensor system. For example, each pixel of the original depth map in the emitter array corresponds to a given point in real physical space that may be determined by the pixel's unique horizontal and vertical directions. By counting an amount of time a pulse of light takes to travel from the emitter to the sensor or photodetector, the device is able to determine the three-dimensional coordinates of that specific point in the depth map (e.g., the 3D physical environmental coordinates). However, when the hardware modification component is equipped, the hardware modification component changes the pixel-to-angle mapping, and a new depth map should be determined and applied. The remapping may be done on an individual basis for each device (a minimal sketch of this remapping and the resulting depth computation follows this list).
  • for example, the remapped angle of pixel i may be expressed as FOV_i = FOV' (N/2 - i) / FOV_0, where N is the sensor resolution in the horizontal (or vertical) direction, FOV_0 is the original FOV of the 3D sensor, and FOV' is the new FOV after using the FOV corrector.
  • the electronic device may determine, for individual pixels, an amount of time for light emitted by the emitter to travel to the sensor based on the new depth map. For example, the device may record when the emitter outputs the light and the time (or clock signal) at which the light is detected by the sensor or photodetector for each pixel of the new depth map.
  • the electronic device may determine, for the individual pixels of the new depth map, three-dimensional coordinates of the physical environment based at least in part on the amount of time.
  • the electronic device may determine object data based at least in part on the per pixel three-dimensional coordinates. For example, the device may convert the three-dimensional coordinates of the new depth map for one or more scans into object data using one or more techniques.
  • the object data may include 3D representations and/or a mesh representing the object being scanned.
  • the three-dimensional coordinates of each new depth map for each scan may be used for applications including simultaneous location and mapping (SLAM), object scanning, mesh creation and texturing, odometry, and the like.
  • FIG. 8 is an example electronic device 800 hosting calibration and object scanning software for an equipped hardware modification component according to some implementations.
  • the device 800 may be equipped with a hardware modification component including a lens or optical element that may change the FOV of the sensors of the device to, for instance, improve the accuracy and time associated with scanning an object.
  • the device 800 may also be equipped with software or instructions to assist with generating object data associated with the scanned objects.
  • the device 800 may include one or more emitters 802.
  • the emitters 802 may be mounted on an exterior surface of the device 800 in order to output illumination or light into a physical environment.
  • the emitters 802 may include, but are not limited to, visible light emitters, infrared emitters, ultraviolet light emitters, LIDAR systems, and the like. In some cases, the emitters 802 may output light in predetermined patterns, varying wavelengths, or at various time intervals (e.g., such as pulsed light).
  • the device 800 may also include one or more sensors 804.
  • the sensors 804 may include image devices, spectral sensors, LIDAR sensors, depth sensors, infrared sensors, or other photoreceptors capable of detecting light within an environment.
  • the device 800 may also include one or more communication interfaces 806 configured to facilitate communication between one or more networks, one or more cloud-based system(s), and/or one or more mobile or user devices.
  • the communication interfaces 806 may be configured to send and receive data associated with objects being scanned within the physical environment (such as the object data, object scans, stored models of one or more object, or the like).
  • the communication interfaces 806 may enable Wi-Fi-based communication such as via frequencies defined by the IEEE 802.11 standards, short range wireless frequencies such as Bluetooth, cellular communication (e.g., 2G, 3G, 4G, 4G LTE, 5G, etc.), satellite communication, dedicated short-range communications (DSRC), or any suitable wired or wireless communications protocol that enables the respective computing device to interface with the other computing device(s).
  • the device 800 also includes an input and/or output interface 808, such as a projector, a virtual environment display, a traditional 2D display, buttons, knobs, and/or other input/output interface.
  • the interfaces 808 may include a flat display surface, such as a touch screen configured to allow a user of the device 800 to consume content.
  • the device 800 may also include one or more processors 810, such as at least one or more access components, control logic circuits, central processing units, or processors, as well as one or more computer-readable media 812 to perform the function associated with the virtual environment. Additionally, each of the processors 810 may itself comprise one or more processors or processing cores.
  • the computer-readable media 812 may be an example of tangible non-transitory computer storage media and may include volatile and nonvolatile memory and/or removable and non-removable media implemented in any type of technology for storage of information such as computer-readable instructions or modules, data structures, program modules or other data.
  • Such computer-readable media may include, but is not limited to, RAM, ROM, EEPROM, flash memory or other computer-readable media technology, CD-ROM, digital versatile disks (DVD) or other optical storage, magnetic cassettes, magnetic tape, solid state storage, magnetic disk storage, RAID storage systems, storage arrays, network attached storage, storage area networks, cloud storage, or any other medium that can be used to store information and which can be accessed by the processors 810.
  • the computer-readable media 812 may include scanning instructions 814, depth map generation instructions 816, object data generating instructions 818, modeling instructions 820, as well as other instructions 822.
  • the computer-readable media 812 may also store data such as depth map data 824, field of view data 826, object data 828, object models 830, and environment data 832.
  • the scanning instructions 814 may be configured to cause the emitters 802 and the sensors 804 to scan or otherwise map the physical environment surrounding the device 800.
  • the emitters 802 may output light and photoreceptors may capture the light as it returns to the device 800.
  • the depth map generation instructions 816 may be configured to modify a depth mapping usable to determine 3D coordinates between light output by the emitters 802 and detected by the sensors 804. For example, the depth map generation instructions 816 may modify the depth map based on the FOV data 826 (e.g., the original FOV and the modified FOV).
  • the object data generating instructions 818 may be configured to generate object data 828 based on the depth map and the 3D coordinates determined during scanning of the physical environment.
  • the object data 828 may include meshes, contours, surfaces, point clouds, and the like.
  • the modeling instructions 820 may be configured to generate 3D models or object model data 830 from the object data 828.
  • the modeling instructions 820 may generate CAD or other professional 3D models of the object based on the object data 828.
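The remapping and per-pixel depth computation outlined in the process 700 bullets above can be summarized in a short sketch. The pixel-to-angle model, function names, array sizes, and FOV values below are assumptions made for illustration; they are not the disclosed implementation or calibration procedure.

```python
import numpy as np

C = 299_792_458.0  # speed of light, m/s

def pixel_to_angle(i, n_pixels, fov_deg):
    """Map a pixel index to a ray angle (radians), assuming the FOV is
    spread evenly across the pixel array (a simplifying assumption)."""
    fov = np.radians(fov_deg)
    return (i - n_pixels / 2) / n_pixels * fov

def remap_depth_grid(n_x, n_y, fov_x_deg, fov_y_deg):
    """Build per-pixel unit ray directions for the new (narrowed) FOV."""
    ax = pixel_to_angle(np.arange(n_x), n_x, fov_x_deg)
    ay = pixel_to_angle(np.arange(n_y), n_y, fov_y_deg)
    ax, ay = np.meshgrid(ax, ay)
    # Simple tangent (pinhole-like) model for the ray directions.
    d = np.stack([np.tan(ax), np.tan(ay), np.ones_like(ax)], axis=-1)
    return d / np.linalg.norm(d, axis=-1, keepdims=True)

def tof_to_points(round_trip_times_s, ray_dirs):
    """Convert per-pixel round-trip times into 3D coordinates: the range is
    c*t/2 (out and back), applied along each remapped ray direction."""
    ranges = C * round_trip_times_s / 2.0
    return ranges[..., None] * ray_dirs

# Example: a 240x180 depth array with an assumed corrected FOV of 46 x 35 degrees.
dirs = remap_depth_grid(240, 180, 46.0, 35.0)
times = np.full((180, 240), 2 * 0.5 / C)   # synthetic: every pixel 0.5 m away
points = tof_to_points(times, dirs)
print(points.shape, points[90, 120])       # (180, 240, 3), roughly [0, 0, 0.5]
```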

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Electromagnetism (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

A hardware modification component (100) for a hand-held electronic device configured to increase accuracy and reduce scanning time associated with generating object data of a physical object. In some cases, the hardware modification component (100) may include a lens or optical element to reduce the field of view of a sensor system of an electronic device.

Description

SYSTEM AND METHOD FOR CONVERTING THREE-
DIMENSIONAL SENSORS FOR OBJECT AND LIMB SCANNING
CROSS-REFERENCE TO RELATED APPLICATION(S)
[0001] This application claims priority to U.S. Provisional Application No. 63/201,657 filed on May 7, 2021 and entitled “SYSTEM AND METHOD FOR CONVERTING 3D SENSORS FOR OBJECT AND LIMB SCANNING,” which is incorporated herein by reference in its entirety.
BACKGROUND
[0002] The presence of three-dimensional (3D) sensing and imaging is becoming more and more common in industries such as healthcare, printing, and augmented reality (AR). In this regard, 3D sensors are increasingly being adopted and integrated into mobile or hand-held electronic devices. These sensors, however, are usually designed with AR applications in mind. As such, these sensors are often ill suited for other applications, such as object detection, use in healthcare, or the like.
BRIEF DESCRIPTION OF THE DRAWINGS
[0003] The detailed description is described with reference to the accompanying figures. In the figures, the left-most digit(s) of a reference number identifies the figure in which the reference number first appears. The use of the same reference numbers in different figures indicates similar or identical components or features.
[0004] FIG. 1 illustrates an example diffractive optical element suitable for use as a hardware modification component for modifying a field of view of one or more sensors of a hand-held electronic device according to some implementations.
[0005] FIG. 2 illustrates an example field of view of a sensor system of a hand-held electronic device according to some implementations.
[0006] FIG. 3 illustrates an example modification to the field of view of a sensor system of a hand-held electronic device using a hardware modification component according to some implementations.
[0007] FIG. 4 illustrates another example modification to the field of view of a sensor system of a hand-held electronic device using a hardware modification component according to some implementations.
[0008] FIG. 5 illustrates a hand-held electronic device including the hardware modification component according to some implementations.
[0009] FIG. 6 illustrates another hand-held electronic device including the hardware modification component according to some implementations.
[0010] FIG. 7 illustrates an example flow diagram showing a process for determining object data via an electronic device equipped with a hardware modification component according to some implementations.
[0011] FIG. 8 is an example electronic device hosting calibration and object scanning software for an equipped hardware modification component according to some implementations.
DETAILED DESCRIPTION
[0012] This disclosure includes techniques and implementations, consisting of hardware and software components, for converting conventional three-dimensional (3D) sensors of hand-held or mobile electronic devices to provide improved accuracy and detail when scanning objects, such as physical objects and/or individual body parts. These techniques allow for the use of conventional mobile sensors, such as mobile LIDAR sensors, in healthcare and object scanning applications. For example, 3D sensors typically associated with mobile devices include sensor systems, such as time-of-flight (ToF) sensors, structured light sensors, and LIDAR sensors, that are designed for use in AR applications. These AR-designed sensors are usually configured with a large or high field of view (FOV) to enable rapid scanning of an environment, such as a room, in order to project virtual objects in a virtual view of the scanned environment. For these AR applications, the ability to scan a room quickly is favored over accuracy and detail of the scanned object, generally rendering the sensors inadequate for fields such as healthcare and object scanning.
[0013] The system discussed herein may include hardware modification components (such as one or more lenses) for the 3D sensor of the mobile device as well as a software package that allows for the calibration (such as recovery of depth calibration) of the 3D sensors once the hardware modification components are installed with respect to the mobile device and/or integrated 3D sensor.
[0014] The hardware modification components may include a modification for the light output by an illumination component of the 3D sensor. For example, 3D sensors typically rely on an active illumination component such as, for example, one or more VCSELs, edge emitting lasers, fiber lasers, or LEDs. A diffractive element may be placed in the path of the light generated by the illumination component to multiply or divide the illuminating pattern into a larger number of patterns. The hardware modification components, discussed herein, may be positioned in the path of the light generated by the illumination component of the mobile device after or following the diffractive element. For instance, the hardware modification component may include an optical element, such as a Fresnel lens, that is formed by “flattening” a regular lens and, thereby, breaking up the surface of the optical element into discontinuities that are integer multiples of the wavelength of light. As such, the optical elements, discussed herein, are significantly thinner and lighter than conventional Fresnel lenses. For example, the optical element may be equal to or less than half a millimeter thick. In other examples, the optical element may be less than 2 mm or less than 3 mm thick. The thickness of the optical elements, discussed herein, allows the optical elements to be mass produced using injection molded plastics (e.g., PMMA, polycarbonate, or the like). In some examples, the optical elements, discussed herein, may be a plano-convex lens converted into a Fresnel-lens equivalent, as discussed below.
[0015] Depending on the coherence of the illumination component of the mobile device, the optical elements may be configured to be a thinner diffractive element than typically associated with a Fresnel lens. In the case of coherent, or nearly coherent, light, the thinner diffractive element may operate under the same principle as Fresnel lenses, with the difference that the element, discussed herein, may be thinner and/or the grooves may be close to a single wavelength and, accordingly, more efficient. In some cases, the thinner diffractive element may be designed with a specific illuminator or mobile device in mind.
[0016] In some examples, the optical elements, discussed herein, may be tailored to specific wavelengths. For example, the optical elements may be configured for wavelengths within a range of detection of low-cost Silicon photodetectors, such as within the range from 400 nanometers (nm) to 1000 nm. In other examples, the range of the wavelengths may be in the near infrared region of the spectrum, such as from 700 nm to 1000 nm. In one specific example, the optical elements may be configured for a wavelength range of 400 nm to 1000 nm. In some specific examples, the optical elements may be configured for illuminators that generate wavelengths between 935 nm and 945 nm.
[0017] In some examples, the optical elements may be produced using a master. For example, a steel master may be diamond-turned and then used to injection-mold plastic attachments. The plastic optical elements could then be injection-molded using, for example, PMMA or polycarbonate. In the case of PMMA, the resulting index of refraction is 1.483 at 940 nm, and the optical elements may have an index of refraction from 1.487 to 1.482 within the wavelength range from 700 nm to 1000 nm, respectively. In the case of polycarbonate, the index of refraction is 1.566 at 940 nm, and the optical elements may have an index of refraction ranging from 1.576 to 1.565 within the wavelength range from 700 nm to 1000 nm, respectively.
[0018] In some examples, the optical elements, discussed herein, may have a single or a multiple aperture architecture, as described below. In a single-aperture architecture, the optical elements may include a single optical element that is placed equally offset with respect to the illuminator or emitter and photoreceptor of the imaging systems of the mobile device. For example, the optical element may be aligned along a spatial dimension that separates the illuminator and the photoreceptors. Such an alignment results in the FOV of the mobile device being deviated towards the optical axes of the correcting optical element.
[0019] Given the symmetry of the single aperture architecture, the deviation in the optical axis of the illuminator or emitter imaging system is balanced by a deviation of similar amount but opposite direction in the optical axis of the receptor of the imaging system, resulting in both the emitter and the detector ending up with a FOV that matches the other (e.g., the FOV of the illuminator and the FOV of the receptor are substantially the same).
[0020] In some examples, the optical elements, discussed herein, may have a multiple aperture architecture. For example, the optical element may have an aperture that is aligned with each of the components of the imaging system (such as over the illuminator or emitter and the receptor and/or over each emitter and/or receptor in the case of multiple emitters and/or receptors). In some cases, each of the elements may be substantially similar; however, in other cases, the aperture of each optical element may be adjusted to provide increased flexibility between the FOV of the illuminators/emitters and the receptors.
[0021] In general, both the single and the multi aperture architectures are configured to narrow the FOV of the illuminators and the receptors to increase the amount of light, and number of pixels, concentrated within the FOV and thereby increase the accuracy and detail with respect to the imaging system. In some cases, narrowing the FOV may result in an increase in the angular resolution (e.g., the number of pixels per unit of solid angle). As a particular example, if the FOV is reduced from 77.5 degrees to 46 degrees, the equivalent increase in angular resolution is 2.84x. This increase in angular resolution may result in an increase in detail and/or scanning accuracy. In another example, the increase in resolution could result in a proportional reduction in the amount of time required for a user to perform a scan of an object or limb.
[0022] FIG. 1 illustrates an example diffractive optical element 100 suitable for use as a hardware modification component for modifying a field of view of one or more sensors of a hand-held electronic device according to some implementations. In some cases, 3D sensors may rely on active illumination that is provided by a light source, for example, by one or more VCSELs, edge emitting lasers, fiber lasers or LEDs. In some examples, the light source may be followed by a diffractive element to multiply the illuminating pattern into a larger number of patterns. The light sources often emit light at or over a narrow range of wavelengths. In some cases, the range is narrow enough to permit the use of the diffractive optical element 100 as part of the hardware modification component. The diffractive optical element 100 (e.g., a Fresnel lens, diffractive lens, or the like) may consist of a regular lens that has been “flattened” by breaking up its surface into discontinuities, generally indicated by 102(A)-(H) in the present example. In some cases, the discontinuities 102 may be formed in the diffractive optical element to correspond to integer multiples of the wavelength of light. As such, the diffractive optical element 100 may be significantly thinner and lighter than a regular lens and may be more easily mass produced using injection molded plastics (e.g., PMMA, polycarbonate).
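As a rough illustration of the relationship between the discontinuities 102 and integer multiples of the wavelength, the sketch below computes classic Fresnel-zone boundary radii, at which the optical path to the focal point grows by one wavelength per zone. This is an editor-added example using assumed values (a 940 nm design wavelength and a 25 mm focal length); it is not the actual prescription of the element 100.

```python
import math

def fresnel_zone_radii(wavelength_m: float, focal_length_m: float, zones: int):
    """Radii r_k at which the path length to the focal point exceeds the axial
    path by k wavelengths: r_k = sqrt(2*k*lambda*f + (k*lambda)^2)."""
    return [math.sqrt(2 * k * wavelength_m * focal_length_m + (k * wavelength_m) ** 2)
            for k in range(1, zones + 1)]

# Assumed values: 940 nm emitter, 25 mm focal length corrector.
for k, r in enumerate(fresnel_zone_radii(940e-9, 25e-3, 5), start=1):
    print(f"zone {k}: radius ≈ {r * 1e3:.3f} mm")
```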
[0023] The choice of wavelengths for the emitters of the electronic device may also vary, but the wavelengths usually lie within the range of detection of low-cost silicon photodetectors, that is, within the range from 400 nm to 1000 nm (the visible and near infrared regions of electromagnetic radiation). In addition, within this range, the near infrared region of the spectrum (from 700 nm to 1000 nm) is preferred since this range results in light that is invisible to the human eye. In some cases, within the near infrared region, wavelengths approaching 940 nm (plus or minus 5 nm) are of special interest given that this wavelength corresponds to one of the absorption peaks of water. As such, it also corresponds to a null in the solar spectrum, which gets absorbed by water present in the atmosphere. Selecting emitters that operate within the 940 nm range of wavelengths allows the sensor to operate with reduced interference from sunlight, as the detector usually includes a narrowband optical filter tuned to pass most light within the narrow spectrum of the emitter while blocking light (including sunlight) outside that narrow spectrum.
[0024] In some cases, the exact power of the diffractive optical element 100 may be calculated based at least in part on the wavelength of the light emitted by the emitters, on the index of refraction (and dispersion) of the selected materials, and on the final or desired FOV. The wavelength of the emitters can be easily determined using optical instruments capable of measuring peak emissions of light with great precision (within 1 nm), for example, using a spectrometer or a spectrophotometer. Likewise, the index of refraction of optical materials can be measured with great accuracy using, for example, an ellipsometer. In the case of a PMMA diffractive optical element 100, the resulting index of refraction is 1.483 at 940 nm, and ranges from 1.487 to 1.482 within the wavelength range from 700 nm to 1000 nm, respectively. In the case of a polycarbonate diffractive optical element 100, the index of refraction is 1.566 at 940 nm, and ranges from 1.576 to 1.565 within the wavelength range from 700 nm to 1000 nm, respectively.
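To make the power calculation in paragraph [0024] concrete, the following sketch derives a first-order focal length from the original and desired FOVs using a paraxial thin-lens model, and then a plano-convex radius from the lensmaker's equation with the PMMA index quoted above. The stand-off distance and FOV values are assumptions chosen for illustration; a real design would use exact ray tracing.

```python
import math

def corrector_focal_length_mm(fov_orig_deg, fov_new_deg, standoff_mm):
    """Paraxial estimate: a marginal ray with slope tan(fov/2) reaches the lens
    at height h = standoff * tan(fov/2); a thin lens reduces the slope by h/f."""
    u0 = math.tan(math.radians(fov_orig_deg / 2))
    u1 = math.tan(math.radians(fov_new_deg / 2))
    h = standoff_mm * u0
    return h / (u0 - u1)

def plano_convex_radius_mm(focal_mm, n_index):
    """Lensmaker's equation for a thin plano-convex lens: 1/f = (n - 1)/R."""
    return (n_index - 1) * focal_mm

# Assumed: FOV narrowed from 77.5 to 46 degrees, lens ~5 mm above the emitter,
# PMMA with n = 1.483 at 940 nm (index value from the description above).
f_mm = corrector_focal_length_mm(77.5, 46.0, 5.0)
R_mm = plano_convex_radius_mm(f_mm, 1.483)
print(f"focal length ≈ {f_mm:.1f} mm, convex radius ≈ {R_mm:.1f} mm")
```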
[0025] FIGS. 2-4 illustrate example FOVs of a sensor system of a hand-held electronic device according to some implementations. FIG. 2 illustrates the FOV 202 of the hand-held electronic device prior to being equipped with the hardware modification component. As discussed above, 3D sensor systems of hand-held electronic devices typically include sensor systems that are designed for use in AR applications. These AR-designed sensors are usually configured with a large or high FOV to enable rapid scanning of an environment, such as a room, in order to project virtual objects in a virtual view of the scanned environment. For these AR applications, the ability to scan a room quickly is favored over accuracy and detail of the scanned object, generally rendering the sensors inadequate for fields such as healthcare and object scanning.
[0026] Accordingly, in this example, the sensor 204 and emitter 206 are shown with a large FOV 202 that may scan various objects, generally indicated by 208, within an environment (such as a room) at substantially the same time without generating high detail or accuracy with respect to the environment or the individual objects 208.
[0027] In this example, a window 210 is shown covering both the sensor 204 and emitter 206. The window 210 serves to physically protect the emitter 206 and sensor 204. The window 210 is usually made of a material that is transparent to near infrared (NIR) light but that blocks visible light. That is, the window 210 is transparent to the 3D sensor, but the window 210 looks “black” to a user. In this example, the window 210 provides a location at which to equip or to place the hardware modification component discussed herein.
[0028] FIG. 3 illustrates an example 300 showing a modification to the FOV 202 of FIG. 2 of the sensor system including the sensor 204 (such as a photodetector imaging system, photoreceptor, or the like) and the emitter 206 (such as an emitter array) using a hardware modification component 302 according to some implementations. In the current example, the hardware modification component 302, which may include, for example, the diffractive optical element 100 of FIG. 1, is equipped to the hand-held electronic device to generate the modified FOV 304 of the sensor 204 and to improve the detail and accuracy with respect to scanning the object 306.
[0029] In this example, the hardware modification component 302 includes a single aperture version of the design, in which a single optical element is placed equally offset with respect to the emitter 206 and sensor 204. For example, the hardware modification component 302 may be positioned along the spatial dimension that separates the emitter 206 and the sensor 204. Such a configuration may result in the modified FOV 304 being deviated towards the optical axis of the correcting element. As discussed above, the correcting element may comprise one or more of a Fresnel lens, a plano-convex lens, a bi-convex lens, or the like. In some cases, given the singular wavelength nature of 3D sensors, the correcting element of the hardware modification component 302 should be optimized for that singular wavelength. Given the symmetry of this configuration, the deviation in the optical axis of the emitter 206 is balanced by a deviation of similar amount but opposite direction in the optical axis of the sensor 204, resulting in both the emitter 206 and the sensor 204 ending up with a FOV 304 that matches the other. This setup presents the advantage of simplicity in design and implementation.
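The balanced, opposite deviations described for the single aperture configuration can be estimated with Prentice's rule: a thin lens used off its optical axis deviates rays by roughly the decentration divided by the focal length. The sketch below is a simplified paraxial illustration with assumed emitter-sensor spacing and focal length, not the patent's design procedure.

```python
import math

def decentration_deviation_deg(decenter_mm: float, focal_length_mm: float) -> float:
    """Paraxial deviation (degrees) of a thin lens used off its optical axis;
    a positive lens bends rays toward its own center."""
    return math.degrees(math.atan2(decenter_mm, focal_length_mm))

# Assumed layout: emitter and photoreceptor 8 mm apart, single corrector lens
# centered midway between them, focal length 25 mm (illustrative values only).
separation_mm = 8.0
focal_mm = 25.0
offset_mm = separation_mm / 2

dev = decentration_deviation_deg(offset_mm, focal_mm)
print(f"emitter axis deviated by ~{dev:.1f} deg, receptor by ~{-dev:.1f} deg")
# Equal and opposite deviations, so the emitter and receptor FOVs converge
# toward the corrector's optical axis and continue to overlap.
```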
[0030] FIG. 4 illustrates another example modification to the FOV 202 of a sensor system including the sensor 204 (such as a photodetector imaging system, photoreceptor, or the like) and the emitter 206 (such as an emitter array) using a hardware modification component 402 according to some implementations. In the current example, the hardware modification component 402 is equipped to the hand-held electronic device with a diffractive optical element or correcting element to generate the modified FOV 404 of the sensor 204 to improve the detail and accuracy with respect to scanning the object 306.
[0031] In this example, the corrective element of the hardware modification component 402 comprises a two aperture configuration. In this configuration, each aperture is aligned over one of the components of the sensor system (e.g., the emitter 206 or the sensor 204). In this configuration, the corrective optical element present in each aperture is substantially similar or identical to the other. Nevertheless, it should be understood that in some implementations, the multi-aperture configuration provides flexibility to customize each correcting element (e.g., the elements may differ in one or more characteristic or feature).
[0032] In FIGS. 2-4, the chief rays are illustrated in solid lines and marginal rays are illustrated in dotted lines at different field positions (pixels in the emitter or photodetector arrays). For the sake of clarity, only the marginal rays cast by the emitter 206 positioned in the center of the emitter array are shown. In the sensor 204, the FOVs 202, 304, and 404 of the sensor system are given by the angle subtended by the chief rays cast by their edge pixels.
[0033] As illustrated, in both examples of FIGS. 3 and 4, the modified FOVs 304 and 404 are shown as a reduced angle between chief rays at the field stops, after the correcting optical element. The resulting increase in angular resolution (i.e., the number of pixels per unit of solid angle) is given approximately by (FOV 202/FOV 304)^2 for the single aperture configuration and (FOV 202/FOV 404)^2 for the multi aperture configuration. In a particular example, when the FOV is reduced from 77.5 degrees to 46 degrees, the equivalent increase in angular resolution is 2.84x. The increase in angular resolution also results in an increase in scanning accuracy and in a proportional reduction in the amount of time required for a user to perform a scan of an object (such as object 306).
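The 2.84x figure follows directly from the squared FOV ratio given above; a one-line check using the example numbers:

```python
fov_original_deg = 77.5
fov_corrected_deg = 46.0

# Pixels per unit solid angle scale with the inverse square of the FOV,
# so the gain is the squared ratio of the original to the corrected FOV.
gain = (fov_original_deg / fov_corrected_deg) ** 2
print(f"angular resolution gain ≈ {gain:.2f}x")   # ≈ 2.84x
```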
[0034] In some devices, the 3D sensors use time-multiplexing to increase spatial resolution. In these cases, specific depth points may be aligned with specific pixels in the sensor or photodetector array. That is, the system knows which light source is driven at a given point in time and which pixels in the sensor 204 (photodetector) array will be illuminated by those sources. In some instances, the hardware modification components 302 or 402 may change the emitter-to-sensor (detector) correspondence (e.g., mismatching the illumination and detection ray angles), which may result in the loss of resolution of the sensor or imaging system. Both configurations address this requirement, but the multi-aperture configuration provides additional degrees of freedom to perform adjustments, as each aperture may be adjusted independently of the other. That is, in case the optical design, or empirical data, demonstrates that certain emitted pixels are illuminating incorrect pixels in the detector sensor array, the multi-aperture configuration provides the system designer with the freedom to adjust the angles of either the illuminator or detector pixels independently of each other. This may be done, for example, by adjusting either the height or the separation of the grooves of the diffractive lens on top of each element (detector or emitter), thus enabling local adjustments of the diffracted angle of each emitter-detector pixel pair.
[0035] Note that, in FIGS. 2-4, for convenience, all of the hardware modification components and correcting elements are depicted as Fresnel lenses but, without loss of generality, the hardware modification components could instead comprise diffractive or convex lenses, or the like. In some cases, the hardware modification components are positive (converging) lenses to provide a reduction in the FOV 202 of the sensor systems. That is, positive lenses bend the marginal rays (which determine the FOV) towards their optical axis, resulting in a concentration of rays towards that axis and a reduction in the angle of the marginal rays, thereby reducing scanning time and increasing accuracy when scanning objects. The reduction in scanning time results when the other relevant object scanning parameters, including voxel size and bounding box size, are kept constant; more depth map points are then available within the volume of interest. This allows the user to obtain an increase in depth map points (e.g., more pixels per depth map over the same volume of the environment) within any given period of time. In some cases, the hardware modification component may result in a 2.84x speedup, in the example provided herein, thus allowing the scan to be completed in a proportionally shorter amount of time.
[0036] The increase in final mesh resolution and accuracy arises in the case where the total scanning time and the bounding box volume are kept constant, and the increased angular resolution is applied to address a larger number of voxels inside that volume (i.e., using a smaller voxel size). It should be understood that it is also possible to trade off combinations of voxel size, bounding box size, and/or scanning time to optimize the resolution, accuracy, and/or scanning time for a particular application or use.
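The following minimal sketch, offered only as an illustration and under the simplifying assumption that depth-point throughput scales linearly with the angular-resolution gain, shows the two tradeoffs described above; the function and parameter names are introduced here and are not part of the disclosure:

    def scan_tradeoffs(gain, base_scan_time_s, base_voxel_count):
        # Option 1: keep voxel size and bounding box fixed, finish the scan sooner.
        same_resolution = {"scan_time_s": base_scan_time_s / gain,
                           "voxel_count": base_voxel_count}
        # Option 2: keep scan time and bounding box fixed, address more (smaller) voxels.
        same_scan_time = {"scan_time_s": base_scan_time_s,
                          "voxel_count": round(base_voxel_count * gain)}
        return same_resolution, same_scan_time

    print(scan_tradeoffs(2.84, base_scan_time_s=60.0, base_voxel_count=1_000_000))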
[0037] FIGS. 5 and 6 illustrate examples 500 and 600 of a hand-held electronic device including hardware modification components 502 and 602, respectively, according to some implementations. In the example of FIG. 5, the hardware modification component 502 includes a single aperture correcting optical element 504, such as the correcting optical element 302 of FIG. 3. In this example, the hardware modification component 502 may also include an ultra-wide angle lens 506, a telephoto lens 508, a wide angle lens 510, and a lidar lens 512. Each of the lenses 506-512 may be clear or act to adjust a FOV or other characteristics associated with the corresponding sensor, emitter, or combination thereof.
[0038] In this example, the hardware modification component 502 may include one or more coupling elements to couple the hardware modification component 502 to a hand-held electronic device 514, such that each of the single aperture correcting optical element 504 and the other lenses 506-512 aligns with the corresponding sensor of the electronic device 514. In some cases, the hardware modification component 502 may extend around the edge of the electronic device 514, as shown, while in other cases the hardware modification component 502 may be surface mounted to the back of the electronic device 514 via the coupling elements.
[0039] In some implementations, the hardware modification component 502 may be formed of a material (such as a plastic) that is selected to form the single aperture correcting optical element 504, such that the hardware modification component 502 and the optical element 504 are formed from the same material. In some cases, the hardware modification component 502 may be thin, from half a millimeter up to 3 millimeters in thickness. In some implementations, the entire hardware modification component 502 may be of similar thickness and transparent, so that it may be used to cover each of the sensor systems without the need for individual lenses.
[0040] In the example of FIG. 6, the hardware modification component 602 includes a dual or multi-aperture correcting optical element 604, such as the correcting optical element 402 of FIG. 4. In this example, the hardware modification component 602 includes one aperture for the emitter and one aperture for the detector, with each aperture aligned to its respective sensor aperture.
[0041] Similar to the example of FIG. 5, the hardware modification component 602 may also include the ultra-wide angle lens 506, the telephoto lens 508, the wide angle lens 510, and the lidar lens 512. Each of the lenses 506-512 may be clear or act to adjust a FOV or other characteristics associated with the corresponding sensor, emitter, or combination thereof. Again, the hardware modification component 602 may also include one or more coupling elements to couple the hardware modification component 602 to a hand-held electronic device 614, as discussed above.
[0042] The integrated hardware modification components 502 and 602 are advantageous over conventional systems in that the physical attachments are all contained within a single physical component. This allows for the alignment of the optical element(s) 504 or 604 with respect to the emitter and photodetector by aligning one or more physical reference point(s) present in the electronic device 514, such as the corner(s) of the sensor plateau or the edge(s) of the device 514.
[0043] As discussed herein, the hardware modification components are configured to narrow a FOV of a sensor system to improve the accuracy and reduce the amount of time required to generate an object scan or 3D object data of an object. However, it should be understood that the hardware modification component may be used in reverse, to expand the FOV of a device that is used for object scanning for use with other applications, such as environment or room scanning. In this case, concave (i.e., diverging) lenses should be used instead of convex lenses, including their Fresnel and diffractive equivalent forms.
[0044] FIG. 7 is a flow diagram illustrating an example process associated with generating a three-dimensional scene according to some implementations. The process is illustrated as a collection of blocks in a logical flow diagram, which represent a sequence of operations, some or all of which can be implemented in hardware, software, or a combination thereof. In the context of software, the blocks represent computer-executable instructions stored on one or more computer-readable media that, when executed by one or more processors, perform the recited operations. Generally, computer-executable instructions include routines, programs, objects, components, encryption, deciphering, compressing, recording, data structures, and the like that perform particular functions or implement particular abstract data types.
[0045] The order in which the operations are described should not be construed as a limitation. Any number of the described blocks can be combined in any order and/or in parallel to implement the process, or alternative processes, and not all of the blocks need be executed. For discussion purposes, the processes herein are described with reference to the frameworks, architectures and environments described in the examples herein, although the processes may be implemented in a wide variety of other frameworks, architectures or environments.
[0046] FIG. 7 illustrates an example flow diagram showing a process 700 for determining object data via an electronic device equipped with a hardware modification component according to some implementations. As discussed above, by narrowing the FOV of the sensors of the electronic device using a hardware modification component, the sensors may be used to provide a higher level of detail with respect to the scanned environment. In some cases, the device may host or otherwise download calibration and/or depth map determining instructions that may assist the electronic device in determining object data while the hardware modification component is attached or equipped.
[0047] At 702, the electronic device may determine whether a hardware modification component has been equipped. For example, the device may detect a coupling of the hardware modification component to the exterior of the device. In other cases, the device may detect the coupling via a failure of the original depth mapping between the emitter and/or the sensor (e.g., the photodetector). In still other cases, a user may select an option on a user interface indicating the hardware modification component is in use, such as an on/off toggle or the like.
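Purely as a hedged sketch of the decision at 702 (the function name, inputs, and threshold value are illustrative assumptions, not part of the disclosure), the logic might be expressed as:

    def modification_component_equipped(ui_toggle_on, mapping_failure_rate,
                                        failure_threshold=0.2):
        # ui_toggle_on: user-selected switch from the settings interface.
        # mapping_failure_rate: fraction of emitter pixels whose returns no longer
        # land on the expected detector pixels under the original depth mapping.
        # The 0.2 threshold is an arbitrary illustrative value.
        return ui_toggle_on or mapping_failure_rate > failure_threshold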
[0048] At 704, the electronic device may determine a new depth map associated with the sensor system. For example, each pixel of the original depth map in the emitter array corresponds to a given point in real physical space that may be determined by the pixel's unique horizontal and vertical directions. By counting the amount of time a pulse of light takes to travel from the emitter to the sensor or photodetector, the device is able to determine the three-dimensional coordinates of that specific point in the depth map (e.g., the 3D physical environmental coordinates). However, when the hardware modification component is equipped, the hardware modification component changes the pixel-to-angle mapping, and a new depth map should be determined and applied. The remapping may be done on an individual basis for each device. In some cases, the remapping may be performed using the following equation: FOV_i = FOV'(N/2 - i)/FOV_o, where FOV_i is the horizontal (vertical) FOV of the pixel in the i-th row (column) of the re-calibrated or new depth map (with i starting from one), N is the sensor resolution in the horizontal (vertical) direction, FOV_o is the original FOV of the 3D sensor, and FOV' is the new FOV after using the FOV corrector. It should be understood that the relation does not necessarily have to be linear, in which case a non-linear equation can be used in place of the equation above. For example, a polynomial approximation of the actual relation or, for greater generality, a look-up table can be used to implement any arbitrary relation.
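As an illustration only, the following Python sketch implements the relation of paragraph [0048] exactly as stated, along with a look-up-table alternative for the non-linear case; the function name, resolution, and example FOV values are assumptions introduced here:

    def recalibrated_pixel_fov(i, n_pixels, fov_original_deg, fov_new_deg):
        # Per-pixel angle under the narrowed FOV, per the linear relation above;
        # i starts from one, counting rows (or columns) of the depth map.
        return fov_new_deg * (n_pixels / 2 - i) / fov_original_deg

    # A look-up table (or a polynomial fit) can replace the linear relation when
    # the actual pixel-to-angle mapping is non-linear:
    lut = {i: recalibrated_pixel_fov(i, 480, 77.5, 46.0) for i in range(1, 481)}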
[0049] At 706, the electronic device may determine, for individual pixels, an amount of time for light emitted by the emitter to travel to the sensor, based on the new depth map. For example, the device may record when the emitter outputs the light and the time (or clock signal) at which the light is detected by the sensor or photodetector for each pixel of the new depth map.
[0050] At 708, the electronic device may determine, for the individual pixels of the new depth map, three-dimensional coordinates of the physical environment based at least in part on the amount of time.
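A minimal sketch of the computation at 706-708, assuming a co-located emitter and detector and re-calibrated per-pixel azimuth/elevation angles taken from the new depth map (these conventions and the function name are assumptions, not part of the disclosure):

    import math

    SPEED_OF_LIGHT_M_PER_S = 299_792_458.0

    def pixel_to_xyz(round_trip_time_s, azimuth_deg, elevation_deg):
        # Range is half the round-trip distance travelled by the light pulse.
        r = 0.5 * SPEED_OF_LIGHT_M_PER_S * round_trip_time_s
        az = math.radians(azimuth_deg)
        el = math.radians(elevation_deg)
        # Convert the viewing direction and range into 3D coordinates, with z
        # taken along the sensor's optical axis.
        x = r * math.cos(el) * math.sin(az)
        y = r * math.sin(el)
        z = r * math.cos(el) * math.cos(az)
        return x, y, z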
[0051] At 710, the electronic device may determine object data based at least in part on the per-pixel three-dimensional coordinates. For example, the device may convert the three-dimensional coordinates of the new depth map for one or more scans into object data using one or more techniques. In some cases, the object data may include 3D representations and/or a mesh representing the object being scanned. In addition, the three-dimensional coordinates of each new depth map for each scan may be used for applications including simultaneous localization and mapping (SLAM), object scanning, mesh creation and texturing, odometry, and the like.

[0052] FIG. 8 is an example electronic device 800 hosting calibration and object scanning software for an equipped hardware modification component according to some implementations. As discussed above, the device 800 may be equipped with a hardware modification component including a lens or optical element that may change the FOV of the sensors of the device to, for instance, improve the accuracy and time associated with scanning an object. The device 800 may also be equipped with software or instructions to assist with generating object data associated with the scanned objects.

[0053] The device 800 may include one or more emitters 802. The emitters 802 may be mounted on an exterior surface of the device 800 in order to output illumination or light into a physical environment. The emitters 802 may include, but are not limited to, visible light emitters, infrared emitters, ultraviolet light emitters, LIDAR systems, and the like. In some cases, the emitters 802 may output light in predetermined patterns, varying wavelengths, or at various time intervals (e.g., such as pulsed light).
[0054] The device 800 may also include one or more sensors 804. The sensors 804 may include imaging devices, spectral sensors, LIDAR sensors, depth sensors, infrared sensors, or other photoreceptors capable of detecting light within an environment.

[0055] The device 800 may also include one or more communication interfaces 806 configured to facilitate communication between one or more networks, one or more cloud-based system(s), and/or one or more mobile or user devices. In some cases, the communication interfaces 806 may be configured to send and receive data associated with objects being scanned within the physical environment (such as the object data, object scans, stored models of one or more objects, or the like). The communication interface(s) 806 may enable Wi-Fi-based communication such as via frequencies defined by the IEEE 802.11 standards, short range wireless frequencies such as Bluetooth, cellular communication (e.g., 2G, 3G, 4G, 4G LTE, 5G, etc.), satellite communication, dedicated short-range communications (DSRC), or any suitable wired or wireless communications protocol that enables the respective computing device to interface with the other computing device(s).
[0056] In the illustrated example, the device 800 also includes an input and/or output interface 808, such as a projector, a virtual environment display, a traditional 2D display, buttons, knobs, and/or other input/output interface. For instance, in one example, the interfaces 808 may include a flat display surface, such as a touch screen configured to allow a user of the device 800 to consume content.
[0057] The device 800 may also include one or more processors 810, such as at least one or more access components, control logic circuits, central processing units, or processors, as well as one or more computer-readable media 812 to perform the functions associated with the virtual environment. Additionally, each of the processors 810 may itself comprise one or more processors or processing cores.
[0058] Depending on the configuration, the computer-readable media 812 may be an example of tangible non-transitory computer storage media and may include volatile and nonvolatile memory and/or removable and non-removable media implemented in any type of technology for storage of information such as computer-readable instructions or modules, data structures, program modules, or other data. Such computer-readable media may include, but are not limited to, RAM, ROM, EEPROM, flash memory or other computer-readable media technology, CD-ROM, digital versatile disks (DVD) or other optical storage, magnetic cassettes, magnetic tape, solid state storage, magnetic disk storage, RAID storage systems, storage arrays, network attached storage, storage area networks, cloud storage, or any other medium that can be used to store information and which can be accessed by the processors 810.

[0059] Several modules, such as instructions, data stores, and so forth, may be stored within the computer-readable media 812 and configured to execute on the processors 810. For example, as illustrated, the computer-readable media 812 may include scanning instructions 814, depth map generation instructions 816, object data generating instructions 818, modeling instructions 820, as well as other instructions 822. The computer-readable media 812 may also store data such as depth map data 824, field of view data 826, object data 828, object models 830, and environment data 832.

[0060] The scanning instructions 814 may be configured to cause the emitters 802 and the sensors 804 to scan or otherwise map the physical environment surrounding the device 800. For example, the emitters 802 may output light and the photoreceptors may capture the light as it returns to the device 800.
[0061] The depth map generation instructions 816 may be configured to modify a depth mapping usable to determine 3D coordinates between light output by the emitters 802 and detected by the sensors 804. For example, the depth map generation instructions 816 may modify the depth mapping based on the FOV data 826 (e.g., the original FOV and the modified FOV).
[0062] The object data generating instructions 818 may be configured to generate object data 828 based on the depth map and the 3D coordinates determined during scanning of the physical environment. For example, the object data 828 may include meshes, contours, surfaces, point clouds, and the like.
[0063] The modeling instructions 820 may be configured to generate 3D models or object model data 830 from the object data 828. For example, the modeling instructions 820 may generate CAD or other professional 3D models of the object based on the object data 828.

[0064] Although the subject matter has been described in language specific to structural features, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features described. Rather, the specific features are disclosed as illustrative forms of implementing the claims.

Claims

WHAT IS CLAIMED IS:
1. A hardware modification component comprising:
a coupling mechanism to releasably couple the hardware modification component to an electronic device; and
an optical element aligned with a field of view of a sensor system of the electronic device and configured to modify the field of view of the sensor system when the hardware modification component is coupled.
2. The hardware modification component as recited in claim 1, wherein modifying the field of view of the sensor system includes adjusting a pixel-to-angle mapping between an emitter of the sensor system and a photoreceptor of the sensor system.
3. The hardware modification component as recited in claim 1, wherein the optical element narrows the field of view and increases a number of pixels per volume associated with a scan of a physical environment.
4. The hardware modification component as recited in claim 1, wherein: the sensor system includes a photoreceptor and an emitter; and the optical element includes a first optical element aligned with a field of view of the photoreceptor and a second optical element aligned with the field of view of the emitter.
5. The hardware modification component as recited in claim 1, wherein the optical element includes discontinuities that are integers of the wavelength of light.
6. The hardware modification component as recited in claim 1, further comprising:
an ultra-wide angle lens aligned with a first wide angle sensor of the electronic device when the hardware modification component is coupled to the electronic device;
a telephoto lens aligned with a telephoto sensor of the electronic device when the hardware modification component is coupled to the electronic device;
a wide angle lens aligned with a second wide angle sensor of the electronic device when the hardware modification component is coupled to the electronic device; and
a lidar lens aligned with a lidar sensor of the electronic device when the hardware modification component is coupled to the electronic device.
7. The hardware modification component as recited in claim 1, wherein the optical element is configured for wavelengths between 700 nm and 1000 nm.
8. The hardware modification component as recited in claim 1, wherein the optical element is configured for wavelengths between 935 nm and 945 nm.
9. A system comprising:
a hardware modification component comprising:
    a coupling mechanism to releasably couple the hardware modification component to an electronic device; and
    an optical element configured to generate a modified field of view for the sensor system when the hardware modification component is coupled;
the electronic device comprising:
    the sensor system configured to generate data associated with a physical environment surrounding the electronic device;
    one or more processors; and
    one or more non-transitory computer-readable media storing instructions that, when executed by the one or more processors, cause the one or more processors to perform operations comprising:
        determining a depth map associated with the sensor system based at least in part on an original field of view of the sensor system and the modified field of view of the sensor system.
10. The system as recited in claim 9, wherein the operations further comprise:
determining, for each pixel of the depth map, an amount of time for light generated by an emitter of the sensor system to be detected by a photoreceptor of the sensor system;
determining, for each pixel of the depth map, a three-dimensional coordinate associated with the physical environment; and
determining object data based at least in part on the three-dimensional coordinates.
11. The system as recited in claim 9, wherein the optical element includes discontinuities that are integers of the wavelength of light.
12. The system as recited in claim 9, wherein the hardware modification component further comprises:
an ultra-wide angle lens aligned with a first wide angle sensor of the electronic device when the hardware modification component is coupled to the electronic device;
a telephoto lens aligned with a telephoto sensor of the electronic device when the hardware modification component is coupled to the electronic device;
a wide angle lens aligned with a second wide angle sensor of the electronic device when the hardware modification component is coupled to the electronic device; and
a lidar lens aligned with a lidar sensor of the electronic device when the hardware modification component is coupled to the electronic device.
13. The system as recited in claim 9, wherein the optical element is configured for wavelengths between 935 nm and 945 nm.
14. The system as recited in claim 9, wherein the optical element is at least one of: a Fresnel lens; a plano-convex lens; or a diffractive lens.
15. A hardware modification component comprising:
a coupling mechanism to releasably couple the hardware modification component to an electronic device;
a first optical element aligned with a field of view of an emitter of a sensor system of the electronic device and configured to modify the field of view of the sensor system when the hardware modification component is coupled; and
a second optical element aligned with a field of view of a photoreceptor of the sensor system of the electronic device and configured to modify the field of view of the sensor system when the hardware modification component is coupled.
16. The hardware modification component as recited in claim 15, wherein the first optical element is configured for wavelengths between 700 nm and 1000 nm.
17. The hardware modification component as recited in claim 15, wherein the first optical element is configured for wavelengths between 935 nm and 945 nm.
18. The hardware modification component as recited in claim 15, further comprising:
an ultra-wide angle lens aligned with a first wide angle sensor of the electronic device when the hardware modification component is coupled to the electronic device;
a telephoto lens aligned with a telephoto sensor of the electronic device when the hardware modification component is coupled to the electronic device;
a wide angle lens aligned with a second wide angle sensor of the electronic device when the hardware modification component is coupled to the electronic device; and
a lidar lens aligned with a lidar sensor of the electronic device when the hardware modification component is coupled to the electronic device.
19. The hardware modification component as recited in claim 15, wherein the first optical element is configured to narrow the field of view of the sensor system and the second optical element is configured to narrow the field of view of the photoreceptor.
20. The hardware modification component as recited in claim 15, wherein the first optical element is configured to expand the field of view of the sensor system and the second optical element is configured to expand the field of view of the photoreceptor.
PCT/US2022/072138 2021-05-07 2022-05-05 System and method for converting three-dimensional sensors for object and limb scanning WO2022236304A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US202163201657P 2021-05-07 2021-05-07
US63/201,657 2021-05-07

Publications (1)

Publication Number Publication Date
WO2022236304A1 true WO2022236304A1 (en) 2022-11-10

Family

ID=82020243

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2022/072138 WO2022236304A1 (en) 2021-05-07 2022-05-05 System and method for converting three-dimensional sensors for object and limb scanning

Country Status (1)

Country Link
WO (1) WO2022236304A1 (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2013070091A2 (en) * 2011-11-09 2013-05-16 Mark Ross Hampton Improvements in and in relation to a lens system for a camera
US20150220766A1 (en) * 2012-10-04 2015-08-06 The Code Corporation Barcode-reading enhancement system for a computing device that comprises a camera and an illumination system
US20170374253A1 (en) * 2016-06-24 2017-12-28 Moondog Optics, Inc. Lens attachment for multi-camera device
US20180100929A1 (en) * 2016-09-25 2018-04-12 James Thomas O'Keeffe Remote lidar with coherent fiber optic image bundle

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22730021

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 22730021

Country of ref document: EP

Kind code of ref document: A1