US20150042757A1 - Laser scanning systems and methods - Google Patents

Laser scanning systems and methods

Info

Publication number
US20150042757A1
US20150042757A1 (application US 14/456,052)
Authority
US
United States
Prior art keywords
calibration
turntable
camera
laser
component
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US14/456,052
Inventor
Taylor S. Goodman
Vishnu Anantha
Benjamin R. McCallum
Jamie M. Charry
William B. Buel
Quynh Dinh
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
MakerBot Industries LLC
Original Assignee
MakerBot Industries LLC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by MakerBot Industries LLC filed Critical MakerBot Industries LLC
Priority to US14/456,052
Assigned to MAKERBOT INDUSTRIES, LLC. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: BUEL, WILLIAM B., ANANTHA, VISHNU, DINH, QUYNH, MCCALLUM, BENJAMIN R., CHARRY, JAMIE M., GOODMAN, TAYLOR S.
Publication of US20150042757A1

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20Image signal generators
    • H04N13/257Colour aspects
    • G06T7/002
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01BMEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00Measuring arrangements characterised by the use of optical techniques
    • G01B11/24Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
    • G01B11/25Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object
    • G01B11/2504Calibration devices
    • FMECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
    • F21LIGHTING
    • F21VFUNCTIONAL FEATURES OR DETAILS OF LIGHTING DEVICES OR SYSTEMS THEREOF; STRUCTURAL COMBINATIONS OF LIGHTING DEVICES WITH OTHER ARTICLES, NOT OTHERWISE PROVIDED FOR
    • F21V14/00Controlling the distribution of the light emitted by adjustment of elements
    • F21V14/02Controlling the distribution of the light emitted by adjustment of elements by movement of light sources
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01BMEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00Measuring arrangements characterised by the use of optical techniques
    • G01B11/24Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01BMEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00Measuring arrangements characterised by the use of optical techniques
    • G01B11/24Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
    • G01B11/25Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object
    • G01B11/2518Projection by scanning of the object
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C15/00Surveying instruments or accessories not provided for in groups G01C1/00 - G01C13/00
    • G01C15/002Active optical surveying means
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C15/00Surveying instruments or accessories not provided for in groups G01C1/00 - G01C13/00
    • G01C15/002Active optical surveying means
    • G01C15/004Reference lines, planes or sectors
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/003D [Three Dimensional] image rendering
    • G06T15/50Lighting effects
    • G06T15/60Shadow generation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T17/00Three dimensional [3D] modelling, e.g. data description of 3D objects
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/80Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G06T7/85Stereo camera calibration
    • H04N13/0246
    • H04N13/0296
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N17/00Diagnosis, testing or measuring for television systems or their details
    • H04N17/002Diagnosis, testing or measuring for television systems or their details for television cameras
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/62Control of parameters via user interfaces
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/80Camera processing pipelines; Components thereof
    • H04N23/84Camera processing pipelines; Components thereof for processing colour signals
    • H04N23/88Camera processing pipelines; Components thereof for processing colour signals for colour balance, e.g. white-balance circuits or colour temperature control
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10004Still image; Photographic image
    • G06T2207/10012Stereo images
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N2209/00Details of colour television systems
    • H04N2209/04Picture signal generators
    • H04N2209/041Picture signal generators using solid-state devices
    • H04N2209/042Picture signal generators using solid-state devices having a single pick-up sensor

Definitions

  • a three-dimensional scanner uses a rotatable mounting structure to secure a laser line source in a manner that permits rotation of a projected laser line about an axis of the laser, along with movement of the laser through an arc in order to conveniently position and orient the resulting laser line.
  • a progressive calibration scheme may be employed with a calibration fixture to calibrate a camera, a turntable, and a laser for coordinated use as a three-dimensional scanner.
  • parameters for a scan may be automatically created to control, e.g., laser intensity and camera exposure based on characteristics of a scan subject such as surface characteristics or color gradient.
  • FIG. 1 shows a three-dimensional scanner.
  • FIG. 2 shows a block diagram of a three-dimensional scanner system.
  • FIG. 3 shows a perspective view of a device for aligning a laser.
  • FIG. 4 shows a cross section of a laser housing.
  • FIG. 5 shows a calibration component.
  • FIG. 6 shows a method for calibrating a three-dimensional scanner.
  • FIG. 7 shows a user interface for automatically selecting three-dimensional scan parameters.
  • FIG. 8 shows a method for automatically selecting three-dimensional scan parameters.
  • FIG. 1 shows a three-dimensional scanner.
  • the scanner 100 may include a turntable 102 , one or more lasers 104 , a camera 106 , and a controller 108 .
  • the turntable 102 may be any rotating surface such as a rigid plate or the like, which may be rotated to present various surfaces of an object on the turntable to the lasers 104 and the camera 106 .
  • the one or more lasers 104 may be any lasers suitable for projecting lines onto an object that is being scanned on the turntable 102 .
  • the lasers 104 can be 3.2 V line lasers or the like with 55 degree fans or any other laser or combination of lasers suitable for a three-dimensional scanning system.
  • the camera 106 can be a USB 2.0 Board Camera. While any resolution consistent with desired scan resolution may be used, a 1.3 MP or better color complementary metal-oxide semiconductor (CMOS) image sensor is commercially available at low cost and suitable for many applications.
  • the camera 106 can, for example, operate at 30 frames-per-second with a rolling shutter and a 12 inch focal distance. In another aspect, the camera 106 can operate at 7.5 frames-per-second. In other aspects, the camera can be any camera that can work in a three-dimensional scanning system.
  • the camera 106 can also take video footage and provide a video feed to a user device 206 (as shown in FIG. 2 ) via a user interface.
  • the scanner 100 can also include a red band-pass filter for the camera 106 , which may be fixed or removable/replaceable.
  • the filter may, for example, be a 25 mm or 27 mm removable and/or retractable 650 nm CW band-pass filter with a 40 nm pass band.
  • the band-pass filter can remain on the camera 106 during scans for optimal results. In another aspect, the band-pass filter can be removed for a scan.
  • an item can be placed on the turntable 102 .
  • the lasers 104 can create laser lines that reflect off the object.
  • the camera 106 can take rapid photographs of the laser lines and a point cloud can be generated via the controller 108 connected to the scanner 100 .
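The reconstruction step described above can be sketched in a few lines. This is a minimal, hypothetical model (a pinhole camera with the laser plane offset by an assumed baseline and fan angle; none of the geometry constants come from the patent), but it shows how a detected laser-line pixel plus the turntable angle become one point of the cloud:

```python
import math

def ray_plane_intersect(d, p0, n):
    """Intersect the camera ray t*d (camera at the origin) with the
    laser plane through point p0 with normal n."""
    denom = sum(di * ni for di, ni in zip(d, n))
    if abs(denom) < 1e-9:
        return None  # ray is parallel to the laser plane
    t = sum(pi * ni for pi, ni in zip(p0, n)) / denom
    return tuple(t * di for di in d)

def pixel_to_point(u, v, f, baseline, alpha, turn_deg):
    """Map a laser pixel (u, v) at focal length f to a 3D point, then
    rotate it back by the turntable angle so that points from every
    frame land in one object-fixed coordinate frame."""
    d = (u / f, v / f, 1.0)                       # pinhole camera ray
    n = (math.cos(alpha), 0.0, math.sin(alpha))   # laser plane normal
    p = ray_plane_intersect(d, (baseline, 0.0, 0.0), n)
    if p is None:
        return None
    a = math.radians(-turn_deg)                   # undo turntable rotation
    x, y, z = p
    return (x * math.cos(a) + z * math.sin(a), y,
            -x * math.sin(a) + z * math.cos(a))
```

Repeating this for every detected pixel in every captured frame yields the kind of point cloud the controller 108 assembles.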
  • the controller 108 can be electrically or otherwise coupled in a communicating relationship with the turntable 102 , the lasers 104 and the camera 106 .
  • the controller 108 is operable to control the components of the scanner 100 .
  • the controller 108 may include any combination of software and/or processing circuitry suitable for controlling the various components of the scanner 100 described herein including without limitation microprocessors, microcontrollers, application-specific integrated circuits, programmable gate arrays, and any other digital and/or analog components, as well as combinations of the foregoing, along with inputs and outputs for transceiving control signals, drive signals, power signals, sensor signals, and so forth.
  • this may include circuitry directly and physically associated with the scanner 100 such as an on-board processor.
  • this may be a processor associated with a personal computer or other computing device (e.g., a user device 206 as shown in FIG. 2 ) coupled to the scanner 100 , e.g., through a wired or wireless connection.
  • more generally, any such processing circuitry may serve as a “controller” or “processor” as used herein, unless a different meaning is explicitly provided or otherwise clear from the context.
  • FIG. 2 shows a three-dimensional scanner system 200 .
  • the scanner 100 can be coupled to a user device 206 via a USB cable or any other connector used for locally connecting electronic devices to each other.
  • the scanner 100 can alternatively or additionally be coupled to the user device 206 through a data network 202 .
  • the data network 202 may be any network(s) or internetwork(s) suitable for communicating data and control information among participants in the environment 200 .
  • This may include public networks such as the Internet, private networks, telecommunications networks such as the Public Switched Telephone Network or cellular networks using third generation (e.g., 3G or IMT-2000), fourth generation (e.g., LTE (E-UTRA) or WiMax-Advanced (IEEE 802.16m)) and/or other technologies, as well as any of a variety of corporate area or local area networks and other switches, routers, hubs, gateways, and the like that might be used to carry data among participants.
  • the scanner 100 can include a network interface for connecting to the data network 202 .
  • the network interface may comprise, e.g., a network interface card, which term is used broadly herein to include any hardware (along with software, firmware, or the like to control operation of same) suitable for establishing and maintaining wired and/or wireless communications.
  • the network interface card may include without limitation wired Ethernet network interface cards (“NICs”), wireless 802.11 networking cards, wireless 802.11 USB devices, or other hardware for wireless local area networking.
  • the network interface may also or instead include cellular network hardware, wide area wireless network hardware or any other hardware for centralized, ad hoc, peer-to-peer, or other radio communications that might be used to carry data.
  • the network interface may include a serial or USB port to directly connect to a computing device such as a desktop computer that, in turn, provides more general network connectivity to the data network.
  • the user device 206 may be a computing device such as a laptop computer, desktop computer, tablet, smart phone, or other computing device that can be operated by a user to provide a user input to control the scanner 100 .
  • the scanner 100 may be configured with a display, user input devices, and the like so that the scanner 100 acts as the user device 206 .
  • the user input devices may include a display, buttons, or other physical user interface element(s) on the scanner 100 that a user can interact with.
  • the scanner 100 can begin analyzing the object that is placed on the turntable 102 via the controller 108 .
  • the controller 108 creates a point cloud
  • the user device 206 can convert the point cloud into a viewable mesh that can be saved as a Thing file, STL, or other supported mesh formats.
  • the object can revolve on the turntable twice.
  • the right laser 104 can create a laser line that reflects off of the object during the first revolution and the left laser 104 can create a laser line that reflects off of the object during the second revolution.
  • the left laser 104 can create a laser line that reflects off of the object during the first revolution and the right laser 104 can create a laser line that reflects off of the object during the second revolution. In another aspect only one of the lasers can scan the object during the scan.
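The two-revolution sequence above amounts to a simple trigger schedule. A sketch, with an illustrative step size and laser labels that are assumptions rather than terms from the patent:

```python
def scan_schedule(step_deg=1.0, lasers=("right", "left")):
    """One full turntable revolution per laser, triggering a camera
    frame every step_deg degrees; returns (laser, angle) pairs in
    firing order."""
    frames_per_rev = int(360 // step_deg)
    return [(laser, i * step_deg)
            for laser in lasers
            for i in range(frames_per_rev)]
```

With `step_deg=45.0` this yields eight frames for the right laser followed by eight for the left, one pair of revolutions in total.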
  • the information from the camera 106 or the two or more lasers 104 can be combined to create a point cloud.
  • the user device 206 can convert the point cloud into a continuous mesh via any combination of software and/or processing circuitry located on the user device 206 .
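As a sketch of what exporting such a mesh can look like in the STL case, the ASCII variant of STL needs only string formatting: each facet stores a right-hand-rule normal and three vertices. This generic serializer is an illustration, not the patent's implementation:

```python
def facet_normal(a, b, c):
    """Right-hand-rule unit normal of triangle (a, b, c)."""
    u = [b[i] - a[i] for i in range(3)]
    v = [c[i] - a[i] for i in range(3)]
    n = [u[1] * v[2] - u[2] * v[1],
         u[2] * v[0] - u[0] * v[2],
         u[0] * v[1] - u[1] * v[0]]
    mag = (n[0] ** 2 + n[1] ** 2 + n[2] ** 2) ** 0.5 or 1.0  # guard degenerate
    return [x / mag for x in n]

def write_ascii_stl(triangles, name="scan"):
    """Serialize an iterable of (a, b, c) vertex triples to an ASCII
    STL string."""
    lines = [f"solid {name}"]
    for a, b, c in triangles:
        n = facet_normal(a, b, c)
        lines.append(f"  facet normal {n[0]:.6e} {n[1]:.6e} {n[2]:.6e}")
        lines.append("    outer loop")
        for vtx in (a, b, c):
            lines.append(f"      vertex {vtx[0]:.6e} {vtx[1]:.6e} {vtx[2]:.6e}")
        lines.append("    endloop")
        lines.append("  endfacet")
    lines.append(f"endsolid {name}")
    return "\n".join(lines)
```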
  • the three-dimensional scanner system 200 can be used for scanning, calibration and automatically sending the scan data to a social networking platform hosted, e.g., on a server 204 , which may be a general social networking platform or a special purpose platform dedicated to, e.g., three-dimensional printing, three-dimensional modeling, computer automated design, or the like.
  • the server 204 may include data storage, a network interface, and a processor and/or processing circuitry. In general, the server 204 may be configured to perform a variety of processing tasks related to the three-dimensional scanning of objects. For example, the server 204 may manage scan jobs received from one or more of the user devices 206 , and provide related supporting functions such as content search and management. The server 204 may also include a web server that provides web-based access by the user device 206 to the capabilities of the server 204 . The server 204 may also communicate periodically with the scanner 100 in order to obtain status information concerning, e.g., the status of particular scan jobs, any of which may be subsequently presented to a user through the web server or any other suitable interface.
  • scanning can begin.
  • the user input can first include clicking a physical or digital button to automatically back up the scans to the server 204 via the data network 202 .
  • the processor on the user device 206 or the scanner 100 can prompt this user input either before the first scan of the scanner 100 and/or before or after every scan.
  • User input can be entered via the user interface on the user device 206 to initiate the scan.
  • the user interface can prompt the user to place an object on the turntable 102 in the correct position.
  • the user interface can prompt the user to place the object on the center of the turntable 102 .
  • the user interface can prompt the user to place the object on a positioning stand (not pictured). More generally, the user interface may provide step-by-step menus or other interactive elements to guide a user through a scanning procedure.
  • a positioning stand can be used to secure objects that are not stable without support. The stand can also be used to elevate small objects or to secure an object in a specific orientation.
  • the positioning stand can comprise a platform, a rod and one or more arms.
  • a video feed can be shown via the user interface on the user device 206 to assist a user in placing an object.
  • the user interface can prompt the user to start the scan.
  • the user interface can show the time remaining in the scan, a video feed of the object as it is being scanned, and/or a point cloud assembly as it is being generated via the controller 108 .
  • a user can be given an option of cropping the scanned object model.
  • Other post-scanning features such as smoothing, leveling, labeling (e.g., with three-dimensional text or the like), hole filling, and so forth may also be automatically or semi-automatically performed through the user interface.
  • the model can be (automatically or otherwise) shared with a social network, sent to a three-dimensional printer coupled to the scanner 100 for printing, and/or exported for saving.
  • the user interface can allow the user device 206 to take a picture of the scanned object and associate it with the three-dimensional model. For example, a “Take a Photo” dialog can open and show a view of what the scanner 100 sees via the camera. A prompt to slide the red band-pass filter away from the camera lens can be shown before the picture is taken.
  • the scanner 100 may provide processing circuitry to control operation of the scanner systems contemplated herein, such as by performing the various processes and functions described below.
  • FIG. 3 shows a device for aligning a laser.
  • a scanner may have one or more lasers, as noted above, which are preferably aligned to project a line or other pattern in a predetermined manner across a scanning volume.
  • the device 300 may be used to align a laser 302 so that a desired orientation may be obtained.
  • the laser 302 may be any of the lasers described above, or any other laser that can be configured to project a line or other pattern on a target of a scan.
  • the laser 302 may, for example, be a 3.2 Volt line laser, and/or the laser may have a 55 degree fan over which a laser line is projected.
  • the pattern may be a line or the like, which may be obtained using any suitable optics, filters or the like to focus and direct light from the laser.
  • the laser 302 may have an axis 304 with an orientation that, when the laser 302 is placed for use in the device 300 (which is in turn mounted in the scanner), directs the laser 302 toward a desired target, or more generally, in a desired direction.
  • the laser 302 and a line or other pattern from the laser 302 may have a rotational orientation about the axis 304 .
  • the rotational orientation of the laser 302 may be controlled.
  • a laser housing 308 secures the laser 302 in a desired orientation within a mount 316 .
  • the laser housing 308 may include a cavity 310 to receive the laser 302 .
  • the laser housing 308 may also include a toothed wheel 312 with a plurality of teeth 314 radially spaced about the axis 304 of the laser 302 when the laser 302 is placed for use in the cavity 310 .
  • a mount 316 for the laser housing 308 may include a base 318 configured to be coupled to an external assembly such as a scanner housing.
  • the base 318 may include any suitable slots, tabs, registration features, screw holes, and the like, or any other suitable mechanism(s) for rigidly coupling to the external assembly in a fixed orientation.
  • a holder 320 of the mount 316 may be configured to retain the laser housing 308 in a predetermined orientation while permitting rotation of the laser housing 308 (and the laser 302 ) about the axis 304 of the laser 302 .
  • the holder 320 generally retains the laser housing 308 in rotational engagement about the axis 304 of the laser 302 .
  • the mount 316 may further include a hinge 322 that hingably couples the base 318 to the holder 320 .
  • the laser housing 308 may be configured to snap-fit into the mount 316 where it may be retained by a number of fingers 340 , flanges, or the like, or alternatively stated, the mount 316 may be configured to receive the laser housing 308 and retain the laser housing 308 with any of a variety of snap-fit mechanisms.
  • an adjustment wheel 324 may be operable as a thumb click wheel or the like, or a supplemental drive wheel 325 may be provided for manual or automated actuation.
  • the adjustment wheel 324 is operable to rotate the laser housing 308 around the axis 304 of the laser 302 . This rotation may be performed, e.g., by manually rotating the adjustment wheel 324 , or by rotating the supplemental adjustment wheel 325 with a motor or other electro-mechanical drive.
  • the adjustment wheel 324 may be ratcheted or otherwise mechanically secured by the plurality of teeth 314 against free rotation after a desired rotational orientation has been established.
  • the adjustment wheel 324 may be a click wheel that moves in discrete units of rotation accompanied by an audible or tactile click.
  • the click wheel may be thumb operable, and may move in fixed increments such as three degree increments, or at any other suitable, regular intervals of rotation.
  • the click wheel may click against a nub, spring, or the like on the mount 316 .
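Because the click wheel moves only in discrete detents, any requested rotational orientation is effectively snapped to the nearest click. A one-line sketch, using the three degree increment mentioned above as a default:

```python
def nearest_click(angle_deg, increment_deg=3.0):
    """Snap a requested laser rotation to the nearest click-wheel
    detent, wrapping the result into [0, 360)."""
    return (round(angle_deg / increment_deg) * increment_deg) % 360.0
```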
  • An adjustment rod 326 may also be provided that couples the base 318 to the holder 320 at a position away from the hinge 322 .
  • the adjustment rod 326 may be operable to displace the base 318 relative to the holder 320 along a second axis 330 of the adjustment rod.
  • as the base 318 is displaced relative to the holder 320 , the hinge 322 rotates (or hinges), thus moving the axis 304 of the laser relative to the base 318 .
  • the adjustment rod 326 can be used to steer the axis 304 through an arc by flexing the hinge 322 .
  • the adjustment rod 326 may be a threaded rod that is threaded through a threaded insert 342 that is coupled to a fixed location such as a location in the base 318 or in the holder 320 .
  • the threaded insert 342 may travel along the threaded rod, thus moving the holder 320 relative to the base 318 and flexing the hinge 322 to reorient the laser 302 .
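The rod-travel-to-tilt relationship is plain trigonometry: the insert travels turns * pitch along the rod, swinging the holder about the hinge. The pitch and lever-arm values in the example are invented for illustration:

```python
import math

def laser_tilt_deg(turns, pitch_mm, lever_arm_mm):
    """Tilt of the laser axis when the threaded adjustment rod is
    turned: the threaded insert travels turns * pitch along the rod,
    swinging the holder about the hinge at distance lever_arm_mm."""
    travel_mm = turns * pitch_mm
    return math.degrees(math.atan2(travel_mm, lever_arm_mm))
```

For instance, two turns of a hypothetical 0.5 mm pitch rod acting 40 mm from the hinge tilt the axis by about 1.43 degrees, which is the kind of fine adjustment the mechanism provides.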
  • the laser housing 308 and the mount 316 may be formed of any suitable materials according to desired weight, strength, durability, and so forth.
  • the laser housing 308 and the mount 316 may be formed of an injection molded plastic such as a polycarbonate-acrylonitrile butadiene styrene blend or an acetal homopolymer.
  • the mount 316 may include a spring 350 such as a coil spring or any other suitable compression spring or the like that urges the holder 320 and the base 318 into a predetermined relative orientation.
  • This spring 350 may thus bias the laser 302 toward a predetermined orientation relative to the mount 316 when the laser 302 is placed for use in the cavity 310 of the holder 320 .
  • while the spring 350 may be a separate, discrete component, it may also or instead be a living plastic spring formed, for example, by a resilient material of the hinge 322 .
  • the living plastic spring (or any other spring 350 as contemplated herein) may generally bias the laser 302 toward any predetermined position or orientation such as toward a predetermined position relative to the mount 316 when the laser 302 is placed for use in the cavity 310 .
  • This configuration advantageously provides convenient positioning and rotation of a line laser within a scan volume with a relatively simple mechanical arrangement and a small number of moving parts. Additional adjustments may be necessary or desirable, and as such a supplemental positioning assembly 360 may be provided in order to provide additional degrees of rotational or translational freedom for adjusting the laser 302 .
  • the positioning assembly 360 may facilitate translation of the axis 304 within a plane perpendicular to the axis 304 , or alignment of the axis 304 of the laser 302 with one or more additional degrees of freedom, that is, degrees of freedom not provided by the mount 316 and laser housing 308 described above.
  • This may include any suitable fixture, set screws, and so forth, for adjusting position and orientation of the base 318 relative to a fixed physical reference that the base 318 is attached to (such as a scanner housing).
  • a variety of suitable mechanisms are known in the art and may be adapted for use as a positioning assembly 360 as contemplated herein.
  • FIG. 4 shows a cross section of a laser housing such as the laser housing described above.
  • the laser housing 400 may generally include a cavity 402 to receive a laser as described above.
  • the laser housing 400 may also include a plurality of engagement elements 404 such as ribs, fins, protrusions or the like within the cavity 402 that secure a laser in a desired position and orientation within the cavity 402 .
  • the engagement elements 404 may in general be shaped and sized in any suitable manner to hold a laser when the laser is positioned in the cavity.
  • the engagement elements 404 may include ribs as illustrated, which may secure the laser with a press-fit or interference fit to frictionally engage the laser in the desired position.
  • a second cavity 406 may be included that is formed to receive a drive head such as a screw driver, hex wrench or the like.
  • the second cavity 406 may be positioned within the mount 316 of FIG. 3 such that the second cavity 406 is accessible externally with a screw driver or the like to adjust the rotational orientation of the laser.
  • a portion of the adjustment wheel 324 may be exposed outside a scanner housing to facilitate convenient manual adjustment.
  • FIG. 5 shows a calibration component.
  • a scanner such as any of the scanners described herein may be calibrated prior to use in order to obtain more accurate scan results.
  • this involves placing a calibration component such as the calibration component 500 shown in FIG. 5 onto the turntable of a scanner and capturing images in a variety of poses and under a variety of lighting conditions.
  • the calibration component 500 may be a multi-part component that can be configured to present a variety of different surfaces, patterns and the like.
  • the calibration component 500 may have a base 502 with angled surfaces and a checkerboard pattern or the like, as well as a removable plate 504 that can be removed from and replaced to the base 502 to provide a horizontal surface for calibration-related data acquisition.
  • while a checkerboard is shown as the calibration pattern 506 , it will be understood that a variety of calibration patterns may also or instead be employed including, without limitation, a dot grid, a line grid, a random pattern, and so forth.
  • the calibration pattern 506 may also or instead include a predetermined three-dimensional shape of the calibration component 500 , such as the angled surfaces of the base 502 .
  • the calibration component 500 may include a plurality of surfaces. This may include at least three panels 510 , 512 , 514 each including the calibration pattern 506 (i.e., the same pattern) or different calibration patterns, or some combination of these.
  • the calibration component 500 may also include two different faces such as a first face formed by one of the panels 510 , and a second face formed by the other panels 512 , 514 .
  • one of the panels 510 may be removable and the face of the first panel 510 may occlude the calibration pattern on the other panels 512 , 514 when attached to the base 502 . This permits a single calibration fixture to provide various different patterns and three-dimensional shapes to facilitate various calibration steps as discussed below.
  • the calibration component 500 may include a tab 516 or other protrusion or the like configured to couple the calibration component 500 , or the base 502 of the calibration component 500 , to a turntable or other base for a scanning system in order to retain the calibration component 500 in a predetermined position and orientation during calibration. Any other number of tabs may be provided to secure the calibration component 500 , or the base 502 or one of the panels 510 , 512 , 514 in a desired orientation for use in calibrating a scanner.
  • FIG. 6 shows a method for calibrating a three-dimensional scanner.
  • a multi-configuration calibration component may provide a variety of configurable and positionable surfaces that can be used in different calibration steps. With this calibration component, a progressive calibration of a camera, a turntable, and a laser may be performed. Configuration and positioning of the calibration component may be orchestrated by a user interface that interactively guides a user through various positioning and configuration steps.
  • the method 600 may begin with receiving user input including a request to initiate calibration of a three-dimensional scanner.
  • the three-dimensional scanner may include a turntable, a laser, and a camera as generally described above.
  • the request may be received, for example, from a user through a user interface, which may be a user interface rendered on the scanner or any suitable device coupled to the scanner such as a local desktop or laptop computer.
  • the method 600 may include providing information to the user for positioning a calibration component on the turntable in a first position for camera calibration.
  • the calibration component may be any of the calibration components described herein, and may for example include a plurality of surfaces with at least two of the plurality of surfaces including calibration patterns.
  • the information may be provided, for example, by displaying instructions to the user in the user interface.
  • the instructions may specify a configuration of the calibration component, particularly where the component has removable surfaces or other multiple configurations, and may more specifically identify slots, tabs or the like on the turntable where the calibration component should be placed.
  • the method 600 may include receiving an indication that the calibration component is properly positioned on the turntable for camera calibration.
  • This confirmation may be received, for example by a user pressing a button on the scanner or operating a control in the user interface (after suitably placing the component). Placement may also or instead be confirmed automatically or semi-automatically by capturing and analyzing images from the uncalibrated camera(s).
  • receiving the indication that the calibration component is properly positioned or configured may in general include receiving a manual user input, receiving a computer generated input such as an input from a computer vision system, or some combination of these.
  • the method 600 may include rotating the turntable about a rotation axis thereby rotating the calibration component.
  • the method 600 may include capturing images of the calibration component on the turntable with the camera as the turntable is rotating, thereby providing a first plurality of images. This may include capturing video images, or capturing still images at a predetermined rate, e.g., at particular time intervals or at particular rotational intervals of the turntable.
  • the method 600 may include performing a first calibration calculation with the first plurality of images to calibrate the camera, thereby providing a calibrated camera.
  • Camera calibration is a necessary step in three-dimensional processing to facilitate extraction of three-dimensional data from two-dimensional images.
  • suitable techniques are known and well characterized in the art, and these techniques are not repeated here except to note generally that known features and/or displacements can be used to recover three-dimensional characteristics or parameters of a camera system in a manner that permits subsequent three-dimensional measurements with improved accuracy.
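As one concrete illustration of the general principle (not a method recited in this disclosure), the direct linear transform (DLT) recovers a camera's 3x4 projection matrix from known 3D calibration points and their observed 2D image locations. All names here are illustrative, and a production calibration (e.g., Zhang's method as implemented in common vision libraries) would also model lens distortion:

```python
import numpy as np

def dlt_calibrate(points_3d, points_2d):
    """Estimate the 3x4 projection matrix P (up to scale) from six or more
    non-coplanar 3D points and their 2D image projections."""
    A = []
    for (X, Y, Z), (u, v) in zip(points_3d, points_2d):
        A.append([X, Y, Z, 1, 0, 0, 0, 0, -u * X, -u * Y, -u * Z, -u])
        A.append([0, 0, 0, 0, X, Y, Z, 1, -v * X, -v * Y, -v * Z, -v])
    # The homogeneous least-squares solution is the right singular vector
    # associated with the smallest singular value of A.
    _, _, Vt = np.linalg.svd(np.asarray(A, dtype=float))
    return Vt[-1].reshape(3, 4)

def project(P, points_3d):
    """Project 3D points through P and dehomogenize to pixel coordinates."""
    X = np.hstack([points_3d, np.ones((len(points_3d), 1))])
    x = X @ P.T
    return x[:, :2] / x[:, 2:]
```

With the projection matrix in hand, subsequent three-dimensional measurements (such as the turntable and laser calibrations below) can relate image observations back to scene geometry.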
  • the method 600 may include providing information to the user for positioning the calibration component on the turntable for turntable calibration. This may also or instead include providing information to reconfigure the calibration component, e.g., by adding or removing a panel, or by changing a position or orientation of a panel or other element of the calibration component.
  • the information may be provided, for example, by displaying instructions to the user in the user interface.
  • the instructions may specify a configuration of the calibration component, particularly where the component has removable surfaces or other multiple configurations, and may more specifically identify slots, tabs or the like on the turntable where the calibration component should be placed.
  • the method 600 may include receiving an indication that the calibration component is properly positioned for turntable calibration. This confirmation may be received, for example by a user pressing a button on the scanner or operating a control in the user interface. Placement may additionally be confirmed automatically or semi-automatically by capturing and analyzing images from the camera(s).
  • receiving the indication that the calibration component is properly positioned or configured may in general include receiving a manual user input, receiving a computer generated input such as an input from a computer vision system, or some combination of these.
  • the method 600 may include rotating the turntable about the rotation axis thereby rotating the calibration component.
  • the method 600 may include capturing a second plurality of images of the calibration pattern included on at least one of the plurality of surfaces of the calibration component using the calibrated camera. This may include capturing video images, or capturing still images at a predetermined rate, e.g., at particular time intervals or at particular rotational intervals of the turntable.
  • the method 600 may include determining locations of predetermined points on the calibration pattern using the captured images. This may be, e.g., corners of the calibration pattern on the calibration component, or other interstitial locations within the checkerboard pattern or the like. In one aspect, determining locations may include using computer vision to determine the corners of the checkerboard or any other suitable feature or location within a calibration pattern.
  • the method 600 may include determining a rotational position of the rotation axis of the turntable with respect to the camera based upon the locations of the predetermined points, thereby providing a calibrated turntable.
  • the turntable may be calibrated so that it can produce accurate, controllable rotational orientations.
  • a variety of calibration techniques are known in the art that may be suitably adapted for use in providing a calibrated turntable as contemplated herein.
  • determining the rotational position of the rotation axis of the turntable with respect to the camera may include computing centers for circles created by rotation of the predetermined points of the calibration pattern about the rotation axis and averaging the centers to determine an average center representing the rotational position of the rotation axis.
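The circle-center averaging described above can be sketched as follows, assuming each tracked pattern point yields a 2D track of coordinates over a rotation; the Kasa least-squares circle fit used here is one standard choice, and the function names are illustrative:

```python
import numpy as np

def fit_circle_center(xy):
    """Kasa least-squares circle fit; returns the (cx, cy) center.

    Minimizes the algebraic error of x^2 + y^2 = 2*cx*x + 2*cy*y + c,
    where c = r^2 - cx^2 - cy^2, which is linear in (cx, cy, c).
    """
    x, y = xy[:, 0], xy[:, 1]
    A = np.column_stack([2 * x, 2 * y, np.ones(len(x))])
    b = x ** 2 + y ** 2
    (cx, cy, _), *_ = np.linalg.lstsq(A, b, rcond=None)
    return np.array([cx, cy])

def turntable_axis_center(tracks):
    """Average the fitted circle centers over all tracked pattern points."""
    return np.mean([fit_circle_center(t) for t in tracks], axis=0)
```

Averaging over many tracked points (each tracing a circle of different radius about the same axis) damps the noise of any individual fit.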
  • the method 600 may include providing information to the user for positioning the calibration component on the turntable in a third position (and/or configuration) for laser calibration.
  • the third position may include the calibration component oriented such that the calibration patterns of the at least two of the plurality of surfaces are non-planar with respect to each other and are disposed in a field of view of the calibrated camera. For example, by removing a horizontal panel to expose two non-planar panels such as those described above with reference to FIG. 5, a suitable calibration surface may be presented.
  • the calibration component may include a removable panel that is removed to configure the calibration component for laser calibration.
  • the method 600 may include receiving an indication from the user that the calibration component is properly positioned for laser calibration. This confirmation may be received, for example by a user pressing a button on the scanner or operating a control in the user interface. Placement may additionally be confirmed automatically or semi-automatically by capturing and analyzing images from the camera(s).
  • receiving the indication that the calibration component is properly positioned or configured may in general include receiving a manual user input, receiving a computer generated input such as an input from a computer vision system, or some combination of these.
  • the method 600 may include directing a beam of the laser on the calibration patterns of the calibration component in the field of view of the calibrated camera.
  • the method 600 may include capturing a third plurality of images of the beam on the calibration patterns of the calibration component.
  • the method 600 may include performing a calibration calculation for the laser based on the third plurality of images, thereby providing a calibrated laser.
  • This may generally include any suitable calibration calculations for improving accuracy of the laser in terms of, e.g., focus, position, intensity, or any other controllable aspect of the laser.
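One common laser-calibration calculation for turntable scanners, offered here as an illustrative sketch rather than the specific calculation of this disclosure, is to fit the laser's light plane to 3D points recovered from the beam as observed on the two non-planar calibration surfaces; subsequent scans can then triangulate by intersecting camera rays with this plane:

```python
import numpy as np

def fit_laser_plane(points_3d):
    """Least-squares plane fit: returns (unit normal n, offset d) with
    n . x = d for points x on the plane."""
    centroid = points_3d.mean(axis=0)
    # The plane normal is the direction of least variance of the centered
    # points, i.e., the right singular vector for the smallest singular value.
    _, _, Vt = np.linalg.svd(points_3d - centroid)
    n = Vt[-1]
    return n, float(n @ centroid)
```

The two non-planar surfaces matter here: points sampled from a single plane oblique to the laser would not constrain the light plane as well as points spanning two distinct orientations.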
  • the method 600 may include removing the calibration component from the turntable so that a scanning volume is available for a scan target.
  • the method 600 may include capturing a scan of an object with the calibrated camera, the calibrated turntable, and the calibrated laser.
  • FIG. 7 shows a user interface 700 for automatically selecting three-dimensional scan parameters.
  • When using a three-dimensional scanner such as a scanner with a turntable, laser, and camera as described herein, the optical properties of a scan target can significantly influence scan results.
  • a user may not be able to readily balance, e.g., laser output parameters and camera exposure parameters to achieve the best results.
  • a semi-automated process may be provided that permits a user to specify various optical properties such as shade, color and surface texture to the scanner or other processing circuitry associated with the scanner, such as a locally coupled computer.
  • the user interface 700 of FIG. 7 specifically depicts a user selection of one of three possible shades (light, medium, dark), it will be understood that any other user-perceptible optical characteristics may also or instead be used including without limitation surface texture, opacity, transparency, glossiness, and so forth.
  • FIG. 8 shows a method for automatically selecting three-dimensional scan parameters.
  • the method 800 may include receiving user selections of various optical properties, and then adjusting specific system parameters according to the user-provided information.
  • the method 800 may begin with providing a first prompt in a user interface configured to receive a user input selecting a color gradient that best matches an object to be scanned by the three-dimensional scanner.
  • the color gradient may, for example, include a shade selected from a group consisting of light, medium, and dark, or any other suitable categories at any desired level of granularity or detail.
  • the method 800 may include providing manual decision support to a user in the user interface.
  • a decision may be assisted with any of a variety of visual aids for the user.
  • this step may include displaying examples of each of the two or more shades to assist the user in selecting the color gradient at the first prompt.
  • This step may further include displaying a video feed of the object to be scanned for a direct, on-screen comparison of the object to the examples within the user interface.
  • the method 800 may also or instead include providing automated decision support to a user. For example, this may include capturing an image of the object that is to be scanned, and performing a comparison of the image to a number of images of previously scanned objects using, for example, any of a variety of similarity measures or the like.
  • the method may include providing relevant information to a user, such as by presenting a selection of a color gradient or surface characteristic that resulted in a successful scan of one or more of the previously scanned objects.
  • the method may proceed fully automatically, e.g., by automatically selecting a color gradient or a surface characteristic for a scan based on the comparison when one of the number of images appears to include closely corresponding optical properties.
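A minimal sketch of such automated decision support, using a simple grayscale-histogram intersection as the similarity measure (the text leaves the measure open, and all names here are illustrative):

```python
import numpy as np

def shade_histogram(image, bins=16):
    """Normalized grayscale histogram of an 8-bit image array."""
    h, _ = np.histogram(image, bins=bins, range=(0, 256))
    return h / h.sum()

def most_similar_prior_scan(new_image, prior_scans):
    """prior_scans: dict mapping a scan name to its image; returns the name
    whose histogram intersection with new_image is highest."""
    target = shade_histogram(new_image)
    def score(name):
        # Histogram intersection: 1.0 for identical histograms, 0.0 for disjoint.
        return np.minimum(target, shade_histogram(prior_scans[name])).sum()
    return max(prior_scans, key=score)
```

The scan parameters that produced a successful scan of the best-matching prior object could then be suggested to the user, or applied automatically when the match is strong.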
  • the method 800 may include providing a second prompt in the user interface configured to receive a user input selecting a surface characteristic that best matches a surface of the object to be scanned, the surface characteristic including at least one of glossiness, fuzziness, and texture.
  • the method 800 may include adjusting camera parameters based on the color gradient and the surface characteristic thereby adjusting an exposure of the camera. For example, this may include adjusting a shutter speed and a lens aperture of the camera to suitable selections best matched to the characteristics of the object. For example, where the color gradient is light, the camera may be responsively adjusted to a lower exposure. Where the color gradient is dark, the camera may be responsively adjusted to a higher exposure. In another aspect, a fixed exposure may be maintained independent of the color gradient, but the exposure may vary in response to other factors such as a color composition or surface texture.
  • the method 800 may include adjusting an intensity of the laser based on the color gradient and the surface characteristic. For example, where the color gradient is light, the laser may be responsively adjusted to a higher intensity. Where the color gradient is dark, the laser may be responsively adjusted to a lower intensity. In another aspect, a fixed laser intensity may be maintained independent of the color gradient, but the laser intensity may vary in response to other factors such as a color composition or surface texture.
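The parameter adjustments described in the preceding steps can be sketched as a simple lookup; the numeric multipliers are invented for illustration and merely follow the directions stated above (lighter objects receive lower camera exposure and higher laser intensity, darker objects the reverse):

```python
# Relative multipliers per shade selection; values are hypothetical.
SHADE_PRESETS = {
    "light":  {"exposure": 0.5, "laser_intensity": 1.5},
    "medium": {"exposure": 1.0, "laser_intensity": 1.0},
    "dark":   {"exposure": 1.5, "laser_intensity": 0.5},
}

def scan_parameters(shade, glossy=False):
    """Map the user's shade (and optional glossiness) selection to
    relative camera exposure and laser intensity settings."""
    params = dict(SHADE_PRESETS[shade])
    if glossy:
        # Glossy surfaces reflect the laser specularly; back the beam off.
        params["laser_intensity"] *= 0.8
    return params
```

A real implementation would translate these relative settings into concrete shutter speed, aperture, and laser drive values for the particular hardware.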
  • the method 800 may include scanning an object using the adjusted laser and camera parameters.
  • the above systems, devices, methods, processes, and the like may be realized in hardware, software, or any combination of these suitable for the control, data acquisition, and data processing described herein.
  • a realization of the processes or devices described above may include computer-executable code created using a structured programming language such as C, an object oriented programming language such as C++, or any other high-level or low-level programming language (including assembly languages, hardware description languages, and database programming languages and technologies) that may be stored, compiled or interpreted to run on one of the above devices, as well as heterogeneous combinations of processors, processor architectures, or combinations of different hardware and software.
  • processing may be distributed across devices such as the various systems described above, or all of the functionality may be integrated into a dedicated, standalone device. All such permutations and combinations are intended to fall within the scope of the present disclosure.
  • Embodiments disclosed herein may include computer program products comprising computer-executable code or computer-usable code that, when executing on one or more computing devices, performs any and/or all of the steps of the control systems described above.
  • the code may be stored in a non-transitory fashion in a computer memory, which may be a memory from which the program executes (such as random access memory associated with a processor), or a storage device such as a disk drive, flash memory or any other optical, electromagnetic, magnetic, infrared or other device or combination of devices.
  • any of the control systems described above may be embodied in any suitable transmission or propagation medium carrying computer-executable code and/or any inputs or outputs from same.
  • performing the step of X includes any suitable method for causing another party such as a remote user, a remote processing resource (e.g., a server or cloud computer) or a machine to perform the step of X.
  • performing steps X, Y and Z may include any method of directing or controlling any combination of such other individuals or resources to perform steps X, Y and Z to obtain the benefit of such steps.

Abstract

A three-dimensional scanner uses a rotatable mounting structure to secure a laser line source in a manner that permits rotation of a projected laser line about an axis of the laser, along with movement of the laser through an arc in order to conveniently position and orient the resulting laser line. Where the laser scanner uses a turntable or the like, a progressive calibration scheme may be employed with a calibration fixture to calibrate a camera, a turntable, and a laser for coordinated use as a three-dimensional scanner. Finally, parameters for a scan may be automatically created to control, e.g., laser intensity and camera exposure based on characteristics of a scan subject such as surface characteristics or color gradient.

Description

    RELATED MATTERS
  • This application claims the benefit of U.S. Prov. App. No. 61/864,158 filed on Aug. 9, 2013, U.S. Prov. App. No. 61/875,360 filed on Sep. 9, 2013, and U.S. Prov. App. No. 61/906,171 filed on Nov. 19, 2013. The content of each of these applications is hereby incorporated by reference in its entirety.
  • BACKGROUND
  • There remains a need for improved techniques for three-dimensional scanning.
  • SUMMARY
  • A three-dimensional scanner uses a rotatable mounting structure to secure a laser line source in a manner that permits rotation of a projected laser line about an axis of the laser, along with movement of the laser through an arc in order to conveniently position and orient the resulting laser line. Where the laser scanner uses a turntable or the like, a progressive calibration scheme may be employed with a calibration fixture to calibrate a camera, a turntable, and a laser for coordinated use as a three-dimensional scanner. Finally, parameters for a scan may be automatically created to control, e.g., laser intensity and camera exposure based on characteristics of a scan subject such as surface characteristics or color gradient.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • The foregoing and other objects, features and advantages of the devices, systems, and methods described herein will be apparent from the following description of particular embodiments thereof, as illustrated in the accompanying drawings. The drawings are not necessarily to scale, emphasis instead being placed upon illustrating the principles of the devices, systems, and methods described herein.
  • FIG. 1 shows a three-dimensional scanner.
  • FIG. 2 shows a block diagram of a three-dimensional scanner system.
  • FIG. 3 shows a perspective view of a device for aligning a laser.
  • FIG. 4 shows a cross section of a laser housing.
  • FIG. 5 shows a calibration component.
  • FIG. 6 shows a method for calibrating a three-dimensional scanner.
  • FIG. 7 shows a user interface for automatically selecting three-dimensional scan parameters.
  • FIG. 8 shows a method for automatically selecting three-dimensional scan parameters.
  • DETAILED DESCRIPTION
  • The embodiments will now be described more fully hereinafter with reference to the accompanying figures, in which preferred embodiments are shown. The foregoing may, however, be embodied in many different forms and should not be construed as limited to the illustrated embodiments set forth herein. Rather, these illustrated embodiments are provided so that this disclosure will convey the scope to those skilled in the art.
  • All documents mentioned herein are hereby incorporated by reference in their entirety. References to items in the singular should be understood to include items in the plural, and vice versa, unless explicitly stated otherwise or clear from the text. Grammatical conjunctions are intended to express any and all disjunctive and conjunctive combinations of conjoined clauses, sentences, words, and the like, unless otherwise stated or clear from the context. Thus, the term “or” should generally be understood to mean “and/or” and so forth.
  • Recitation of ranges of values herein is not intended to be limiting, referring instead individually to any and all values falling within the range, unless otherwise indicated herein, and each separate value within such a range is incorporated into the specification as if it were individually recited herein. The words “about,” “approximately,” or the like, when accompanying a numerical value, are to be construed as indicating a deviation as would be appreciated by one of ordinary skill in the art to operate satisfactorily for an intended purpose. Ranges of values and/or numeric values are provided herein as examples only, and do not constitute a limitation on the scope of the described embodiments. The use of any and all examples, or exemplary language (“e.g.,” “such as,” or the like) provided herein, is intended merely to better illuminate the embodiments and does not pose a limitation on the scope of the embodiments. No language in the specification should be construed as indicating any unclaimed element as essential to the practice of the embodiments.
  • In the following description, it is understood that terms such as “first,” “second,” “top,” “bottom,” “above,” “below,” and the like, are words of convenience and are not to be construed as limiting terms.
  • FIG. 1 shows a three-dimensional scanner. The scanner 100 may include a turntable 102, one or more lasers 104, a camera 106, and a controller 108.
  • The turntable 102 may be any rotating surface such as a rigid plate or the like, which may be rotated to present various surfaces of an object on the turntable to the lasers 104 and the camera 106.
  • The one or more lasers 104 may be any lasers suitable for projecting lines onto an object that is being scanned on the turntable 102. The lasers 104 can be 3.2 V line lasers or the like with 55 degree fans or any other laser or combination of lasers suitable for a three-dimensional scanning system.
  • The camera 106 can be a USB 2.0 Board Camera. While any resolution consistent with desired scan resolution may be used, a 1.3 MP or better color complementary metal-oxide semiconductor (CMOS) image sensor is commercially available at low cost and suitable for many applications. The camera 106 can, for example, operate at 30 frames-per-second with a rolling shutter and a 12 inch focal distance. In another aspect, the camera 106 can operate at 7.5 frames-per-second. In other aspects, the camera can be any camera that can work in a three-dimensional scanning system. The camera 106 can also take video footage and provide a video feed to a user device 206 (as shown in FIG. 2) via a user interface. The scanner 100 can also include a red band-pass filter for the camera 106, which may be fixed or removable/replaceable. The filter may, for example, be a 25 mm or 27 mm removable and/or retractable 650 nm CW band-pass filter with a 40 nm pass band. The band-pass filter can remain on the camera 106 during scans for optimal results. In another aspect, the band-pass filter can be removed for a scan.
  • In general operation, an item can be placed on the turntable 102. As the item rotates on the turntable 102, the lasers 104 can create laser lines that reflect off the object. The camera 106 can take rapid photographs of the laser lines and a point cloud can be generated via the controller 108 connected to the scanner 100. The controller 108 can be electrically or otherwise coupled in a communicating relationship with the turntable 102, the lasers 104 and the camera 106. In general the controller 108 is operable to control the components of the scanner 100. The controller 108 may include any combination of software and/or processing circuitry suitable for controlling the various components of the scanner 100 described herein including without limitation microprocessors, microcontrollers, application-specific integrated circuits, programmable gate arrays, and any other digital and/or analog components, as well as combinations of the foregoing, along with inputs and outputs for transceiving control signals, drive signals, power signals, sensor signals, and so forth. In one aspect, this may include circuitry directly and physically associated with the scanner 100 such as an on-board processor. In another aspect, this may be a processor associated with a personal computer or other computing device (e.g., a user device 206 as shown in FIG. 2) coupled to the scanner 100, e.g., through a wired or wireless connection. Similarly, various functions described herein may be allocated between an on-board processor for the scanner 100 and a separate computer. All such computing devices and environments are intended to fall within the meaning of the term “controller” or “processor” as used herein, unless a different meaning is explicitly provided or otherwise clear from the context.
  • FIG. 2 shows a three-dimensional scanner system 200. As shown in FIG. 2, the scanner 100 can be coupled to a user device 206 via a USB cable or any other connector used for locally connecting electronic devices to each other.
  • The scanner 100 can alternatively or additionally be coupled to the user device 206 through a data network 202. The data network 202 may be any network(s) or internetwork(s) suitable for communicating data and control information among participants in the system 200. This may include public networks such as the Internet, private networks, telecommunications networks such as the Public Switched Telephone Network or cellular networks using third generation (e.g., 3G or IMT-2000), fourth generation (e.g., LTE (E-UTRA) or WiMax-Advanced (IEEE 802.16m)) and/or other technologies, as well as any of a variety of corporate area or local area networks and other switches, routers, hubs, gateways, and the like that might be used to carry data among participants.
  • The scanner 100 can include a network interface for connecting to the data network 202. The network interface may comprise, e.g., a network interface card, which term is used broadly herein to include any hardware (along with software, firmware, or the like to control operation of same) suitable for establishing and maintaining wired and/or wireless communications. The network interface card may include without limitation wired Ethernet network interface cards (“NICs”), wireless 802.11 networking cards, wireless 802.11 USB devices, or other hardware for wireless local area networking. The network interface may also or instead include cellular network hardware, wide area wireless network hardware or any other hardware for centralized, ad hoc, peer-to-peer, or other radio communications that might be used to carry data. In another aspect, the network interface may include a serial or USB port to directly connect to a computing device such as a desktop computer that, in turn, provides more general network connectivity to the data network.
  • The user device 206 may be a computing device such as a laptop computer, desktop computer, tablet, smart phone, or other computing device that can be operated by a user to provide a user input to control the scanner 100. In another aspect, the scanner 100 may be configured with a display, user input devices, and the like so that the scanner 100 acts as the user device 206. The user input devices may include a display, buttons, or other physical user interface element(s) on the scanner 100 that a user can interact with.
  • Upon user input via the user device 206 and/or the scanner 100, the scanner 100 can begin analyzing the object that is placed on the turntable 102 via the controller 108. Once the controller 108 creates a point cloud, the user device 206 can convert the point cloud into a viewable mesh that can be saved as a Thing file, STL, or another supported mesh format. During each scan, the object can revolve on the turntable twice. The right laser 104 can create a laser line that reflects off of the object during the first revolution, and the left laser 104 can create a laser line that reflects off of the object during the second revolution. In another aspect, the left laser 104 can create a laser line that reflects off of the object during the first revolution and the right laser 104 can create a laser line that reflects off of the object during the second revolution. In another aspect, only one of the lasers can scan the object during the scan.
  • After the scan, the information from the camera 106 or the two or more lasers 104 can be combined to create a point cloud. The user device 206 can convert the point cloud into a continuous mesh via any combination of software and/or processing circuitry located on the user device 206.
  • The three-dimensional scanner system 200 can be used for scanning, calibration and automatically sending the scan data to a social networking platform hosted, e.g., on a server 204, which may be a general social networking platform or a special purpose platform dedicated to, e.g., three-dimensional printing, three-dimensional modeling, computer automated design, or the like.
  • The server 204 may include data storage, a network interface, and a processor and/or processing circuitry. In general, the server 204 may be configured to perform a variety of processing tasks related to the three-dimensional scanning of objects. For example, the server 204 may manage scan jobs received from one or more of the user devices 206, and provide related supporting functions such as content search and management. The server 204 may also include a web server that provides web-based access by the user device 206 to the capabilities of the server 204. The server 204 may also communicate periodically with the scanner 100 in order to obtain status information concerning, e.g., the status of particular scan jobs, any of which may be subsequently presented to a user through the web server or any other suitable interface. Upon user input via the user interface on the user device 206, scanning can begin. The user input can first include clicking a physical or digital button to automatically back up the scans to the server 204 via the data network 202. The processor on the user device 206 or the scanner 100 can prompt this user input either before the first scan of the scanner 100 and/or before or after every scan.
  • User input can be entered via the user interface on the user device 206 to initiate the scan. The user interface can prompt the user to place an object on the turntable 102 in the correct position. For example, the user interface can prompt the user to place the object on the center of the turntable 102. In another aspect, the user interface can prompt the user to place the object on a positioning stand (not pictured). More generally, the user interface may provide step-by-step menus or other interactive elements to guide a user through a scanning procedure. A positioning stand can be used to secure objects that are not stable without support. The stand can also be used to elevate small objects or to secure an object in a specific orientation. The positioning stand can comprise a platform, a rod and one or more arms. A video feed can be shown via the user interface on the user device 206 to assist a user in a placement of an object.
  • Once an object is placed, the user interface can prompt the user to start the scan. During the scan, the user interface can show the time remaining in the scan, a video feed of the object as it is being scanned, and/or a point cloud assembly as it is being generated via the controller 108. Once a scan is complete, a user can be given an option of cropping the scanned object model. Other post-scanning features such as smoothing, leveling, labeling (e.g., with three-dimensional text or the like), hole filling, and so forth may also be automatically or semi-automatically performed through the user interface. Once the model is completed, it can be (automatically or otherwise) shared with a social network, sent to a three-dimensional printer coupled to the scanner 100 for printing, and/or exported for saving.
  • In another aspect, the user interface can allow the user device 206 to take a picture of the object scanned and associate it with the three-dimensional model. For example a “Take a Photo” dialog can open and show a view of what the scanner 100 sees via the camera. A prompt to slide the red band-pass filter away from the camera lens can be shown before the picture is taken.
  • In general, the scanner 100, the user device 206, or the controller 108 (or any combination of these) may provide processing circuitry to control operation of the scanner systems contemplated herein, such as by performing the various processes and functions described below.
  • FIG. 3 shows a device for aligning a laser. In general, a scanner may have one or more lasers, as noted above, which are preferably aligned to project a line or other pattern in a predetermined manner across a scanning volume. The device 300 may be used to align a laser 302 so that a desired orientation may be obtained.
  • In general, the laser 302 may be any of the lasers described above, or any other laser that can be configured to project a line or other pattern on a target of a scan. The laser 302 may, for example, be a 3.2 Volt line laser, and/or the laser may have a 55 degree fan over which a laser line is projected. More generally, the pattern may be a line or the like, which may be obtained using any suitable optics, filters or the like to focus and direct light from the laser. The laser 302 may have an axis 304 with an orientation that, when the laser 302 is placed for use in the device 300 (which is in the scanner), directs the laser 302 toward a desired target, or more generally, in a desired direction. Additionally, the laser 302 and a line or other pattern from the laser 302 may have a rotational orientation about the axis 304. By rotating the laser 302 about the axis 304 as indicated by an arrow 306, the rotational orientation of the laser 302 may be controlled.
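As a worked example of the fan geometry (the 55 degree figure is from the text; the working distance below is purely hypothetical), the width of the projected line grows with distance as twice the distance times the tangent of the half-angle:

```python
import math

def laser_line_width(fan_deg, distance_mm):
    """Width of the projected laser line at a given working distance:
    width = 2 * d * tan(fan / 2)."""
    return 2.0 * distance_mm * math.tan(math.radians(fan_deg / 2.0))

# A 55 degree fan at a hypothetical 300 mm working distance covers
# roughly 312 mm, comfortably spanning a small turntable.
width_mm = laser_line_width(55.0, 300.0)
```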
  • In general, a laser housing 308 secures the laser 302 in a desired orientation within a mount 316. The laser housing 308 may include a cavity 310 to receive the laser 302. The laser housing 308 may also include a toothed wheel 312 with a plurality of teeth 314 radially spaced about the axis 304 of the laser 302 when the laser 302 is placed for use in the cavity 310.
  • A mount 316 for the laser housing 308 may include a base 318 configured to be coupled to an external assembly such as a scanner housing. The base 318 may include any suitable slots, tabs, registration features, screw holes, and the like, or any other suitable mechanism(s) for rigidly coupling to the external assembly in a fixed orientation. A holder 320 of the mount 316 may be configured to retain the laser housing 308 in a predetermined orientation while permitting rotation of the laser housing 308 (and the laser 302) about the axis 304 of the laser 302. The holder 320 generally retains the laser housing 308 in rotational engagement about the axis 304 of the laser 302. The mount 316 may further include a hinge 322 that hingably couples the base 318 to the holder 320. The laser housing 308 may be configured to snap-fit into the mount 316 where it may be retained by a number of fingers 340, flanges, or the like, or alternatively stated, the mount 316 may be configured to receive the laser housing 308 and retain the laser housing 308 with any of a variety of snap-fit mechanisms.
  • One end of the laser housing 308 may form an adjustment wheel 324 with the plurality of teeth 314 engaging another surface to secure the laser housing 308 in a desired rotational orientation. The adjustment wheel 324 may be operable as a thumb click wheel or the like, or a supplemental drive wheel 325 may be provided for manual or automated activation. In general, the adjustment wheel 324 is operable to rotate the laser housing 308 around the axis 304 of the laser 302. This rotation may be performed, e.g., by manually rotating the adjustment wheel 324, or by rotating the supplemental adjustment wheel 325 with a motor or other electro-mechanical drive. The adjustment wheel 324 may be ratcheted or otherwise mechanically secured by the plurality of teeth 314 against free rotation after a desired rotational orientation has been established. The adjustment wheel 324 may be a click wheel that moves in discrete units of rotation accompanied by an audible or tactile click. The click wheel may be thumb operable, and may move in fixed increments such as three degree increments, or at any other suitable, regular intervals of rotation. The click wheel may click against a nub, spring, or the like on the mount 316.
  • An adjustment rod 326 may also be provided that couples the base 318 to the holder 320 at a position away from the hinge 322. In this configuration, the adjustment rod 326 may be operable to displace the base 318 relative to the holder 320 along a second axis 330 of the adjustment rod. In this manner, the hinge 322 is rotated (or hinged) thus moving the axis 304 of the laser relative to the base 318. Thus when the base 318 is fixed to an external support, the adjustment rod 326 can be used to steer the axis 304 through an arc by flexing the hinge 322. In one aspect, the adjustment rod 326 may be a threaded rod that is threaded through a threaded insert 342 that is coupled to a fixed location such as a location in the base 318 or in the holder 320. By rotating the threaded rod, the threaded insert 342 may travel along the threaded rod, thus moving the holder 320 relative to the base 318 and flexing the hinge 322 to reorient the laser 302.
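To make the steering mechanism concrete, a small-angle sketch of how far the laser axis tilts per turn of the adjustment rod (the thread pitch and hinge-to-rod lever arm are assumed values; the text specifies neither):

```python
import math

def steering_angle_deg(pitch_mm, turns, lever_arm_mm):
    """Tilt of the laser axis when the threaded rod advances the holder.
    pitch_mm: thread pitch; lever_arm_mm: hinge-to-rod distance (both
    hypothetical values for illustration)."""
    displacement_mm = pitch_mm * turns
    return math.degrees(math.atan2(displacement_mm, lever_arm_mm))

# One full turn of a 0.5 mm pitch rod acting 40 mm from the hinge
# tilts the axis by roughly 0.7 degrees.
tilt = steering_angle_deg(0.5, 1.0, 40.0)
```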
  • The laser housing 308 and the mount 316 may be formed of any suitable materials according to desired weight, strength, durability, and so forth. For example, the laser housing 308 and the mount 316 may be formed of an injection molded plastic and/or a plastic such as a polycarbonate acrylonitrile butadiene styrene or an acetal homopolymer.
  • The mount 316 may include a spring 350 such as a coil spring or any other suitable compression spring or the like that urges the holder 320 and the base 318 into a predetermined relative orientation. This spring 350 may thus bias the laser 302 toward a predetermined orientation relative to the mount 316 when the laser 302 is placed for use in the cavity 310 of the holder 320. While the spring 350 may be a separate, discrete component, the spring may also or instead be a living plastic spring formed for example by a resilient material of the hinge 322. The living plastic spring (or any other spring 350 as contemplated herein) may generally bias the laser 302 toward any predetermined position or orientation such as toward a predetermined position relative to the mount 316 when the laser 302 is placed for use in the cavity 310.
  • This configuration advantageously provides convenient positioning and rotation of a line laser within a scan volume with a relatively simple mechanical arrangement and a small number of moving parts. Additional adjustments may be necessary or desirable, and as such a supplemental positioning assembly 360 may be provided to add further degrees of rotational or translational freedom for adjusting the laser 302. For example, the positioning assembly 360 may facilitate translation of the axis 304 within a plane perpendicular to the axis 304, or alignment of the axis 304 of the laser 302 with one or more additional degrees of freedom, that is, degrees of freedom not provided by the mount 316 and laser housing 308 described above. This may include any suitable fixture, set screws, and so forth, for adjusting position and orientation of the base 318 relative to a fixed physical reference that the base 318 is attached to (such as a scanner housing). A variety of suitable mechanisms are known in the art and may be adapted for use as a positioning assembly 360 as contemplated herein.
  • FIG. 4 shows a cross section of a laser housing such as the laser housing described above. The laser housing 400 may generally include a cavity 402 to receive a laser as described above. The laser housing 400 may also include a plurality of engagement elements 404 such as ribs, fins, protrusions or the like within the cavity 402 that secure a laser in a desired position and orientation within the cavity 402. The engagement elements 404 may in general be shaped and sized in any suitable manner to hold a laser when the laser is positioned in the cavity. For example, the engagement elements 404 may include ribs as illustrated, which may secure the laser with a press-fit or interference fit to frictionally engage the laser in the desired position. A second cavity 406 may be included that is formed to receive a drive head such as a screwdriver, hex wrench or the like. The second cavity 406 may be positioned within the mount 316 of FIG. 3 such that the second cavity 406 is accessible externally with a screwdriver or the like to adjust the rotational orientation of the laser. Similarly, a portion of the adjustment wheel 324 may be exposed outside a scanner housing to facilitate convenient manual adjustment.
  • FIG. 5 shows a calibration component. In general, a scanner such as any of the scanners described herein may be calibrated prior to use in order to obtain more accurate scan results. In general, this involves placing a calibration component such as the calibration component 500 shown in FIG. 5 onto the turntable of a scanner and capturing images in a variety of poses and under a variety of lighting conditions.
  • In one aspect, the calibration component 500 may be a multi-part component that can be configured to present a variety of different surfaces, patterns and the like. For example, as illustrated, the calibration component 500 may have a base 502 with angled surfaces and a checkerboard pattern or the like, as well as a removable plate 504 that can be removed from and replaced to the base 502 to provide a horizontal surface for calibration-related data acquisition. While a checkerboard is shown as the calibration pattern 506, it will be understood that a variety of calibration patterns may also or instead be employed including, without limitation a dot grid, a line grid, a random pattern, and so forth. The calibration pattern 506 may also or instead include a predetermined three-dimensional shape of the calibration component 500, such as the angled surfaces of the base 502.
  • In one aspect, the calibration component 500 may include a plurality of surfaces. This may include at least three panels 510, 512, 514 each including the calibration pattern 506 (i.e., the same pattern) or different calibration patterns, or some combination of these. The calibration component 500 may also include two different faces such as a first face formed by one of the panels 510, and a second face formed by the other panels 512, 514. As noted above, one of the panels 510 may be removable and the face of the first panel 510 may occlude the calibration pattern on the other panels 512, 514 when attached to the base 502. This permits a single calibration fixture to provide various different patterns and three-dimensional shapes to facilitate various calibration steps as discussed below.
  • The calibration component 500 may include a tab 516 or other protrusion or the like configured to couple the calibration component 500, or the base 502 of the calibration component 500, to a turntable or other base for a scanning system in order to retain the calibration component 500 in a predetermined position and orientation during calibration. Any other number of tabs may be provided to secure the calibration component 500, or the base 502 or one of the panels 510, 512, 514 in a desired orientation for use in calibrating a scanner.
  • FIG. 6 shows a method for calibrating a three-dimensional scanner. In general a multi-configuration calibration component may provide a variety of configurable and positionable surfaces that can be used in different calibration steps. With this calibration component, a progressive calibration of a camera, a turntable, and a laser may be performed. Configuration and positioning of the calibration component may be orchestrated by a user interface that interactively guides a user through various positioning and configuration steps.
  • As shown in step 602, the method 600 may begin with receiving user input including a request to initiate calibration of a three-dimensional scanner. The three-dimensional scanner may include a turntable, a laser, and a camera as generally described above. The request may be received, for example, from a user through a user interface, which may be a user interface rendered on the scanner or any suitable device coupled to the scanner such as a local desktop or laptop computer.
  • As shown in step 604, the method 600 may include providing information to the user for positioning a calibration component on the turntable in a first position for camera calibration. The calibration component may be any of the calibration components described herein, and may for example include a plurality of surfaces with at least two of the plurality of surfaces including calibration patterns. The information may be provided, for example, by displaying instructions to the user in the user interface. The instructions may specify a configuration of the calibration component, particularly where the component has removable surfaces or other multiple configurations, and may more specifically identify slots, tabs or the like on the turntable where the calibration component should be placed.
  • As shown in step 606, the method 600 may include receiving an indication that the calibration component is properly positioned on the turntable for camera calibration. This confirmation may be received, for example by a user pressing a button on the scanner or operating a control in the user interface (after suitably placing the component). Placement may also or instead be confirmed automatically or semi-automatically by capturing and analyzing images from the uncalibrated camera(s). Thus receiving the indication that the calibration component is properly positioned or configured may in general include receiving a manual user input, receiving a computer generated input such as an input from a computer vision system, or some combination of these.
  • As shown in step 608, the method 600 may include rotating the turntable about a rotation axis thereby rotating the calibration component.
  • As shown in step 610, the method 600 may include capturing images of the calibration component on the turntable with the camera as the turntable is rotating, thereby providing a first plurality of images. This may include capturing video images, or capturing still images at a predetermined rate, e.g., at particular time intervals or at particular rotational intervals of the turntable.
  • As shown in step 612, the method 600 may include performing a first calibration calculation with the first plurality of images to calibrate the camera, thereby providing a calibrated camera. Camera calibration is a necessary step in three-dimensional processing to facilitate extraction of three-dimensional data from two-dimensional images. A variety of suitable techniques are known and well characterized in the art, and these techniques are not repeated here except to note generally that known features and/or displacements can be used to recover three-dimensional characteristics or parameters of a camera system in a manner that permits subsequent three-dimensional measurements with improved accuracy.
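The text defers the calibration math to known techniques. As one illustrative building block (an assumed approach, not the claimed method), the Direct Linear Transform recovers a camera projection matrix from known three-dimensional points on the calibration component and their two-dimensional image projections:

```python
import numpy as np

def estimate_projection_matrix(world_pts, image_pts):
    """Direct Linear Transform: recover the 3x4 projection matrix P
    (up to scale) from six or more known 3D points and their 2D
    image projections."""
    rows = []
    for (X, Y, Z), (u, v) in zip(world_pts, image_pts):
        rows.append([X, Y, Z, 1, 0, 0, 0, 0, -u * X, -u * Y, -u * Z, -u])
        rows.append([0, 0, 0, 0, X, Y, Z, 1, -v * X, -v * Y, -v * Z, -v])
    # The right singular vector for the smallest singular value holds P.
    _, _, Vt = np.linalg.svd(np.asarray(rows, dtype=float))
    return Vt[-1].reshape(3, 4)

def project(P, world_pts):
    """Apply P to homogeneous 3D points and dehomogenize to pixels."""
    pts_h = np.hstack([world_pts, np.ones((len(world_pts), 1))])
    uvw = pts_h @ P.T
    return uvw[:, :2] / uvw[:, 2:3]
```

In practice a full calibration would also recover lens distortion and refine the estimate nonlinearly, but the null-space solution above is the standard starting point.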
  • As shown in step 614, the method 600 may include providing information to the user for positioning the calibration component on the turntable for turntable calibration. This may also or instead include providing information to reconfigure the calibration component, e.g., by adding or removing a panel, or by changing a position or orientation of a panel or other element of the calibration component. The information may be provided, for example, by displaying instructions to the user in the user interface. The instructions may specify a configuration of the calibration component, particularly where the component has removable surfaces or other multiple configurations, and may more specifically identify slots, tabs or the like on the turntable where the calibration component should be placed.
  • As shown in step 616, the method 600 may include receiving an indication that the calibration component is properly positioned for turntable calibration. This confirmation may be received, for example by a user pressing a button on the scanner or operating a control in the user interface. Placement may additionally be confirmed automatically or semi-automatically by capturing and analyzing images from the camera(s). Thus receiving the indication that the calibration component is properly positioned or configured may in general include receiving a manual user input, receiving a computer generated input such as an input from a computer vision system, or some combination of these.
  • As shown in step 618, once the calibration component is properly positioned for turntable calibration, the method 600 may include rotating the turntable about the rotation axis thereby rotating the calibration component.
  • As shown in step 620, the method 600 may include capturing a second plurality of images of the calibration pattern included on at least one of the plurality of surfaces of the calibration component using the calibrated camera. This may include capturing video images, or capturing still images at a predetermined rate, e.g., at particular time intervals or at particular rotational intervals of the turntable.
  • As shown in step 622, the method 600 may include determining locations of predetermined points on the calibration pattern using the captured images. This may be, e.g., corners of the calibration pattern on the calibration component, or other interstitial locations within the checkerboard pattern or the like. In one aspect, determining locations may include using computer vision to determine the corners of the checkerboard or any other suitable feature or location within a calibration pattern.
  • As shown in step 624, the method 600 may include determining a rotational position of the rotation axis of the turntable with respect to the camera based upon the locations of the predetermined points, thereby providing a calibrated turntable. In this manner the turntable may be calibrated so that it can produce accurate, controllable rotational orientations. As noted above, a variety of calibration techniques are known in the art that may be suitably adapted for use in providing a calibrated turntable as contemplated herein. By way of example and not of limitation, determining the rotational position of the rotation axis of the turntable with respect to the camera may include computing centers for circles created by rotation of the predetermined points of the calibration pattern about the rotation axis and averaging the centers to determine an average center representing the rotational position of the rotation axis.
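By way of illustration, the circle-center computation described above can be carried out with an algebraic least-squares circle fit per tracked point, followed by averaging (a minimal sketch of the stated approach, not the patented implementation):

```python
import numpy as np

def fit_circle_center(xy):
    """Algebraic (Kasa) least-squares circle fit to an (N, 2) array of
    points; returns the fitted center (cx, cy)."""
    x, y = xy[:, 0], xy[:, 1]
    # (x-cx)^2 + (y-cy)^2 = r^2 rearranges to a linear system in
    # [cx, cy, r^2 - cx^2 - cy^2].
    A = np.column_stack([2 * x, 2 * y, np.ones(len(x))])
    b = x ** 2 + y ** 2
    cx, cy, _ = np.linalg.lstsq(A, b, rcond=None)[0]
    return np.array([cx, cy])

def turntable_axis(trajectories):
    """Average the fitted circle centers of several tracked
    calibration-pattern points; each trajectory is an (N, 2) array."""
    centers = np.array([fit_circle_center(t) for t in trajectories])
    return centers.mean(axis=0)
```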
  • As shown in step 626, the method 600 may include providing information to the user for positioning the calibration component on the turntable in a third position (and/or configuration) for laser calibration. The third position may include the calibration component oriented such that the calibration patterns of the at least two of the plurality of surfaces are non-planar with respect to each other and are disposed in a field of view of the calibrated camera. For example, by removing a horizontal panel to expose two non-planar panels such as those described above with reference to FIG. 5, a suitable calibration surface may be presented. Thus in one aspect, the calibration component may include a removable panel that is removed to configure the calibration component for laser calibration.
  • As shown in step 628, the method 600 may include receiving an indication from the user that the calibration component is properly positioned for laser calibration. This confirmation may be received, for example by a user pressing a button on the scanner or operating a control in the user interface. Placement may additionally be confirmed automatically or semi-automatically by capturing and analyzing images from the camera(s). Thus receiving the indication that the calibration component is properly positioned or configured may in general include receiving a manual user input, receiving a computer generated input such as an input from a computer vision system, or some combination of these.
  • As shown in step 630, the method 600 may include directing a beam of the laser on the calibration patterns of the calibration component in the field of view of the calibrated camera.
  • As shown in step 632, the method 600 may include capturing a third plurality of images of the beam on the calibration patterns of the calibration component.
  • As shown in step 634, the method 600 may include performing a calibration calculation for the laser based on the third plurality of images, thereby providing a calibrated laser. This may generally include any suitable calibration calculations for improving accuracy of the laser in terms of, e.g., focus, position, intensity, or any other controllable aspect of the laser.
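The text does not spell out the laser calibration math. One common approach for line lasers (an assumed technique, offered only as a sketch) is to triangulate the points where the beam crosses the two known, non-planar calibration surfaces and then fit a plane to those points:

```python
import numpy as np

def fit_laser_plane(points):
    """Least-squares plane through an (N, 3) array of laser points.
    Returns a unit normal and a point on the plane (the centroid)."""
    centroid = points.mean(axis=0)
    # The singular vector for the smallest singular value is the
    # direction of least variance, i.e. the plane normal.
    _, _, Vt = np.linalg.svd(points - centroid)
    return Vt[-1], centroid
```

Once the laser plane is known in camera coordinates, each pixel illuminated by the laser during a scan can be back-projected onto that plane to recover a three-dimensional point.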
  • As shown in step 636, the method 600 may include removing the calibration component from the turntable so that a scanning volume is available for a scan target.
  • As shown in step 638, the method 600 may include capturing a scan of an object with the calibrated camera, the calibrated turntable, and the calibrated laser.
  • FIG. 7 shows a user interface 700 for automatically selecting three-dimensional scan parameters. When operating a three-dimensional scanner such as a scanner with a turntable, laser, and camera as described herein, the optical properties of a scan target can significantly influence scan results. However, a user may not be able to readily balance, e.g., laser output parameters and camera exposure parameters to achieve the best results. To assist a user in selecting the best parameters, a semi-automated process may be provided that permits a user to specify various optical properties such as shade, color and surface texture. The scanner (or other processing circuitry associated with the scanner, such as a locally coupled computer) may then automatically select specific operating parameters for the scanner components based on the user-provided description of an object's optical properties. While the user interface 700 of FIG. 7 specifically depicts a user selection of one of three possible shades (light, medium, dark), it will be understood that any other user-perceptible optical characteristics may also or instead be used including without limitation surface texture, opacity, transparency, glossiness, and so forth.
  • FIG. 8 shows a method for automatically selecting three-dimensional scan parameters. In general, the method 800 may include receiving user selections of various optical properties, and then adjusting specific system parameters according to the user-provided information.
  • As shown in step 802, the method 800 may begin with providing a first prompt in a user interface configured to receive a user input selecting a color gradient that best matches an object to be scanned by the three-dimensional scanner. The color gradient may, for example, include a shade selected from a group consisting of light, medium, and dark, or any other suitable categories at any desired level of granularity or detail.
  • As shown in step 804, the method 800 may include providing manual decision support to a user in the user interface. A decision may be assisted with any of a variety of visual aids for the user. For example, with a color gradient of two or more shades, this step may include displaying examples of each of the two or more shades to assist the user in selecting the color gradient at the first prompt. This step may further include displaying a video feed of the object to be scanned for a direct, on-screen comparison of the object to the examples within the user interface.
  • As shown in step 806, the method 800 may also or instead include providing automated decision support to a user. For example, this may include capturing an image of the object that is to be scanned, and performing a comparison of the image to a number of images of previously scanned objects using, for example, any of a variety of similarity measures or the like. In a semi-automated mode, the method may include providing relevant information to a user, such as by presenting a selection of a color gradient or surface characteristic that resulted in a successful scan of one or more of the previously scanned objects. Alternatively, the method may proceed fully automatically, e.g., by automatically selecting a color gradient or a surface characteristic for a scan based on the comparison when one of the number of images appears to include closely corresponding optical properties.
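The text names no particular similarity measure for comparing the captured image against previously scanned objects. One simple option (an assumption for illustration, not the claimed measure) is histogram intersection over grayscale intensities:

```python
import numpy as np

def histogram_similarity(img_a, img_b, bins=32):
    """Histogram intersection between two grayscale images (values
    0-255). Returns 1.0 for identical intensity distributions,
    lower values for dissimilar ones."""
    ha, _ = np.histogram(img_a, bins=bins, range=(0, 256))
    hb, _ = np.histogram(img_b, bins=bins, range=(0, 256))
    ha = ha / ha.sum()
    hb = hb / hb.sum()
    return float(np.minimum(ha, hb).sum())

def best_match(query, library):
    """Index of the previously scanned image most similar to the query."""
    scores = [histogram_similarity(query, img) for img in library]
    return int(np.argmax(scores))
```

The scan parameters that worked for the best-matching previous object could then be suggested to the user, or applied automatically in the fully automatic mode.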
  • As shown in step 808, the method 800 may include providing a second prompt in the user interface configured to receive a user input selecting a surface characteristic that best matches a surface of the object to be scanned, the surface characteristic including at least one of glossiness, fuzziness, and texture.
  • As shown in step 810, the method 800 may include adjusting camera parameters based on the color gradient and the surface characteristic thereby adjusting an exposure of the camera. For example, this may include adjusting a shutter speed and a lens aperture of the camera to suitable selections best matched to the characteristics of the object. For example, where the color gradient is light, the camera may be responsively adjusted to a lower exposure. Where the color gradient is dark, the camera may be responsively adjusted to a higher exposure. In another aspect, a fixed exposure may be maintained independent of the color gradient, but the exposure may vary in response to other factors such as a color composition or surface texture.
  • As shown in step 812, the method 800 may include adjusting an intensity of the laser based on the color gradient and the surface characteristic. For example, where the color gradient is light, the laser may be responsively adjusted to a higher intensity. Where the color gradient is dark, the laser may be responsively adjusted to a lower intensity. In another aspect, a fixed laser intensity may be maintained independent of the color gradient, but the laser intensity may vary in response to other factors such as a color composition or surface texture.
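The two adjustment steps above can be sketched as a simple lookup. The numeric presets below are hypothetical; the text gives only the relationships (a light shade maps to a lower exposure and a higher laser intensity, a dark shade to the reverse):

```python
# Hypothetical presets: shade -> (shutter_ms, laser_intensity_pct).
SHADE_PRESETS = {
    "light":  (8.0, 90),
    "medium": (16.0, 60),
    "dark":   (33.0, 35),
}

def select_scan_parameters(shade, glossy=False):
    """Map a user-selected shade (and an optional glossiness flag) to
    camera exposure and laser intensity settings."""
    shutter_ms, laser_pct = SHADE_PRESETS[shade]
    if glossy:
        # Assumed tweak: back the laser off for specular surfaces.
        laser_pct = max(10, laser_pct - 15)
    return {"shutter_ms": shutter_ms, "laser_intensity_pct": laser_pct}
```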
  • As shown in step 814, the method 800 may include scanning an object using the adjusted laser and camera parameters.
  • The above systems, devices, methods, processes, and the like may be realized in hardware, software, or any combination of these suitable for the control, data acquisition, and data processing described herein. This includes realization in one or more microprocessors, microcontrollers, embedded microcontrollers, programmable digital signal processors or other programmable devices or processing circuitry, along with internal and/or external memory. This may also, or instead, include one or more application specific integrated circuits, programmable gate arrays, programmable array logic components, or any other device or devices that may be configured to process electronic signals. It will further be appreciated that a realization of the processes or devices described above may include computer-executable code created using a structured programming language such as C, an object oriented programming language such as C++, or any other high-level or low-level programming language (including assembly languages, hardware description languages, and database programming languages and technologies) that may be stored, compiled or interpreted to run on one of the above devices, as well as heterogeneous combinations of processors, processor architectures, or combinations of different hardware and software. At the same time, processing may be distributed across devices such as the various systems described above, or all of the functionality may be integrated into a dedicated, standalone device. All such permutations and combinations are intended to fall within the scope of the present disclosure.
  • Embodiments disclosed herein may include computer program products comprising computer-executable code or computer-usable code that, when executing on one or more computing devices, performs any and/or all of the steps of the control systems described above. The code may be stored in a non-transitory fashion in a computer memory, which may be a memory from which the program executes (such as random access memory associated with a processor), or a storage device such as a disk drive, flash memory or any other optical, electromagnetic, magnetic, infrared or other device or combination of devices. In another aspect, any of the control systems described above may be embodied in any suitable transmission or propagation medium carrying computer-executable code and/or any inputs or outputs from same.
  • It will be appreciated that the devices, systems, and methods described above are set forth by way of example and not of limitation. Numerous variations, additions, omissions, and other modifications will be apparent to one of ordinary skill in the art. In addition, the order or presentation of method steps in the description and drawings above is not intended to require this order of performing the recited steps unless a particular order is expressly required or otherwise clear from the context.
  • The method steps of the implementations described herein are intended to include any suitable method of causing such method steps to be performed, consistent with the patentability of the following claims, unless a different meaning is expressly provided or otherwise clear from the context. So for example performing the step of X includes any suitable method for causing another party such as a remote user, a remote processing resource (e.g., a server or cloud computer) or a machine to perform the step of X. Similarly, performing steps X, Y and Z may include any method of directing or controlling any combination of such other individuals or resources to perform steps X, Y and Z to obtain the benefit of such steps. Thus method steps of the implementations described herein are intended to include any suitable method of causing one or more other parties or entities to perform the steps, consistent with the patentability of the following claims, unless a different meaning is expressly provided or otherwise clear from the context. Such parties or entities need not be under the direction or control of any other party or entity, and need not be located within a particular jurisdiction.
  • It will be appreciated that the methods and systems described above are set forth by way of example and not of limitation. Numerous variations, additions, omissions, and other modifications will be apparent to one of ordinary skill in the art. In addition, the order or presentation of method steps in the description and drawings above is not intended to require this order of performing the recited steps unless a particular order is expressly required or otherwise clear from the context. Thus, while particular embodiments have been shown and described, it will be apparent to those skilled in the art that various changes and modifications in form and details may be made therein without departing from the spirit and scope of this disclosure and are intended to form a part of the invention as defined by the following claims, which are to be interpreted in the broadest sense allowable by law.

Claims (20)

What is claimed is:
1. A method for calibrating a three-dimensional scanner comprising:
receiving a user input from a user through a user interface of a three-dimensional scanner, the user input including a request to initiate calibration of the three-dimensional scanner, the three-dimensional scanner including a turntable, a laser, and a camera;
providing information to the user for positioning a calibration component on the turntable in a first position for camera calibration, the calibration component including a plurality of surfaces, wherein at least two of the plurality of surfaces include calibration patterns;
receiving an indication that the calibration component is properly positioned for camera calibration;
rotating the turntable about a rotation axis thereby rotating the calibration component;
capturing images of the calibration component on the turntable with the camera as the turntable is rotating, thereby providing a first plurality of images;
performing a first calibration calculation with the first plurality of images to calibrate the camera, thereby providing a calibrated camera;
providing information to the user for positioning the calibration component on the turntable in a second position for turntable calibration;
receiving an indication that the calibration component is properly positioned for turntable calibration;
rotating the turntable about the rotation axis thereby rotating the calibration component;
capturing a second plurality of images of the calibration pattern included on at least one of the plurality of surfaces of the calibration component using the calibrated camera;
determining locations of predetermined points on the calibration pattern using the captured images;
determining a rotational position of the rotation axis of the turntable with respect to the camera based upon the locations of the predetermined points, thereby providing a calibrated turntable;
providing information to the user for positioning the calibration component on the turntable in a third position for laser calibration, the third position including the calibration component oriented such that the calibration patterns of the at least two of the plurality of surfaces are non-planar with respect to each other and are disposed in a field of view of the calibrated camera;
receiving an indication from the user that the calibration component is properly positioned for laser calibration;
directing a beam of the laser on the calibration patterns of the calibration component in the field of view of the calibrated camera;
capturing a third plurality of images of the beam on the calibration patterns of the calibration component; and
performing a calibration calculation for the laser based on the third plurality of images, thereby providing a calibrated laser.
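The camera-calibration calculation in claim 1 recovers the camera's intrinsic parameters from the checkerboard images. The claims do not name an algorithm; a common choice is Zhang's method, as implemented by OpenCV's `cv2.calibrateCamera`. The minimal sketch below, with assumed intrinsic values (not from the patent), shows the pinhole model that such a calibration estimates:

```python
import numpy as np

# Assumed intrinsics of the kind a camera calibration would estimate:
# focal lengths (fx, fy) and principal point (cx, cy), in pixels.
fx, fy, cx, cy = 800.0, 800.0, 320.0, 240.0
K = np.array([[fx, 0.0, cx],
              [0.0, fy, cy],
              [0.0, 0.0, 1.0]])

def project(K, point_3d):
    """Project a 3D point in camera coordinates to pixel coordinates."""
    p = K @ point_3d
    return p[:2] / p[2]

# A point on the optical axis projects to the principal point (cx, cy).
pixel = project(K, np.array([0.0, 0.0, 2.0]))
```

Calibrating against images of the known checkerboard amounts to choosing the `K` (and distortion terms) that best reproduce the observed corner pixels under this model.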
2. The method of claim 1 further comprising capturing a scan of an object with the calibrated camera, the calibrated turntable, and the calibrated laser.
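Once camera, turntable, and laser are calibrated as in claim 1, the scan of claim 2 can triangulate each lit pixel by intersecting its viewing ray with the known laser plane. A minimal sketch of that intersection (the plane and ray values are illustrative, not from the patent):

```python
import numpy as np

def intersect_ray_plane(ray_dir, plane_point, plane_normal):
    """Intersect a camera ray from the origin with the laser plane.

    Returns the 3D point where the ray meets the plane."""
    ray_dir = np.asarray(ray_dir, dtype=float)
    plane_point = np.asarray(plane_point, dtype=float)
    plane_normal = np.asarray(plane_normal, dtype=float)
    denom = np.dot(plane_normal, ray_dir)
    if abs(denom) < 1e-12:
        raise ValueError("ray is parallel to the laser plane")
    t = np.dot(plane_normal, plane_point) / denom
    return t * ray_dir

# Example: laser plane x = 1 (normal along +x), ray toward (1, 0, 1).
point = intersect_ray_plane([1.0, 0.0, 1.0], [1.0, 0.0, 0.0], [1.0, 0.0, 0.0])
```

Repeating this for every laser pixel at every turntable angle, then rotating the points by the known table angle, yields the full point cloud of the object.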
3. The method of claim 1 wherein the calibration pattern includes at least one of a checkerboard pattern, a dot grid, and a line grid.
4. The method of claim 1 wherein the calibration pattern includes a predetermined three-dimensional shape of the calibration component.
5. The method of claim 1 wherein the predetermined points are corners of the calibration pattern.
6. The method of claim 1 wherein determining the rotational position of the rotation axis of the turntable with respect to the camera includes computing centers for circles created by rotation of the predetermined points of the calibration pattern about the rotation axis and averaging the centers to determine an average center representing the rotational position of the rotation axis.
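The turntable calibration of claim 6 can be sketched as follows: each tracked calibration point traces a circle as the turntable rotates, so fitting each circle's center and averaging the centers locates the rotation axis. A minimal numpy sketch on synthetic data, using an algebraic (Kasa) least-squares circle fit (the patent does not specify a particular fitting method):

```python
import numpy as np

def fit_circle_center(pts):
    """Algebraic (Kasa) least-squares circle fit to 2D points.

    Solves 2*a*x + 2*b*y + c = x^2 + y^2; the center is (a, b)."""
    A = np.column_stack([2 * pts[:, 0], 2 * pts[:, 1], np.ones(len(pts))])
    rhs = (pts ** 2).sum(axis=1)
    sol, *_ = np.linalg.lstsq(A, rhs, rcond=None)
    return sol[:2]

def rotation_axis_center(tracked_points):
    """Average the fitted circle centers of several tracked points."""
    return np.mean([fit_circle_center(p) for p in tracked_points], axis=0)

# Synthetic data: two pattern points at radii 1.0 and 2.5 circling
# a rotation axis located at (5, 3).
angles = np.linspace(0, 2 * np.pi, 36, endpoint=False)
axis = np.array([5.0, 3.0])
tracks = [axis + r * np.column_stack([np.cos(angles), np.sin(angles)])
          for r in (1.0, 2.5)]
center = rotation_axis_center(tracks)
```

Averaging over many tracked points, as the claim recites, suppresses per-point detection noise in the recovered axis position.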
7. The method of claim 1 wherein the plurality of surfaces include at least three panels and at least two faces, wherein the at least three panels include the calibration patterns.
8. The method of claim 7 wherein at least one of the at least three panels is removable and wherein laser calibration further includes removing the removable panel.
9. The method of claim 7 wherein a face occludes the calibration pattern included on at least one of the at least three panels.
10. The method of claim 1 wherein the calibration component includes a tab configured to couple the calibration component to the turntable.
11. The method of claim 1 wherein the calibration pattern includes a checkerboard, the method further comprising using computer vision to determine the corners of the checkerboard.
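In practice, the computer-vision step of claim 11 is typically handled by an off-the-shelf detector such as OpenCV's `cv2.findChessboardCorners`. To illustrate the underlying idea only, the toy sketch below finds the interior corners of an ideal synthetic checkerboard by looking for 2x2 neighborhoods with the diagonal light/dark pattern; a real detector must additionally handle noise, blur, lighting, and perspective:

```python
import numpy as np

def make_checkerboard(n_squares, square_px):
    """Ideal binary checkerboard image of n_squares x n_squares squares."""
    idx = np.arange(n_squares * square_px) // square_px
    return (idx[:, None] + idx[None, :]) % 2

def find_interior_corners(img):
    """Return (x, y) pixel locations where four squares meet."""
    corners = []
    h, w = img.shape
    for y in range(1, h):
        for x in range(1, w):
            a, b = img[y - 1, x - 1], img[y - 1, x]
            c, d = img[y, x - 1], img[y, x]
            # Diagonal pattern: equal diagonals, opposite shades.
            if a == d and b == c and a != b:
                corners.append((x, y))
    return corners

# A 4x4-square board has 3x3 = 9 interior corners.
corners = find_interior_corners(make_checkerboard(4, 8))
```

The detected corner grid then provides the correspondences between known board geometry and image pixels that the calibration calculations consume.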
12. The method of claim 1 further comprising removing the calibration component from the turntable.
13. The method of claim 1 wherein receiving an indication that the calibration component is properly positioned for camera calibration includes receiving a manual user input.
14. The method of claim 1 wherein receiving an indication that the calibration component is properly positioned for camera calibration includes receiving an input from a computer vision system.
15. The method of claim 1 wherein receiving an indication that the calibration component is properly positioned for turntable calibration includes receiving a manual user input.
16. The method of claim 1 wherein receiving an indication that the calibration component is properly positioned for turntable calibration includes receiving an input from a computer vision system.
17. A computer program product for calibrating a three-dimensional scanner, the computer program product comprising computer executable code embodied in a non-transitory computer readable medium that, when executing on a three-dimensional scanner, performs the steps of:
receiving a user input from a user through a user interface of a three-dimensional scanner, the user input including a request to initiate calibration of the three-dimensional scanner, the three-dimensional scanner including a turntable, a laser, and a camera;
providing information to the user for positioning a calibration component on the turntable in a first position for camera calibration, the calibration component including a plurality of surfaces, wherein at least two of the plurality of surfaces include calibration patterns;
receiving an indication that the calibration component is properly positioned for camera calibration;
rotating the turntable about a rotation axis thereby rotating the calibration component;
capturing images of the calibration component on the turntable with the camera as the turntable is rotating, thereby providing a first plurality of images;
performing a first calibration calculation with the first plurality of images to calibrate the camera, thereby providing a calibrated camera;
providing information to the user for positioning the calibration component on the turntable in a second position for turntable calibration;
receiving an indication that the calibration component is properly positioned for turntable calibration;
rotating the turntable about the rotation axis thereby rotating the calibration component;
capturing a second plurality of images of the calibration pattern included on at least one of the plurality of surfaces of the calibration component using the calibrated camera;
determining locations of predetermined points on the calibration pattern using the captured images;
determining a rotational position of the rotation axis of the turntable with respect to the camera based upon the locations of the predetermined points, thereby providing a calibrated turntable;
providing information to the user for positioning the calibration component on the turntable in a third position for laser calibration, the third position including the calibration component oriented such that the calibration patterns of the at least two of the plurality of surfaces are non-planar with respect to each other and are disposed in a field of view of the calibrated camera;
receiving an indication from the user that the calibration component is properly positioned for laser calibration;
directing a beam of the laser on the calibration patterns of the calibration component in the field of view of the calibrated camera;
capturing a third plurality of images of the beam on the calibration patterns of the calibration component; and
performing a calibration calculation for the laser based on the third plurality of images, thereby providing a calibrated laser.
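The laser-calibration calculation recited in claims 1 and 17 can be sketched as a plane fit: with the camera calibrated, the imaged beam points on the two non-coplanar calibration patterns give 3D samples of the laser sheet, whose plane follows from a least-squares fit. A minimal SVD-based sketch on synthetic points (the plane here is made up for illustration; the patent does not specify the fitting method):

```python
import numpy as np

def fit_plane(points):
    """Least-squares plane fit: returns (centroid, unit normal).

    The normal is the right singular vector with the smallest
    singular value of the centered point cloud."""
    points = np.asarray(points, dtype=float)
    centroid = points.mean(axis=0)
    _, _, vt = np.linalg.svd(points - centroid)
    return centroid, vt[-1]

# Synthetic beam samples lying on the plane z = 0.5*x + 1,
# spread across two non-coplanar target surfaces.
x, y = np.meshgrid(np.linspace(-1, 1, 5), np.linspace(-1, 1, 5))
pts = np.column_stack([x.ravel(), y.ravel(), 0.5 * x.ravel() + 1.0])
centroid, normal = fit_plane(pts)
```

Requiring the two patterns to be non-planar, as the claims do, guarantees the sampled points span the laser sheet in two independent directions, so the fit is well conditioned.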
18. The computer program product of claim 17 further comprising code that performs the step of capturing a scan of an object with the calibrated camera, the calibrated turntable, and the calibrated laser.
19. The computer program product of claim 17 wherein the calibration pattern includes at least one of a checkerboard pattern, a dot grid, and a line grid.
20. A device comprising a three-dimensional scanner and processing circuitry programmed to perform the steps of:
receiving a user input from a user through a user interface of a three-dimensional scanner, the user input including a request to initiate calibration of the three-dimensional scanner, the three-dimensional scanner including a turntable, a laser, and a camera;
providing information to the user for positioning a calibration component on the turntable in a first position for camera calibration, the calibration component including a plurality of surfaces, wherein at least two of the plurality of surfaces include calibration patterns;
receiving an indication that the calibration component is properly positioned for camera calibration;
rotating the turntable about a rotation axis thereby rotating the calibration component;
capturing images of the calibration component on the turntable with the camera as the turntable is rotating, thereby providing a first plurality of images;
performing a first calibration calculation with the first plurality of images to calibrate the camera, thereby providing a calibrated camera;
providing information to the user for positioning the calibration component on the turntable in a second position for turntable calibration;
receiving an indication that the calibration component is properly positioned for turntable calibration;
rotating the turntable about the rotation axis thereby rotating the calibration component;
capturing a second plurality of images of the calibration pattern included on at least one of the plurality of surfaces of the calibration component using the calibrated camera;
determining locations of predetermined points on the calibration pattern using the captured images;
determining a rotational position of the rotation axis of the turntable with respect to the camera based upon the locations of the predetermined points, thereby providing a calibrated turntable;
providing information to the user for positioning the calibration component on the turntable in a third position for laser calibration, the third position including the calibration component oriented such that the calibration patterns of the at least two of the plurality of surfaces are non-planar with respect to each other and are disposed in a field of view of the calibrated camera;
receiving an indication from the user that the calibration component is properly positioned for laser calibration;
directing a beam of the laser on the calibration patterns of the calibration component in the field of view of the calibrated camera;
capturing a third plurality of images of the beam on the calibration patterns of the calibration component; and
performing a calibration calculation for the laser based on the third plurality of images, thereby providing a calibrated laser.
US14/456,052 2013-08-09 2014-08-11 Laser scanning systems and methods Abandoned US20150042757A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US14/456,052 US20150042757A1 (en) 2013-08-09 2014-08-11 Laser scanning systems and methods

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US201361864158P 2013-08-09 2013-08-09
US201361875360P 2013-09-09 2013-09-09
US201361906171P 2013-11-19 2013-11-19
US14/456,052 US20150042757A1 (en) 2013-08-09 2014-08-11 Laser scanning systems and methods

Publications (1)

Publication Number Publication Date
US20150042757A1 true US20150042757A1 (en) 2015-02-12

Family

ID=52448283

Family Applications (3)

Application Number Title Priority Date Filing Date
US14/456,091 Abandoned US20150042758A1 (en) 2013-08-09 2014-08-11 Laser scanning systems and methods
US14/456,052 Abandoned US20150042757A1 (en) 2013-08-09 2014-08-11 Laser scanning systems and methods
US14/456,010 Active 2034-10-24 US9418424B2 (en) 2013-08-09 2014-08-11 Laser scanning systems and methods

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US14/456,091 Abandoned US20150042758A1 (en) 2013-08-09 2014-08-11 Laser scanning systems and methods

Family Applications After (1)

Application Number Title Priority Date Filing Date
US14/456,010 Active 2034-10-24 US9418424B2 (en) 2013-08-09 2014-08-11 Laser scanning systems and methods

Country Status (1)

Country Link
US (3) US20150042758A1 (en)


Families Citing this family (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150138320A1 (en) * 2013-11-21 2015-05-21 Antoine El Daher High Accuracy Automated 3D Scanner With Efficient Scanning Pattern
USD737347S1 (en) * 2014-01-05 2015-08-25 Makerbot Industries, Llc Filament spool holder for three-dimensional printer
USD810747S1 (en) * 2015-06-01 2018-02-20 Xyzprinting, Inc. Three dimensional scanning device
FR3043194B1 (en) * 2015-11-02 2019-04-19 Mesure-Systems3D THREE-DIMENSIONAL CONTACTLESS CONTROL DEVICE FOR TURBOMACHINE, ESPECIALLY FOR AIRCRAFT REACTOR OR TURBINE
US10122997B1 (en) 2017-05-03 2018-11-06 Lowe's Companies, Inc. Automated matrix photo framing using range camera input
US10600203B2 (en) 2017-06-06 2020-03-24 CapSen Robotics, Inc. Three-dimensional scanner with detector pose identification
CN108344360B (en) * 2017-11-15 2020-03-31 北京航空航天大学 Laser scanning type global calibration device and method for vision measurement system
USD900177S1 (en) 2019-03-19 2020-10-27 Makerbot Industries, Llc Drawer for a three-dimensional printer
CN109945948B (en) * 2019-03-27 2021-01-19 昆山福烨电子有限公司 Self-positioning resistance repairing jig
CN111355894B (en) * 2020-04-14 2021-09-03 长春理工大学 Novel self-calibration laser scanning projection system
CN112254638B (en) * 2020-10-15 2022-08-12 天目爱视(北京)科技有限公司 Intelligent visual 3D information acquisition equipment that every single move was adjusted
CN112729156A (en) * 2020-12-24 2021-04-30 上海智能制造功能平台有限公司 Data splicing and system calibration method of human body digital measuring device

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030202691A1 (en) * 2002-04-24 2003-10-30 Paul Beardsley Calibration of multiple cameras for a turntable-based 3D scanner
US20030231174A1 (en) * 2002-06-17 2003-12-18 Wojciech Matusik Modeling and rendering of surface reflectance fields of 3D objects
US20050068544A1 (en) * 2003-09-25 2005-03-31 Gunter Doemens Panoramic scanner
US20050068523A1 (en) * 2003-08-11 2005-03-31 Multi-Dimension Technology, Llc Calibration block and method for 3D scanner
WO2008108749A1 (en) * 2006-01-20 2008-09-12 Nextpat Limited Desktop three-dimensional scanner
US20080246757A1 (en) * 2005-04-25 2008-10-09 Masahiro Ito 3D Image Generation and Display System
US20090097039A1 (en) * 2005-05-12 2009-04-16 Technodream21, Inc. 3-Dimensional Shape Measuring Method and Device Thereof
US20090273792A1 (en) * 2008-04-21 2009-11-05 Max-Planck Gesellschaft Zur Forderung Der Wissenschaften E.V. Robust three-dimensional shape acquisition method and system

Family Cites Families (45)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4773164A (en) * 1987-06-08 1988-09-27 Tayco Developments, Inc. Self-aligning caliber bar
US4825258A (en) * 1988-01-04 1989-04-25 Whitson John M Device for bore alignment of gun sights
US5745808A (en) * 1995-08-21 1998-04-28 Eastman Kodak Company Camera exposure control system using variable-length exposure tables
US5988862A (en) * 1996-04-24 1999-11-23 Cyra Technologies, Inc. Integrated system for quickly and accurately imaging and modeling three dimensional objects
US5852493A (en) * 1997-03-13 1998-12-22 Spectra Precision, Inc. Self-aligning laser transmitter having a dual slope grade mechanism
US6420698B1 (en) * 1997-04-24 2002-07-16 Cyra Technologies, Inc. Integrated system for quickly and accurately imaging and modeling three-dimensional objects
US6195902B1 (en) * 1998-08-04 2001-03-06 Quarton, Inc. Laser extender
US20020170190A1 (en) * 2001-05-17 2002-11-21 Wetterlind David A. Spotting device
AU2003225622A1 (en) 2002-03-01 2003-09-16 American Tool Companies, Inc. Manual leveling rotating laser with swivel head
GB2390792B (en) * 2002-07-08 2005-08-31 Vision Rt Ltd Image processing system for use with a patient positioning device
US6739062B2 (en) * 2002-08-09 2004-05-25 Quarton, Inc Universal angle means
GB0301775D0 (en) * 2003-01-25 2003-02-26 Wilson John E Device and method for 3Dimaging
JP4112469B2 (en) * 2003-10-07 2008-07-02 オリンパス株式会社 Multiband camera control apparatus and control method
US20060070251A1 (en) * 2003-11-12 2006-04-06 Shuming Wu Line-marking device with rotatable laser
US7323670B2 (en) * 2004-03-16 2008-01-29 Leica Geosystems Hds Llc Laser operation for survey instruments
US7315383B1 (en) * 2004-07-09 2008-01-01 Mohsen Abdollahi Scanning 3D measurement technique using structured lighting and high-speed CMOS imager
KR20060038681A (en) * 2004-11-01 2006-05-04 삼성테크윈 주식회사 Apparatus and method for removing hot pixel in digital camera
US7467162B2 (en) * 2005-06-27 2008-12-16 Microsoft Corporation Pre-configured settings for portable devices
US7467474B1 (en) * 2005-09-19 2008-12-23 Statham Jay P Method and apparatus for pipe alignment tool
JP4755532B2 (en) * 2006-05-26 2011-08-24 株式会社リコー Image forming apparatus
US8488895B2 (en) * 2006-05-31 2013-07-16 Indiana University Research And Technology Corp. Laser scanning digital camera with pupil periphery illumination and potential for multiply scattered light imaging
CN100553873C (en) * 2006-11-14 2009-10-28 力山工业股份有限公司 The laser locating apparatus that is used for drilling machine
US7793423B2 (en) * 2007-07-30 2010-09-14 Joey Lee Loftis Piping alignment tool
US20140028830A1 (en) * 2008-12-02 2014-01-30 Kevin Kieffer Deployable devices and methods of deploying devices
US8417135B2 (en) * 2009-05-12 2013-04-09 Xerox Corporation Methods to control appearance of gloss levels for printed text and images
US20100289910A1 (en) * 2009-05-15 2010-11-18 Moran Research And Consulting, Inc. Method and apparatus for remote camera control
US10769412B2 (en) * 2009-05-18 2020-09-08 Mark Thompson Mug shot acquisition system
JP5451418B2 (en) * 2010-01-22 2014-03-26 キヤノン株式会社 Image forming apparatus
US8307562B2 (en) 2010-04-29 2012-11-13 Black & Decker Inc. Laser line generator having three intersecting light planes
US8487955B2 (en) * 2010-06-30 2013-07-16 Xerox Corporation Language-based color editing for mobile devices
JP5606193B2 (en) * 2010-07-14 2014-10-15 キヤノン株式会社 Image forming apparatus
US9671094B2 (en) 2010-07-22 2017-06-06 Renishaw Plc Laser scanning apparatus and method of use
US8625021B2 (en) * 2010-08-30 2014-01-07 Canon Kabushiki Kaisha Image capture with region-based adjustment of imaging properties
GB201102794D0 (en) * 2011-02-17 2011-03-30 Metail Ltd Online retail system
US9127935B2 (en) * 2012-01-04 2015-09-08 Chris Olexa Laser centering tool for surface areas
JP5854848B2 (en) * 2012-01-10 2016-02-09 キヤノン株式会社 IMAGING DEVICE, IMAGING DEVICE CONTROL METHOD, PROGRAM, AND STORAGE MEDIUM
US9036888B2 (en) * 2012-04-30 2015-05-19 General Electric Company Systems and methods for performing quality review scoring of biomarkers and image analysis methods for biological tissue
US8964089B2 (en) * 2012-05-09 2015-02-24 Canon Kabushiki Kaisha Systems and methods for simulated preview for preferred image exposure
JP2013239861A (en) * 2012-05-14 2013-11-28 Freebit Co Ltd Image capturing system
JP5814865B2 (en) * 2012-06-20 2015-11-17 株式会社 日立産業制御ソリューションズ Imaging device
US20140111670A1 (en) * 2012-10-23 2014-04-24 Nvidia Corporation System and method for enhanced image capture
JP6091864B2 (en) * 2012-11-27 2017-03-08 株式会社キーエンス Shape measuring device, shape measuring method, and shape measuring program
US9536345B2 (en) * 2012-12-26 2017-01-03 Intel Corporation Apparatus for enhancement of 3-D images using depth mapping and light source synthesis
US9264630B2 (en) * 2013-01-04 2016-02-16 Nokia Technologies Oy Method and apparatus for creating exposure effects using an optical image stabilizing device
WO2015026847A1 (en) * 2013-08-19 2015-02-26 Aio Robotics, Inc. Four-in-one three-dimensional copy machine


Non-Patent Citations (5)

* Cited by examiner, † Cited by third party
Title
Colombo et al., A Desktop 3D Scanner Exploiting Rotation and Visual Rectification of Laser Profiles, 2006, IEEE, pp. 1-6. *
Liu et al., Strategy for automatic and complete three-dimensional optical digitization, 2012, Opt Lett., pp. 1-11. *
Moreno et al., Simple, Accurate, and Robust Projector-Camera Calibration, 2012, IEEE, pp. 464-471. *
Sadlo et al., A Practical Structured Light Acquisition System for Point-Based Geometry and Texture, 2005, Proceedings of the Eurographics Symposium on Point-Based Graphics, pp. 1-10. *
Yemez et al., 3D reconstruction of real objects with high resolution shape and texture, 2004, Elsevier, pp. 1137-1153. *

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160212412A1 (en) * 2013-09-18 2016-07-21 Matter and Form Inc. Device, system and method for three-dimensional modeling
US9900585B2 (en) * 2013-09-18 2018-02-20 Matter and Form Inc. Device, system and method for three-dimensional modeling
US10583354B2 (en) 2014-06-06 2020-03-10 Lego A/S Interactive game apparatus and toy construction system
US10646780B2 (en) 2014-10-02 2020-05-12 Lego A/S Game system
CN108140247A (en) * 2015-10-05 2018-06-08 谷歌有限责任公司 Use the camera calibrated of composograph
US10600240B2 (en) 2016-04-01 2020-03-24 Lego A/S Toy scanner
CN108389233A (en) * 2018-02-23 2018-08-10 大连理工大学 The laser scanner and camera calibration method approached based on boundary constraint and mean value
CN109493388A (en) * 2018-09-30 2019-03-19 先临三维科技股份有限公司 Rotating axis calibration method, device, computer equipment and storage medium
WO2022078421A1 (en) * 2020-10-15 2022-04-21 左忠斌 Multi-pitch-angle intelligent visual 3d information collection device

Also Published As

Publication number Publication date
US20150042758A1 (en) 2015-02-12
US20150043225A1 (en) 2015-02-12
US9418424B2 (en) 2016-08-16


Legal Events

Date Code Title Description
AS Assignment

Owner name: MAKERBOT INDUSTRIES, LLC, NEW YORK

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:GOODMAN, TAYLOR S.;ANANTHA, VISHNU;MCCALLUM, BENJAMIN R.;AND OTHERS;SIGNING DATES FROM 20141017 TO 20150102;REEL/FRAME:034743/0493

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION