US20140043442A1 - Mobile device accessory for three-dimensional scanning - Google Patents
- Publication number
- US20140043442A1 (U.S. application Ser. No. 13/737,579)
- Authority
- United States (US)
- Prior art keywords
- camera
- dimensional
- housing
- computing device
- mobile computing
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
- H04N13/02
- B29C64/106—Processes of additive manufacturing using only liquids or viscous materials, e.g. depositing a continuous bead of viscous material
- B29C64/118—Processes of additive manufacturing using filamentary material being melted, e.g. fused deposition modelling [FDM]
- B33Y40/00—Auxiliary operations or equipment, e.g. for material handling
- H04N1/00278—Connection or combination of a still picture apparatus with a printing apparatus, e.g. a laser beam printer
- H04N13/20—Image signal generators
- H04N13/211—Image signal generators using stereoscopic image cameras using a single 2D image sensor using temporal multiplexing
- H04N13/239—Image signal generators using stereoscopic image cameras using two 2D image sensors having a relative position equal to or related to the interocular distance
- H04N13/254—Image signal generators using stereoscopic image cameras in combination with electromagnetic radiation sources for illuminating objects
Definitions
- an accessory provides additional optics, image sensors, lighting and/or processing to complement preexisting device hardware in support of a variety of three-dimensional imaging techniques.
- FIG. 1 is a block diagram of a three-dimensional printer.
- FIG. 2 shows a networked three-dimensional printing environment.
- FIG. 3 shows a mobile device with an accessory for three-dimensional imaging.
- FIG. 4 is a functional block diagram of an accessory coupled to a mobile device.
- FIG. 5 shows a three-dimensional imaging system with dual optical paths.
- FIG. 6 shows a three-dimensional imaging system with dual optical paths.
- FIG. 7 shows a method for using an accessory to capture three-dimensional images with a mobile device.
- FIG. 8 shows a method for using an accessory to capture three-dimensional images with a mobile device.
- FIG. 9 shows a method for using an accessory to capture three-dimensional images with a mobile device.
- FIG. 1 is a block diagram of a three-dimensional printer.
- the printer 100 may include a build platform 102 , an extruder 106 , an x-y-z positioning assembly 108 , and a controller 110 that cooperate to fabricate an object 112 within a working volume 114 of the printer 100 .
- the build platform 102 may include a surface 116 that is rigid and substantially planar.
- the surface 116 may provide a fixed, dimensionally and positionally stable platform on which to build the object 112 .
- the build platform 102 may include a thermal element 130 that controls the temperature of the build platform 102 through one or more active devices 132 , such as resistive elements that convert electrical current into heat, Peltier effect devices that can create a heating or cooling effect, or any other thermoelectric heating and/or cooling devices.
- the thermal element 130 may be coupled in a communicating relationship with the controller 110 in order for the controller 110 to controllably impart heat to or remove heat from the surface 116 of the build platform 102 .
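The controllable heating and cooling described above can be sketched as a simple hysteresis (bang-bang) control loop. This is a minimal illustration, not the patent's actual control scheme; the function name, band width, and the heat-only simplification are assumptions:

```python
def thermostat_step(current_c, setpoint_c, hysteresis_c=2.0, heater_on=False):
    """One control step of a hypothetical hysteresis thermostat for the
    build-platform thermal element: return True if heat should be applied."""
    if current_c < setpoint_c - hysteresis_c:
        return True   # too cold: energize the heating element
    if current_c > setpoint_c + hysteresis_c:
        return False  # too hot: stop heating (or engage cooling)
    return heater_on  # inside the band: hold the previous state
```

The hysteresis band prevents rapid on/off cycling near the setpoint; a Peltier device could extend this with an active cooling branch.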
- the extruder 106 may include a chamber 122 in an interior thereof to receive a build material.
- the build material may, for example, include acrylonitrile butadiene styrene (“ABS”), high-density polyethylene (“HDPE”), polylactic acid (“PLA”), or any other suitable plastic, thermoplastic, or other material that can usefully be extruded to form a three-dimensional object.
- the extruder 106 may include an extrusion tip 124 or other opening that includes an exit port with a circular, oval, slotted or other cross-sectional profile that extrudes build material in a desired cross-sectional shape.
- the extruder 106 may include a heater 126 (also referred to as a heating element) to melt thermoplastic or other meltable build materials within the chamber 122 for extrusion through an extrusion tip 124 in liquid form. While illustrated in block form, it will be understood that the heater 126 may include, e.g., coils of resistive wire wrapped about the extruder 106 , one or more heating blocks with resistive elements to heat the extruder 106 with applied current, an inductive heater, or any other arrangement of heating elements suitable for creating heat within the chamber 122 sufficient to melt the build material for extrusion.
- the extruder 106 may also or instead include a motor 128 or the like to push the build material into the chamber 122 and/or through the extrusion tip 124 .
- a build material such as ABS plastic in filament form may be fed into the chamber 122 from a spool or the like by the motor 128 , melted by the heater 126 , and extruded from the extrusion tip 124 .
- the build material may be extruded at a controlled volumetric rate. It will be understood that a variety of techniques may also or instead be employed to deliver build material at a controlled volumetric rate, which may depend upon the type of build material, the volumetric rate desired, and any other factors. All such techniques that might be suitably adapted to delivery of build material for fabrication of a three-dimensional object are intended to fall within the scope of this disclosure.
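For a filament-fed extruder like the one described, one common way to achieve a controlled volumetric rate is to set the linear filament feed rate from the filament's cross-sectional area. The sketch below assumes incompressible material and a 1.75 mm filament default; both are illustrative values, not from the patent:

```python
import math

def filament_feed_rate(volumetric_rate_mm3_s, filament_diameter_mm=1.75):
    """Linear filament feed rate (mm/s) that yields a target volumetric
    extrusion rate, assuming the melted material is incompressible."""
    cross_section_mm2 = math.pi * (filament_diameter_mm / 2.0) ** 2
    return volumetric_rate_mm3_s / cross_section_mm2
```

Driving the motor 128 at this feed rate delivers the requested volume per second through the extrusion tip, independent of the tip's cross-sectional shape.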
- the x-y-z positioning assembly 108 may generally be adapted to three-dimensionally position the extruder 106 and the extrusion tip 124 within the working volume 114 .
- the object 112 may be fabricated in three dimensions by depositing successive layers of material in two-dimensional patterns derived, for example, from cross-sections of a computer model or other computerized representation of the object 112 .
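The layer-by-layer deposition described above implies slicing the computer model at successive Z heights, one cross-section per layer. A minimal sketch of the height schedule, with an assumed 0.2 mm layer thickness:

```python
def layer_heights(object_height_mm, layer_mm=0.2):
    """Z heights at which two-dimensional cross-sections of the model are
    taken, one per deposited layer. Layer thickness is an assumed value."""
    n_layers = max(1, round(object_height_mm / layer_mm))
    return [round((i + 1) * layer_mm, 6) for i in range(n_layers)]
```

Each returned height corresponds to one two-dimensional pattern that the positioning assembly traces before stepping to the next layer.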
- a variety of arrangements and techniques are known in the art to achieve controlled linear movement along one or more axes.
- the x-y-z positioning assembly 108 may, for example, include a number of stepper motors 109 to independently control a position of the extruder 106 within the working volume along each of an x-axis, a y-axis, and a z-axis. More generally, the x-y-z positioning assembly 108 may include without limitation various combinations of stepper motors, encoded DC motors, gears, belts, pulleys, worm gears, threads, and so forth.
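Independent stepper control per axis reduces to converting a target coordinate into a step count from the drive geometry. The leadscrew pitch, steps per revolution, and microstepping factor below are illustrative assumptions, not values from the patent:

```python
def mm_to_steps(target_mm, full_steps_per_rev=200, microsteps=16, mm_per_rev=8.0):
    """Step count for a leadscrew-driven axis to reach a position in mm.
    Geometry parameters are hypothetical examples."""
    steps_per_mm = full_steps_per_rev * microsteps / mm_per_rev
    return round(target_mm * steps_per_mm)
```

A belt-and-pulley axis uses the same conversion with `mm_per_rev` set to the pulley circumference.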
- the build platform 102 may be coupled to one or more threaded rods by a threaded nut so that the threaded rods can be rotated to provide z-axis positioning of the build platform 102 relative to the extruder 106 .
- This arrangement may advantageously simplify design and improve accuracy by permitting an x-y positioning mechanism for the extruder 106 to be fixed relative to a build volume. Any such arrangement suitable for controllably positioning the extruder 106 within the working volume 114 may be adapted to use with the printer 100 described herein.
- this may include moving the extruder 106 , or moving the build platform 102 , or some combination of these.
- any reference to moving an extruder relative to a build platform, working volume, or object is intended to include movement of the extruder or movement of the build platform, or both, unless a more specific meaning is explicitly provided or otherwise clear from the context.
- while an x, y, z coordinate system serves as a convenient basis for positioning within three dimensions, any other coordinate system or combination of coordinate systems may also or instead be employed, such as a positional controller and assembly that operates according to cylindrical or spherical coordinates.
- the controller 110 may be electrically or otherwise coupled in a communicating relationship with the build platform 102 , the x-y-z positioning assembly 108 , and the other various components of the printer 100 .
- the controller 110 is operable to control the components of the printer 100 , such as the build platform 102 , the x-y-z positioning assembly 108 , and any other components of the printer 100 described herein to fabricate the object 112 from the build material.
- the controller 110 may include any combination of software and/or processing circuitry suitable for controlling the various components of the printer 100 described herein including without limitation microprocessors, microcontrollers, application-specific integrated circuits, programmable gate arrays, and any other digital and/or analog components, as well as combinations of the foregoing, along with inputs and outputs for transceiving control signals, drive signals, power signals, sensor signals, and so forth.
- this may include circuitry directly and physically associated with the printer 100 such as an on-board processor.
- this may be a processor associated with a personal computer or other computing device coupled to the printer 100 , e.g., through a wired or wireless connection.
- any such processing circuitry, whether local or remote, is intended to fall within the meaning of “controller” or “processor” as used herein, unless a different meaning is explicitly provided or otherwise clear from the context.
- the other hardware 134 may include a temperature sensor positioned to sense a temperature of the surface of the build platform 102 , the extruder 106 , or any other system components. This may, for example, include a thermistor or the like embedded within or attached below the surface of the build platform 102 . This may also or instead include an infrared detector or the like directed at the surface 116 of the build platform 102 .
- the other hardware 134 may include a sensor to detect a presence of the object 112 at a predetermined location.
- This may include an optical detector arranged in a beam-breaking configuration to sense the presence of the object 112 at a predetermined location.
- This may also or instead include an imaging device and image processing circuitry to capture an image of the working volume and to analyze the image to evaluate a position of the object 112 .
- This sensor may be used for example to ensure that the object 112 is removed from the build platform 102 prior to beginning a new build on the working surface 116 .
- the sensor may be used to determine whether an object is present that should not be, or to detect when an object is absent.
- the feedback from this sensor may be used by the controller 110 to issue processing interrupts or otherwise control operation of the printer 100 .
- the other hardware 134 may also or instead include a heating element (instead of or in addition to the thermal element 130 ) to heat the working volume such as a radiant heater or forced hot air heater to maintain the object 112 at a fixed, elevated temperature throughout a build, or the other hardware 134 may include a cooling element to cool the working volume.
- FIG. 2 depicts a networked three-dimensional printing environment.
- the environment 200 may include a data network 202 interconnecting a plurality of participating devices in a communicating relationship.
- the participating devices may, for example, include any number of three-dimensional printers 204 (also referred to interchangeably herein as “printers”), client devices 206 , print servers 208 , content sources 210 , mobile devices 212 , and other resources 216 .
- the data network 202 may be any network(s) or internetwork(s) suitable for communicating data and control information among participants in the environment 200 .
- This may include public networks such as the Internet, private networks, telecommunications networks such as the Public Switched Telephone Network or cellular networks using third generation (e.g., 3G or IMT-2000), fourth generation (e.g., LTE (E-UTRA) or WiMax-Advanced (IEEE 802.16m)) and/or other technologies, as well as any of a variety of corporate area or local area networks and other switches, routers, hubs, gateways, and the like that might be used to carry data among participants in the environment 200 .
- the three-dimensional printers 204 may be any computer-controlled devices for three-dimensional fabrication, including without limitation any of the three-dimensional printers or other fabrication or prototyping devices described above.
- each such device may include a network interface comprising, e.g., a network interface card, which term is used broadly herein to include any hardware (along with software, firmware, or the like to control operation of same) suitable for establishing and maintaining wired and/or wireless communications.
- the network interface card may include without limitation wired Ethernet network interface cards (“NICs”), wireless 802.11 networking cards, wireless 802.11 USB devices, or other hardware for wireless local area networking.
- the network interface may also or instead include cellular network hardware, wide area wireless network hardware or any other hardware for centralized, ad hoc, peer-to-peer, or other radio communications that might be used to carry data.
- the network interface may include a serial or USB port to directly connect to a computing device such as a desktop computer that, in turn, provides more general network connectivity to the data network 202 .
- the printers 204 might be made to fabricate any object, practical or otherwise, that is amenable to fabrication according to each printer's capabilities. This may be a model of a house or a tea cup, as depicted, or any other object such as a bunny, gears or other machine hardware, replications of scanned three-dimensional objects, or fanciful works of art.
- Client devices 206 may be any devices within the environment 200 operated by users to initiate, manage, monitor, or otherwise interact with print jobs at the three-dimensional printers 204 . This may include desktop computers, laptop computers, network computers, tablets, or any other computing device that can participate in the environment 200 as contemplated herein.
- Each client device 206 generally provides a user interface, which may include a graphical user interface, a text or command line interface, a voice-controlled interface, and/or a gesture-based interface to control operation of remote three-dimensional printers 204 .
- the user interface may be maintained by a locally executing application on one of the client devices 206 that receives data and status information from, e.g., the printers 204 and print servers 208 concerning pending or executing print jobs.
- the user interface may create a suitable display on the client device 206 for user interaction.
- the user interface may be remotely served and presented on one of the client devices 206 , such as where a print server 208 or one of the three-dimensional printers 204 includes a web server that provides information through one or more web pages or the like that can be displayed within a web browser or similar client executing on one of the client devices 206 .
- the user interface may include a voice controlled interface that receives spoken commands from a user and/or provides spoken feedback to the user.
- the print servers 208 may include data storage, a network interface, and a processor and/or other processing circuitry. In the following description, where the functions or configuration of a print server 208 are described, this is intended to include corresponding functions or configuration (e.g., by programming) of a processor of the print server 208 . In general, the print servers 208 (or processors thereof) may perform a variety of processing tasks related to management of networked printing. For example, the print servers 208 may manage print jobs received from one or more of the client devices 206 , and provide related supporting functions such as content search and management. A print server 208 may also include a web server that provides web-based access by the client devices 206 to the capabilities of the print server 208 .
- a print server 208 may also communicate periodically with three-dimensional printers 204 in order to obtain status information concerning, e.g., availability of printers and/or the status of particular print jobs, any of which may be subsequently presented to a user through the web server or any other suitable interface.
- a print server 208 may also maintain a list of available three-dimensional printers 204 , and may automatically select one of the three-dimensional printers 204 for a user-submitted print job, or may permit a user to specify a single printer, or a group of preferred printers, for fabricating an object.
- any number of criteria may be used such as geographical proximity, printing capabilities, current print queue, fees (if any) for use of a particular three-dimensional printer 204 , and so forth. Where the user specifies criteria, this may similarly include any relevant aspects of three-dimensional printers 204 , and may permit use of absolute criteria (e.g., filters) or preferences, which may be weighted preferences or unweighted preferences, any of which may be used by a print server 208 to allocate a print job to a suitable resource.
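Allocating a print job against absolute criteria (filters) and weighted preferences, as described above, can be sketched as a filter-then-score pass over the available printers. All field names, weights, and the scoring formula are illustrative assumptions:

```python
def select_printer(printers, prefs):
    """Pick a printer by applying absolute criteria (hard filters) and
    then maximizing a weighted-preference score. Fields are hypothetical."""
    def score(p):
        if p["queue_depth"] > prefs.get("max_queue", float("inf")):
            return None  # filtered out by an absolute criterion
        s = prefs.get("w_proximity", 0.0) / (1.0 + p["distance_km"])
        s -= prefs.get("w_fee", 0.0) * p["fee"]
        return s
    candidates = [(score(p), p) for p in printers]
    candidates = [(s, p) for s, p in candidates if s is not None]
    return max(candidates, key=lambda sp: sp[0])[1] if candidates else None
```

Unweighted preferences fit the same scheme by giving every active preference the same weight.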
- the print server 208 may be configured to support interactive voice control of one of the printers 204 .
- the print server 208 may be configured to receive a voice signal (e.g., in digitized audio form) from a microphone or other audio input of the printer 204 , and to process the voice signal to extract relevant content such as a command for the printer.
- the voice signal may be further processed to extract additional context or relevant details.
- the voice signal may be processed to extract an object identifier that specifies an object for printing, e.g., by filename, file metadata, or semantic content.
- the voice signal may also be processed to extract a dimensional specification, such as a scale or absolute dimension for an object.
- the print server 208 may then generate suitable control signals for return to the printer 204 to cause the printer 204 to fabricate the object. Where an error or omission is detected, the print server 208 may return a request for clarification to the printer 204 , which may render the request in spoken form through a speaker, or within a user interface of the printer 204 or an associated device.
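Once the voice signal has been transcribed, extracting an object identifier and a dimensional specification amounts to parsing the transcript. The toy grammar below is a stand-in for real speech understanding; the command format and field names are assumptions:

```python
import re

def parse_print_command(transcript):
    """Extract an object identifier and optional scale from a recognized
    voice transcript. Returns None when clarification is needed."""
    m = re.match(
        r"print (?:a |an )?(?P<obj>[\w ]+?)"
        r"(?: at (?P<scale>\d+(?:\.\d+)?) percent)?$",
        transcript.strip().lower(),
    )
    if m is None:
        return None  # ambiguous: return a request for clarification
    scale = float(m.group("scale")) / 100.0 if m.group("scale") else 1.0
    return {"object": m.group("obj"), "scale": scale}
```

A `None` result corresponds to the error path described above, where the server returns a spoken or displayed request for clarification.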
- a print server 208 may store a user's preference on handling objects greater than a build volume of a printer. These preferences may control whether to resize the object, whether to break the object into multiple sub-objects for fabrication, and whether to transmit multiple sub-objects to a single printer or multiple printers.
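The stored preferences for oversized objects reduce to a small decision procedure: print as-is if the object fits, otherwise resize, split, or reject per the user's settings. The axis-aligned bounding-box comparison and preference keys here are simplifying assumptions:

```python
import math

def plan_oversized(obj_dims_mm, build_volume_mm, prefs):
    """Apply stored user preferences to an object that may exceed the
    build volume: ('print', 1.0), ('resize', factor), ('split', n), or
    ('reject', None). Bounding-box fit only; orientation is ignored."""
    factor = min(b / o for b, o in zip(build_volume_mm, obj_dims_mm))
    if factor >= 1.0:
        return ("print", 1.0)
    if prefs.get("resize"):
        return ("resize", factor)
    if prefs.get("split"):
        pieces = 1
        for b, o in zip(build_volume_mm, obj_dims_mm):
            pieces *= math.ceil(o / b)  # grid-split count per axis
        return ("split", pieces)
    return ("reject", None)
```

The sub-object count from the split branch is what would then be distributed to a single printer or across multiple printers.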
- user preferences or requirements may be stored, such as multi-color printing capability, build material options and capabilities, and so forth.
- a print queue (which may be a printer-specific or user-specific queue, and which may be hosted at a printer 204 , a server 208 , or some combination of these) may be managed by a print server 208 according to one or more criteria from a remote user requesting a print job.
- the print server 208 may also store user preferences or criteria for filtering content, e.g., for automatic printing or other handling.
- any criteria that can be used to identify models of potential interest by explicit type (e.g., labeled in model metadata), implicit type (e.g., determined based on analysis of the model), source, and so forth, may be provided to the print server 208 and used to automatically direct new content to one or more user-specified ones of the three-dimensional printers 204 .
- the print server 208 may usefully store user-specific data such as training for a voice recognition model.
- the print server 208 may also or instead store voice rendering data to use in generating spoken output by the printer 204 .
- This may, for example, include voice type data, voice model data, voice sample data, and so forth.
- a user may purchase or otherwise obtain a voice style (e.g., a celebrity voice or other personality) to render spoken commands and maintain the voice style on the print server 208 .
- the print server 208 may also or instead store data characterizing capabilities of the printer 204 so that voice commands received at the print server 208 can be analyzed for suitability, accuracy, and so forth according to the capabilities of the printer 204 from which the voice command was received.
- any data or processing for voice interaction that can be usefully stored or executed remotely from the printer 204 may be located at the print server 208 . It will be understood that any such data may also or instead be stored on a client device, a printer 204 , or some combination of these.
- the processor of the print server may be configured to store a plurality of print jobs submitted to the web server in a log and to provide an analysis of print activity based on the log.
- This may include any type of analysis that might be useful to participants in the environment 200 .
- the analysis may include tracking of the popularity of particular objects, or of particular content sources.
- the analysis may include tracking of which three-dimensional printers 204 are most popular or least popular, or related statistics such as the average backlog of pending print jobs at a number of the three-dimensional printers 204 .
- the analysis may include success of a particular printer in fabricating a particular model or of a particular printer in completing print jobs generally.
- any statistics or data may be obtained, and any analysis may be performed, that might be useful to users (e.g., when requesting prints), content sources (e.g., when choosing new printable objects for publication), providers of fabrication resources (e.g., when setting fees), or network facilitators such as the print servers 208 .
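The log-based analysis described above can be illustrated with two of the named statistics: object popularity and per-printer success rate. The log record layout is a hypothetical example:

```python
from collections import Counter

def analyze_log(log):
    """Summarize a print-job log: per-object popularity and per-printer
    success rate. Record fields ('object', 'printer', 'success') are
    illustrative, not the patent's schema."""
    popularity = Counter(job["object"] for job in log)
    totals, successes = Counter(), Counter()
    for job in log:
        totals[job["printer"]] += 1
        successes[job["printer"]] += 1 if job["success"] else 0
    success_rate = {p: successes[p] / totals[p] for p in totals}
    return popularity, success_rate
```

The same pass over the log extends naturally to backlog statistics such as average pending-queue depth per printer.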
- a print server 208 may also maintain a database 209 of content, along with an interface for users at client devices 206 to search the database 209 and request fabrication of objects in the database 209 using any of the three-dimensional printers 204 .
- a print server 208 (or any system including the print server 208 ) may include a database 209 of three-dimensional models, and the print server 208 may act as a server that provides a search engine for locating a particular three-dimensional model in the database 209 .
- the search engine may be a text-based search engine using keyword text queries, plain language queries, and so forth.
- the search engine may also or instead include an image-based search engine configured to identify three-dimensional models similar to a two-dimensional or three-dimensional image provided by a user.
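A keyword-text search over the model database can be sketched as scoring each model by term overlap with its title and tags. This is a toy ranking, not a production search engine, and the model record fields are assumptions:

```python
def search_models(query, models):
    """Rank three-dimensional models by keyword overlap between the query
    and each model's title and tags. Field names are hypothetical."""
    terms = set(query.lower().split())
    def score(model):
        text = set(model["title"].lower().split())
        text |= {t.lower() for t in model.get("tags", ())}
        return len(terms & text)
    return sorted((m for m in models if score(m) > 0), key=score, reverse=True)
```

A real deployment would replace the overlap score with an inverted index and relevance weighting, but the interface, query in and ranked models out, is the same.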
- the print server 208 may periodically search for suitable content at remote locations on the data network, which content may be retrieved to the database 209 , or have its remote location (e.g., a URL or other network location identifier) stored in the database 209 .
- the print server 208 may provide an interface for submission of objects from remote users, along with any suitable metadata such as a title, tags, creator information, descriptive narrative, pictures, recommended printer settings, and so forth.
- the database 209 may be manually curated according to any desired standards.
- printable objects in the database 209 may be manually or automatically annotated according to content type, popularity, editorial commentary, and so forth.
- the print server 208 may more generally provide a variety of management functions.
- the print server 208 may store a location of a predetermined alternative three-dimensional printer to execute a print job from a remote user in the event of a failure by the one of the plurality of three-dimensional printers 204 .
- the print server 208 may maintain exclusive control over at least one of the plurality of three-dimensional printers 204 , such that other users and/or print servers cannot control the printer.
- the print server 208 may submit a print job to a first available one of the plurality of three-dimensional printers 204 .
- a print server 208 may provide an interface for managing subscriptions to sources of content. This may include tools for searching existing subscriptions, locating or specifying new sources, subscribing to sources of content, and so forth.
- a print server 208 may manage subscriptions and automatically direct new content from these subscriptions to a three-dimensional printer 204 according to any user-specified criteria.
- while a three-dimensional printer 204 may autonomously subscribe to sources of content through a network interface and receive new content directly from such sources, it is also contemplated that this feature may be maintained through a remote resource such as a print server 208 .
- a print server 208 may maintain print queues for participating three-dimensional printers 204 . This approach may advantageously alleviate backlogs at individual printers 204 , which may have limited memory capacity for pending print jobs. More generally, a print server 208 may, by communicating with multiple three-dimensional printers 204 , obtain a view of utilization of multiple networked resources that permits a more efficient allocation of print jobs than would be possible through simple point-to-point communications among users and printers. Print queues may also be published by a print server 208 so that users can view pending queues for a variety of different three-dimensional printers 204 prior to selecting a resource for a print job. In one aspect, the print queue may be published as a number of print jobs and size of print jobs so that a requester can evaluate likely delays. In another aspect, the print queue may be published as an estimated time until a newly submitted print job can be initiated.
- the print queue of one of the print servers 208 may include one or more print jobs for one of the plurality of three-dimensional printers 204 .
- the print queue may be stored locally at the one of the plurality of three-dimensional printers.
- the print queue may be allocated between the database 209 and a local memory of the three-dimensional printer 204 .
- the print queue may be stored, for example, in the database 209 of the print server 208 .
- the term ‘print queue’ is intended to include print data (e.g., the three-dimensional model or tool instructions to fabricate an object) for a number of print jobs (which may be arranged for presentation in order of expected execution), as well as any metadata concerning print jobs.
- a portion of the print queue such as the metadata (e.g., size, status, time to completion) may be usefully communicated to a print server 208 for sharing among users while another portion of the print queue such as the model data may be stored at a printer in preparation for execution of a print job.
- Print queues may implement various user preferences on prioritization. For example, for a commercial enterprise, longer print jobs may be deferred for after normal hours of operation (e.g., after 5:00 p.m.), while shorter print jobs may be executed first if they can be completed before the end of a business day. In this manner, objects can be identified and fabricated from within the print queue in a manner that permits as many objects as possible to be fabricated before a predetermined closing time. Similarly, commercial providers of fabrication services may charge explicitly for prioritized fabrication, and implement this prioritization by prioritizing print queues in a corresponding fashion.
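Two of the queue behaviors above, publishing an estimated start time and deferring long jobs past closing, can be sketched directly. The shortest-job-first packing is one simple way to fit as many objects as possible before a deadline; the record layout is an assumption:

```python
def queue_eta_minutes(queue):
    """Estimated wait before a newly submitted job starts: the total
    duration of all jobs already pending ahead of it."""
    return sum(job["minutes"] for job in queue)

def prioritize_before_closing(queue, minutes_until_close):
    """Run the jobs that fit before closing first (shortest-first), so as
    many objects as possible finish in time; defer the rest until after."""
    fits, deferred, remaining = [], [], minutes_until_close
    for job in sorted(queue, key=lambda j: j["minutes"]):
        if job["minutes"] <= remaining:
            fits.append(job)
            remaining -= job["minutes"]
        else:
            deferred.append(job)
    return fits + deferred
```

Paid prioritization fits the same structure: sort by a purchased priority level first, then by duration within each level.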
- a print server 208 may provide a virtual workspace for a user.
- a user may search local or remote databases of printable objects, save objects of interest (or links thereto), manage pending prints, specify preferences for receiving status updates (e.g., by electronic mail or SMS text), manage subscriptions to content, search for new subscription sources, and so forth.
- the virtual workspace may be, or may include, web-based design tools or a web-based design interface that permits a user to create and modify models.
- the virtual workspace may be deployed on the web, while permitting direct fabrication of a model developed within that environment on a user-specified one of the three-dimensional printers 204 , thus enabling a web-based design environment that is directly coupled to one or more fabrication resources.
- the content sources 210 may include any sources of content for fabrication with a three-dimensional printer 204 .
- This may, for example, include databases of objects accessible through a web interface or application programming interface.
- This may also or instead include individual desktop computers or the like configured as a server for hosted access, or configured to operate as a peer in a peer-to-peer network.
- This may also or instead include content subscription services, which may be made available in an unrestricted fashion, or may be made available on a paid subscription basis, or on an authenticated basis based upon some other relationship (e.g., purchase of a related product or a ticket to an event).
- content providers may serve as content sources 210 as contemplated herein.
- the content sources 210 may include destinations such as amusement parks, museums, theaters, performance venues, or the like, any of which may provide content related to users who purchase tickets.
- the content sources 210 may include manufacturers such as automobile, computer, consumer electronics, or home appliance manufacturers, any of which may provide content related to upgrades, maintenance, repair, or other support of existing products that have been purchased.
- the content sources 210 may include artists or other creative enterprises that sell various works of interest.
- the content sources 210 may include engineering or architectural firms that provide marketing or advertising pieces to existing or prospective customers.
- the content sources 210 may include marketing or advertising firms that provide promotional items for clients. More generally, the content sources 210 may be any individual or enterprise that provides single or serial objects for fabrication by the three-dimensional printers 204 described herein.
- One or more web servers 211 may provide web-based access to and from any of the other participants in the environment 200 . While depicted as a separate network entity, it will be readily appreciated that a web server 211 may be logically or physically associated with one of the other devices described herein, and may, for example, provide a user interface for web access to one of the three-dimensional printers 204 , one of the print servers 208 (or databases 209 coupled thereto), one of the content sources 210 , or any of the other resources 216 described below in a manner that permits user interaction through the data network 202 , e.g., from a client device 206 or mobile device 212 .
- the mobile devices 212 may be any form of mobile device, such as any wireless, battery-powered device, that might be used to interact with the networked printing environment 200 .
- the mobile devices 212 may, for example, include laptop computers, tablets, thin client network computers, portable digital assistants, messaging devices, cellular phones, smart phones, portable media or entertainment devices, and so forth.
- mobile devices 212 may be operated by users for a variety of user-oriented functions such as to locate printable objects, to submit objects for printing, to monitor a personally owned printer, and/or to monitor a pending print job.
- a mobile device 212 may include location awareness technology such as Global Positioning System (“GPS”), which may obtain information that can be usefully integrated into a printing operation in a variety of ways.
- a user may select an object for printing and submit a model of the object to a print server, such as any of the print servers described above.
- the print server may determine a location of the mobile device 212 initiating the print job and locate a closest printer for fabrication of the object.
- a printing function may be location-based, using the GPS input (or cellular network triangulation, proximity detection, or any other suitable location detection techniques). For example, a user may be authorized to print a model only when the user is near a location (e.g., within a geo-fenced area or otherwise proximal to a location), or only after a user has visited a location. Thus a user may be provided with printable content based upon locations that the user has visited, or while within a certain venue such as an amusement park, museum, theater, sports arena, hotel, or the like. Similarly, a matrix barcode such as a QR code may be employed for localization.
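The geo-fenced authorization described above can be sketched as a great-circle distance test between the device's GPS fix and a venue's coordinates. This is a hypothetical illustration; the venue coordinates and the 200-meter radius below are assumed values, not part of the disclosure.

```python
import math

EARTH_RADIUS_M = 6_371_000.0

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two latitude/longitude points."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2)
    return 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))

def may_print(user_fix, venue, radius_m):
    """True when the user's GPS fix lies inside the venue's geo-fence."""
    return haversine_m(*user_fix, *venue) <= radius_m
```

A fix a block away from the venue would authorize printing, while a fix several kilometers away would not.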
- the other resources 216 may include any other software or hardware resources that may be usefully employed in networked printing applications as contemplated herein.
- the other resources 216 may include payment processing servers or platforms used to authorize payment for content subscriptions, content purchases, or printing resources.
- the other resources 216 may include social networking platforms that may be used, e.g., to share three-dimensional models and/or fabrication results according to a user's social graph.
- the other resources 216 may include certificate servers or other security resources for third party verification of identity, encryption or decryption of three-dimensional models, and so forth.
- the other resources 216 may include online tools for three-dimensional design or modeling, as well as databases of objects, surface textures, build supplies, and so forth.
- the other resources 216 may include a desktop computer or the like co-located (e.g., on the same local area network with, or directly coupled to through a serial or USB cable) with one of the three-dimensional printers 204 .
- the other resource 216 may provide supplemental functions for the three-dimensional printer 204 in a networked printing context such as maintaining a print queue or operating a web server for remote interaction with the three-dimensional printer 204 .
- Other resources 216 also include supplemental resources such as three-dimensional scanners, cameras, and post-processing/finishing machines or resources. More generally, any resource that might be usefully integrated into a networked printing environment may be one of the resources 216 as contemplated herein.
- a networked computer with a print server and a web interface to support networked three-dimensional printing.
- This device may include a print server, a database, and a web server as discussed above.
- the print server may be coupled through a data network to a plurality of three-dimensional printers and configured to receive status information from one or more sensors for each one of the plurality of three-dimensional printers.
- the print server may be further configured to manage a print queue for each one of the plurality of three-dimensional printers.
- the database may be coupled in a communicating relationship with the print server and configured to store print queue data and status information for each one of the plurality of three-dimensional printers.
- the web server may be configured to provide a user interface over the data network to a remote user, the user interface adapted to present the status information and the print queue data for one or more of the plurality of three-dimensional printers to the user and the user interface adapted to receive a print job from the remote user for one of the plurality of three-dimensional printers.
- the three-dimensional printer 204 described above may be configured to autonomously subscribe to syndicated content sources and periodically receive and print objects from those sources.
- a device including any of the three-dimensional printers described above; a network interface; and a processor (which may without limitation include the controller for the printer).
- the processor may be configured to subscribe to a plurality of sources of content (such as the content sources 210 described above) selected by a user for fabrication by the three-dimensional printer through the network interface.
- the processor may be further configured to receive one or more three-dimensional models from the plurality of content sources 210 , and to select one of the one or more three-dimensional models for fabrication by the three-dimensional printer 204 according to a user preference for prioritization.
- the user preference may, for example, preferentially prioritize particular content sources 210 , or particular types of content (e.g., tools, games, artwork, upgrade parts, or content related to a particular interest of the user).
- the memory of a three-dimensional printer 204 may be configured to store a queue of one or more additional three-dimensional models not selected for immediate fabrication.
- the processor may be programmed to periodically re-order or otherwise alter the queue according to pre-determined criteria or manual user input. For example, the processor may be configured to evaluate a new three-dimensional model based upon a user preference for prioritization, and to place the new three-dimensional model at a corresponding position in the queue.
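The queue placement described above — evaluating a new model against a user preference and inserting it at a corresponding position — can be sketched with a sorted insert. This is a hypothetical illustration; the content types and their priority weights are assumed values.

```python
import bisect

# Assumed mapping from content type to user-preference priority (higher = sooner).
SOURCE_PRIORITY = {"upgrade parts": 3, "tools": 2, "promotional": 1}

class PrintQueue:
    def __init__(self):
        self._jobs = []  # kept sorted, highest priority first

    def add(self, name, content_type):
        priority = SOURCE_PRIORITY.get(content_type, 0)
        # bisect on negated priorities keeps the queue ordered high-to-low;
        # bisect_right preserves first-in-first-out order within a priority level.
        keys = [-p for p, _ in self._jobs]
        i = bisect.bisect_right(keys, -priority)
        self._jobs.insert(i, (priority, name))

    def order(self):
        return [name for _, name in self._jobs]
```

A newly arrived upgrade part would thus jump ahead of queued tools and promotional items without displacing equal-priority jobs already waiting.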
- the processor may also or instead be configured to retrieve content from one of the content sources 210 by providing authorization credentials for the user, which may be stored at the three-dimensional printer or otherwise accessible for presentation to the content source 210 .
- the processor may be configured to retrieve content from at least one of the plurality of content sources 210 by authorizing a payment from the user to a content provider.
- the processor may be configured to search a second group of sources of content (such as any of the content sources 210 described above) according to one or more search criteria provided by a user. The search criteria may also or instead include demographic information for the user, contextual information for the user, or any other implicit or explicit user information.
- the system may include a web server configured to provide a user interface over a data network, which user interface is adapted to receive user preferences from a user including a subscription to a plurality of sources of a plurality of three-dimensional models, a prioritization of content from the plurality of sources, and an identification of one or more fabrication resources coupled to the data network and suitable for fabricating objects from the plurality of three-dimensional models.
- the system may also include a database to store the user preferences, and to receive and store the plurality of three-dimensional models as they are issued by the plurality of sources.
- the system may include a processor (e.g., of a print server 208 , or alternatively of a client device 206 interacting with the print server 208 ) configured to select one of the plurality of three-dimensional models for fabrication based upon the prioritization.
- the system may include a print server configured to communicate with the one or more fabrication resources through the data network, to determine an availability of the one or more fabrication resources, and to transmit the selected one of the plurality of three-dimensional models to one of the one or more fabrication resources.
- a network of three-dimensional printing resources comprising a plurality of three-dimensional printers, each one of the plurality of three-dimensional printers including a network interface; a server configured to manage execution of a plurality of print jobs by the plurality of three-dimensional printers; and a data network that couples the server and the plurality of three-dimensional printers in a communicating relationship.
- the server may include a web-based user interface configured for a user to submit a new print job to the server and to monitor progress of the new print job.
- the web-based user interface may permit video monitoring of each one of the plurality of three-dimensional printers, or otherwise provide information useful to a remote user including image-based, simulation-based, textual-based or other information concerning status of a current print.
- the web-based user interface may include voice input and/or output for network-based voice control of a printer.
- the fabrication resources may, for example, include any of the three-dimensional printers 204 described above.
- One or more of the fabrication resources may be a private fabrication resource secured with a credential-based access system.
- the user may provide, as a user preference and prior to use of the private fabrication resource, credentials for accessing the private fabrication resource.
- the one or more fabrication resources may include a commercial fabrication resource. In this case the user may provide an authorization to pay for use of the commercial fabrication resource in the form of a user preference prior to use of the commercial fabrication resource.
- prioritizing content may be particularly important to prevent crowding out of limited fabrication resources with low priority content that arrives periodically for autonomous fabrication.
- the systems and methods described herein permit prioritization using a variety of user-specified criteria, and permit use of multiple fabrication resources in appropriate circumstances.
- prioritizing content as contemplated herein may include any useful form of prioritization. For example, this may include prioritizing the content according to source.
- the content sources 210 may have an explicit type that specifies the nature of the source (e.g., commercial or paid content, promotional content, product support content, non-commercial) or the type of content provided (e.g., automotive, consumer electronics, radio control hobbyist, contest prizes, and so forth).
- Prioritizing content may include prioritizing the content according to this type.
- the three-dimensional models themselves may also or instead include a type (e.g., tool, game, home, art, jewelry, replacement part, upgrade part, etc.) or any other metadata, and prioritizing the content may include prioritizing the content according to this type and/or metadata.
- the processor may be configured to select two or more of the plurality of three-dimensional models for concurrent fabrication by two or more of the plurality of fabrication resources based upon the prioritization when a priority of the two or more of the plurality of three-dimensional models exceeds a predetermined threshold. That is, where particular models individually have a priority above the predetermined threshold, multiple fabrication resources may be located and employed to fabricate these models concurrently.
- the predetermined threshold may be evaluated for each model individually, or for all of the models collectively such as on an aggregate or average basis.
- the processor may be configured to adjust prioritization based upon a history of fabrication when a number of objects fabricated from one of the plurality of sources exceeds a predetermined threshold.
- a user may limit the number of objects fabricated from a particular source, giving subsequent priority to content from other sources regardless of an objectively determined priority for a new object from the particular source. This prevents a single source from overwhelming a single fabrication resource, such as a personal three-dimensional printer operated by the user, in a manner that crowds out other content from other sources of possible interest.
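The source-limiting behavior described above can be sketched as an effective-priority function that deprioritizes a source once its fabrication history reaches a cap. This is a hypothetical illustration; the cap of five objects is an assumed value.

```python
def effective_priority(priority, source, history, cap=5):
    """history: dict mapping source -> number of objects already fabricated.
    Once a source hits the cap, its new objects fall behind other sources
    regardless of their objectively determined priority."""
    if history.get(source, 0) >= cap:
        return 0  # crowded-out protection: yield to other subscriptions
    return priority
```

A prolific source that has already had five objects fabricated thus loses out to lower-priority content from other sources until the user intervenes.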
- this may enable content sources 210 to publish on any convenient schedule, without regard to whether and how subscribers will be able to fabricate objects.
- the processor may be configured to identify one or more additional sources of content based upon a similarity to one of the plurality of sources of content. For example, where a content source 210 is an automotive manufacturer, the processor may perform a search for other automotive manufacturers, related parts suppliers, mechanics, and so forth.
- the processor may also or instead be configured to identify one or more additional sources of content based upon a social graph of the user. This may, for example, include analyzing a social graph of relationships from the user to identify groups with common interests, shared professions, a shared history of schools or places of employment, or a common current or previous residence location, any of which may be used to locate other sources of content that may be of interest to the user.
- FIG. 3 shows a mobile device with an accessory for three-dimensional imaging.
- the mobile device 302 may include any suitable mobile device such as a cellular phone, media player, tablet, laptop computer or the like.
- the mobile device may include a processor 304 such as a microprocessor, microcontroller, or other processing circuitry that controls operation of the mobile device, provides a user interface, and so forth.
- the mobile device 302 may include a first camera 306 and a second camera 308 operable by the mobile device 302 to capture still images or video, or to support live video-based communications.
- the first camera 306 may be a forward facing camera and the second camera 308 may be a rear facing camera (or vice versa).
- a user can take pictures of objects in front of the user with a first camera 306 facing away from the user, or can take a picture of himself or herself, or of other items facing the user, using the second camera 308 of the mobile device 302 .
- the processor 304 may include processing circuitry configured to obtain three-dimensional data from one or more images obtained by the first camera 306 and/or the second camera 308 .
- a wide array of image-based techniques for three-dimensional reconstruction are known in the art, and may be suitably adapted for use with the systems and methods contemplated herein.
- the processor 304 may apply shape-from-motion techniques to a sequence of images captured from either or both of the first camera 306 and the second camera 308 .
- the first camera 306 and the second camera 308 may be controlled to capture images concurrently, or substantially concurrently, and the two images from offset poses may be processed as a stereoscopic image pair to extract three-dimensional features.
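The stereoscopic processing described above ultimately rests on triangulation: for a rectified image pair from two offset poses, a feature's depth is Z = f·B/d, where f is the focal length in pixels, B the baseline between the camera centers, and d the pixel disparity of the matched feature. The sketch below is a minimal illustration of that relationship; the focal length, baseline, and disparity values in the usage note are assumed, not drawn from the disclosure.

```python
def depth_from_disparity(focal_px, baseline_m, disparity_px):
    """Triangulated depth (meters) of a feature matched across a rectified
    stereo pair: Z = f * B / d. Larger disparity means a nearer feature."""
    if disparity_px <= 0:
        raise ValueError("matched feature must have positive disparity")
    return focal_px * baseline_m / disparity_px
```

For example, with an assumed 800-pixel focal length and a 6 cm baseline, a 40-pixel disparity corresponds to a depth of about 1.2 m, and doubling the disparity halves the depth.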
- the housing may include a structured light source, or other light source, that illuminates an object with features that can be recognized when projected onto the object and processed to recover three-dimensional data.
- the housing 310 may include independent processing circuitry for three-dimensional imaging, which may receive images from a camera of the mobile device 302 and process such images either alone or in combination with images from a camera of the housing 310 to obtain three-dimensional data. All such variations are intended to fall within the scope of this disclosure.
- An accessory may include a housing 310 with a mechanical interface 312 configured to removably and replaceably attach to a predetermined mobile computing device such as the mobile device 302 in a predetermined orientation. While this mechanical interface 312 is illustrated in the cross-section of FIG. 3 as a flanged edge that encloses sides of the mobile device 302 along its perimeter, it will be appreciated that mobile devices 302 may have a variety of shapes and sizes, and a variety of mechanical interfaces may readily be devised to removably and replaceably secure the housing 310 to the mobile device 302 such that the mobile device 302 and the housing 310 are in a predetermined orientation relative to one another.
- the housing 310 may be fashioned of a flexible material that permits the housing 310 to be elastically bent around the edges of the mobile device 302 , or the mechanical interface 312 may include hinged, spring-loaded, or sliding latches that are manually secured about the edges of the mobile device 302 .
- a lens 314 on the housing 310 may as a result be fixed in a predetermined location and orientation relative to a lens of the first camera 306 in order to provide a field of view from a predetermined pose relative to the first camera 306 of the mobile device 302 when the mobile device 302 is positioned within the housing 310 .
- An optical train 316 may be further provided that optically couples the second camera 308 of the mobile device 302 to the lens 314 .
- the first camera 306 and the second camera 308 both capture forward-facing images from offset poses, thus providing a stereoscopic perspective on a field of view for the combined device 320 .
- the optical train may include any of a variety of optical components such as mirrors, fiber optics, intermediate lenses, and so forth to suitably couple the lens 314 to the second camera 308 for image acquisition.
- FIG. 4 is a functional block diagram of an accessory coupled to a mobile device.
- the accessory 402 and the mobile device 404 may share various components for a three-dimensional imaging system, and may further share processing resources and/or be coordinated through a communications interface to cooperate in a three-dimensional imaging process.
- the accessory 402 may include a communication interface 406 configured for data communication between the accessory 402 and the mobile device 404 , with a complementary communication interface 408 on the mobile device 404 .
- the communication interface 406 may include hardware and/or software for any suitable wireless communication interface using, e.g., any 802.11 wireless protocol, Bluetooth, or any standardized or proprietary short-range wireless communications protocol based upon, e.g., radio frequency, optical, acoustic, or other suitable communication medium.
- the communication interface may include a wired communication interface that couples to a data port of the predetermined mobile computing device when the housing is attached to the predetermined mobile computing device.
- the communication interface 408 of the mobile device may include a data port configured for wired data communications.
- Contemporary mobile devices include numerous suitable physical ports, including without limitation standardized ports such as USB connectors, micro-USB connectors, two- or three-ring plugs, and so forth, any of which may be adapted for use as a data port as contemplated herein.
- many devices include proprietary arrangements of plugs, contacts, and the like for docking stations and recharging that may be adapted for use as a physical data port.
- the communication interface 406 of the accessory 402 may include a sensor to detect an action of the mobile computing device. This may include a data input such as a trigger, dedicated pin, or the like in the communication interface 406 . In another aspect, this may include a sensor independent of the communications circuitry that couples the accessory 402 to the mobile device 404 .
- the sensor may include a sensor that detects a sound, a vibration, an illumination, or the like. In this manner, the mobile device 404 may signal the accessory 402 independent of a data communication link using any action that is (a) within the capabilities of the mobile device 404 , and (b) detectable by the accessory 402 .
- a processor of the mobile device may transmit a signal through the communication interfaces 408 , 406 , or signal with a vibration or a beep, to cause the accessory 402 to concurrently capture a picture with its camera, or to otherwise operate a shutter, illumination source, or the like concurrently with the image capture by the mobile device.
- the action itself may include an autofocus, zoom, light meter reading, or other action related to image capture that can include a corresponding data signal to the accessory.
- the accessory 402 may also or instead include a camera 412 separate from the mobile device 404 .
- the accessory 402 may support three-dimensional processing in one respect by providing a supplemental camera to capture an image concurrently with or otherwise in addition to one or more images from the mobile device 404 for use in three-dimensional processing.
- a system 400 described herein includes a camera 412 within a housing of an accessory 402 .
- An image from the camera 412 may be transmitted to the mobile device 404 through the communication interfaces 406 , 408 for the mobile device 404 to perform three-dimensional processing tasks with a processor 410 or other processing circuitry of the mobile device 404 .
- the accessory may also or instead include a processor 414 or other processing circuitry to support three-dimensional imaging/processing.
- the processor 414 of the accessory 402 may receive image data from the mobile device 404 , e.g., through the communication interface 406 , and process the image data along with data obtained from the camera 412 of the accessory to obtain three-dimensional data.
- the processor 414 may also control operation of the camera 412 , and/or may control operation of a camera of the mobile device 404 , e.g., by communication with the mobile device 404 through the communication interfaces 406 , 408 .
- the processor 410 of the mobile device 404 may be configured to receive a first image from the camera and a second image through the communication interface from the second camera, and further configured to process the first image and the second image to obtain three-dimensional data from an overlapping field of view of the camera and the second camera.
- processing circuitry for three-dimensional processing may be contained within the mobile device 404 , such as the processing circuitry 410 depicted in FIG. 4 , which may be programmed for appropriate three-dimensional processing tasks, or within the accessory 402 (e.g., processor 414 ), or some combination of these.
- a system 400 contemplated herein includes processing circuitry on a mobile device configured to obtain substantially concurrent images from a camera and a second camera of the mobile device.
- the system 400 may also or instead include processing circuitry on the mobile device configured to process such substantially concurrent images to obtain three-dimensional data from an overlapping region of the field of view of the first camera (e.g., through the optical train) and the second field of view of the second camera.
- the processor 414 of the accessory 402 may include, or be associated with, a memory that, among other things, stores a unique identifier for the accessory 402 . This may be used to identify a user of the accessory 402 so that, for example, when a three-dimensional image is acquired using the accessory 402 and a mobile device 404 , the three-dimensional image may be transmitted to a print server or other networked three-dimensional printing resource such as any of those described above, which may in turn automatically associate the three-dimensional image with a particular user. This may be particularly useful where, for example, the mobile device 404 includes a data network connection for Internet access to such a remote resource.
- a user of the accessory may have three-dimensional images automatically uploaded to the remote resource where they can be available for printing or other manipulation by the user.
- a user interface may be provided on the mobile device 404 or on the accessory 402 for a user to authorize transmission of a captured three-dimensional image with the unique identifier to a remote resource for storage and subsequent retrieval/use.
- the user interface (again, on either the accessory 402 or the mobile device 404 ) may provide a “print this now” button or the like so that a physical reproduction of the three-dimensional image, or a further-processed version of the image, can be immediately queued for fabrication.
- the other networked printing resources, systems and methods described above may be used in various combinations to further process and/or reproduce such images on a three-dimensional printer, either automatically or under user control.
- the components of the accessory 402 may be powered by a power source of the mobile device 404 , with power delivered to the accessory 402 through the same electromechanical interface, e.g., a USB or plug connector, that supports data communications.
- the accessory 402 may also or instead include a power source 416 independent of the mobile device for autonomous operation.
- This power source 416 , which may be a battery or the like, may support operation of the camera 412 and the processing circuitry 414 of the accessory 402 , and may also provide supplemental power to the mobile device 404 , which may be particularly useful, for example, where data acquisition and three-dimensional processing would otherwise tend to tax a power supply of a mobile device to premature depletion.
- the accessory 402 may also include any other hardware 418 complementary to the intended use(s) of the accessory 402 .
- this may include memory such as a removable storage device (e.g., memory card, USB drive, or the like) or internal memory for storing image data and/or processed three-dimensional data.
- the accessory 402 may also include processing circuitry adapted to convert acquired three-dimensional data into a suitable form.
- the processing circuitry may convert raw point cloud or polygonal data into an STL format for use by a three-dimensional printer, or into a CAD file of any suitable format for further processing.
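The conversion from polygonal data into an STL format can be sketched as follows: each triangle is emitted as an ASCII STL facet whose normal is computed from the vertex winding. This is a hypothetical illustration of the ASCII STL layout; the function and solid names are assumed.

```python
def _normal(a, b, c):
    """Unit normal of a triangle from its vertex winding (cross product)."""
    u = [b[i] - a[i] for i in range(3)]
    v = [c[i] - a[i] for i in range(3)]
    n = [u[1] * v[2] - u[2] * v[1],
         u[2] * v[0] - u[0] * v[2],
         u[0] * v[1] - u[1] * v[0]]
    length = sum(x * x for x in n) ** 0.5 or 1.0  # guard degenerate triangles
    return [x / length for x in n]

def triangles_to_stl(triangles, name="scan"):
    """Serialize triangles (triples of (x, y, z) vertices) as ASCII STL text."""
    lines = ["solid %s" % name]
    for a, b, c in triangles:
        lines.append("  facet normal %g %g %g" % tuple(_normal(a, b, c)))
        lines.append("    outer loop")
        for vertex in (a, b, c):
            lines.append("      vertex %g %g %g" % tuple(vertex))
        lines.append("    endloop")
        lines.append("  endfacet")
    lines.append("endsolid %s" % name)
    return "\n".join(lines)
```

A single triangle in the x-y plane produces one facet with normal (0, 0, 1), ready to be saved with an .stl extension and handed to a slicer.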
- the other hardware 418 may include local or cellular wireless communications capabilities for connecting the accessory 402 to remote resources such as a three-dimensional printer, print server, desktop computer, or other device or combination of devices useful for processing and management of printable content as contemplated herein.
- FIG. 5 shows a three-dimensional imaging system with dual optical paths.
- An accessory 502 coupled to a mobile device 504 may in general include an optical train to direct optical paths to cameras of the accessory 502 and/or mobile device 504 .
- a first optical path 506 within the accessory 502 may direct an image from a lens of the accessory 502 to a first camera 508 of the mobile device 504 .
- a second optical path 510 within the accessory 502 may direct an image from another lens or opening of the accessory 502 to a second camera 512 of the mobile device 504 .
- first optical path 506 provides an optical coupling to a first field of view (indicated generally by an arrow 520 )
- second optical path 510 may provide an optical coupling to a second field of view (indicated generally by a second arrow 522 ) different from the first field of view.
- stereoscopic imaging or other three-dimensional imaging techniques based upon image differentiation may be employed with multiple cameras of the mobile device 504 .
- Each optical path 506 , 510 may include independent optical components, which may be fixed optics such as transfer lenses or fiber optics, and/or controllable optics such as shutters, apertures, or the like to supplement imaging functions (i.e., sampling, shutter speed, etc.) of the mobile device 504 .
- the first optical path 506 may be readily omitted where the camera 508 has a field of view that can be overlapped with the second field of view.
- the second optical path 510 may be included to provide supplemental optics such as focusing lenses, scaling lenses, a controllable shutter or aperture, and so forth.
- the accessory 502 may optionally include a supplemental light source 528 positioned to illuminate the field of view and/or the second field of view.
- the supplemental light source 528 may be a strobe, flash, high-intensity light, or other light source useful for photographic illumination.
- the supplemental light source 528 may also or instead include a structured light source that can provide illumination using predetermined patterns of light that can be imaged and processed to derive three-dimensional data.
- the supplemental light source 528 may serve as an illumination source to illuminate a field of view of one of the cameras 508 , 512 from a predetermined pose for any of a variety of optically-based imaging techniques.
- the illumination source may be a structured light source that projects a predetermined pattern of light from the predetermined pose.
- the predetermined pattern of light may include one or more lines or shapes, which may be created with a lens, filter, or other suitable optics, or with a controllable or steerable light source such as a laser and corresponding hardware.
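One common family of predetermined patterns is Gray-code striping, in which each projector column receives a unique bit sequence across a series of frames; decoding the sequence seen at a camera pixel identifies the column for triangulation. This is offered as an illustrative sketch of the idea, not a technique the disclosure specifies.

```python
# Gray-code structured light: each projector column index is encoded so that
# adjacent columns differ in exactly one bit across the striped frames,
# making the decoded column robust to single-frame errors at stripe edges.

def gray_encode(column):
    """Gray code for a projector column index."""
    return column ^ (column >> 1)

def gray_decode(code):
    """Invert the Gray code back to the column index."""
    column = 0
    while code:
        column ^= code
        code >>= 1
    return column

# Every column index survives a round trip through the pattern encoding:
assert all(gray_decode(gray_encode(c)) == c for c in range(1024))
```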
- a fixture 530, such as a lens, mirrors, or other mechanical and/or optical beam steering elements, may be provided to move the illumination source in a predetermined pattern.
- the illumination source may, for example, include a laser light source, a light emitting diode, an incandescent light source, or combinations of the foregoing.
- the supplemental light source 528 may include a plurality of illumination sources coupled to the housing, each one of the plurality of illumination sources having a different pose relative to the mobile computing device. In this manner, any reconstruction technique based upon directional lighting and/or different patterns of light may be usefully implemented using a number of separately controllable illumination sources coupled to the housing of the accessory 502.
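Photometric stereo is one such directional-lighting reconstruction: under a Lambertian model, the intensity at a pixel under each known light direction gives a linear system whose solution is the surface normal (scaled by albedo). The sketch below assumes three lights with known directions; all names and numbers are illustrative, not taken from the disclosure.

```python
# Photometric stereo with three separately controllable lights: solve the
# 3x3 Lambertian system L n = I for the surface normal at a pixel.

def solve3(L, I):
    """Solve the 3x3 linear system L x = I by Cramer's rule."""
    def det(m):
        return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
                - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
                + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))
    d = det(L)
    x = []
    for j in range(3):
        M = [row[:] for row in L]      # replace column j with I
        for i in range(3):
            M[i][j] = I[i]
        x.append(det(M) / d)
    return x

def surface_normal(light_dirs, intensities):
    """Unit surface normal at a pixel from three lights and intensities."""
    g = solve3(light_dirs, intensities)
    norm = sum(c * c for c in g) ** 0.5
    return [c / norm for c in g]

# Intensities consistent with a surface facing straight up the z-axis:
lights = [[1, 0, 1], [0, 1, 1], [-1, 0, 1]]
print(surface_normal(lights, [1.0, 1.0, 1.0]))  # [0.0, 0.0, 1.0]
```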
- FIG. 6 shows a three-dimensional imaging system with dual optical paths.
- a first optical path 602 of an accessory 600 optically couples a camera 604 of a mobile device 605 to a first lens 606
- a second optical path 608 optically couples the camera 604 to a second lens 610 that provides a pose that is offset from the first lens 606 .
- An optical switch 612, such as a moveable mirror, a surface with controllable reflectivity, controllable mirrors and apertures, or any other hardware that can controllably select between the optical paths 602, 608, may be provided so that the accessory 600 can controllably direct the camera 604 toward the first lens 606 or the second lens 610.
- a processor on the mobile device may, for example, be configured (e.g., by programming) to control the camera 604 and the optical switch 612 to capture temporally adjacent images from a field of view of the first lens 606 and a second field of view of the second lens 610 with the camera 604 .
- the processor may be configured to process the temporally adjacent images to obtain three-dimensional data from an overlapping region of the field of view and the second field of view.
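The capture sequence described above can be sketched as a control loop that flips the switch between exposures. The `OpticalSwitch` and `Camera` classes below are hypothetical stand-ins; the disclosure does not specify either API.

```python
# Capturing temporally adjacent images through the two lens poses by
# toggling the optical switch between exposures. Interfaces are mocked.

class OpticalSwitch:
    """Hypothetical stand-in for the accessory's switchable optics."""
    def __init__(self):
        self.path = 1
    def select(self, path):
        self.path = path

class Camera:
    """Hypothetical stand-in for the mobile device camera."""
    def capture(self, switch):
        # Returns a tag standing in for an image frame from the active path.
        return f"frame_via_path_{switch.path}"

def capture_stereo_pair(camera, switch):
    """Capture back-to-back images, one through each optical path."""
    switch.select(1)
    first = camera.capture(switch)
    switch.select(2)           # flip mirror/aperture to the offset lens
    second = camera.capture(switch)
    return first, second

print(capture_stereo_pair(Camera(), OpticalSwitch()))
# ('frame_via_path_1', 'frame_via_path_2')
```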
- FIG. 7 shows a method for using an accessory to capture three-dimensional images with a mobile device.
- the method 700 may begin with attaching an accessory such as any of the accessories described above to a mobile device.
- the accessory may be removably and replaceably attachable to the mobile device, and may include a camera and a communication interface for communications with the mobile device.
- the method 700 may include capturing a first image with the camera.
- the method 700 may include capturing a second image substantially concurrently with the first image using a second camera of the mobile device, wherein the second camera has an overlapping field of view with the camera.
- the method 700 may include transmitting the first image to the mobile device through the communication interface.
- this step may include transmitting the second image to the accessory, where subsequent three-dimensional processing is performed on a processor of the accessory.
- the method may include processing the first image and the second image on the mobile device to obtain three-dimensional data from the overlapping field of view.
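The processing step can be illustrated with a toy block-matching pass: the two "images" below are single rows of brightness values, and the disparity is the shift that best aligns them, which is the first step toward three-dimensional data. This is purely illustrative; real pipelines operate on full calibrated images.

```python
# Toy disparity estimation between two horizontally offset views by
# minimizing the sum of absolute differences over candidate shifts.

def best_disparity(left, right, max_d=8):
    """Shift (in pixels) minimizing SAD between the two rows."""
    def sad(d):
        pairs = zip(left[d:], right[:len(right) - d])
        return sum(abs(a - b) for a, b in pairs)
    return min(range(max_d + 1), key=sad)

# The right view sees the same bright patch shifted left by 3 pixels:
left  = [0, 0, 0, 9, 9, 9, 0, 0, 0, 0, 0, 0]
right = [9, 9, 9, 0, 0, 0, 0, 0, 0, 0, 0, 0]
print(best_disparity(left, right))  # 3
```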
- FIG. 8 shows a method for using an accessory to capture three-dimensional images with a mobile device.
- an accessory may provide multiple optical paths, along with an optical switch that can be controlled to selectively expose a camera of a mobile device to different poses relative to an object in a field of view.
- the two (or more) resulting images may be processed to extract three-dimensional data.
- the method 800 may include attaching an accessory to a mobile device having a camera.
- the accessory may be removably and replaceably attachable to the mobile device, and the accessory may include an optical train with a first optical path for the camera to a first field of view and a second optical path for the camera to a second field of view having an overlapping field of view with the first field of view.
- the accessory may include an optical switch configured to selectively switch between the respective optical paths.
- the method 800 may include selecting the first optical path, such as by controlling the optical switch accordingly with a control signal from processing circuitry of the accessory.
- the method 800 may include capturing a first image with the camera, such as through the selected first optical path.
- the method 800 may include selecting the second optical path, such as by controlling the optical switch accordingly with a control signal from the processing circuitry of the accessory.
- the method 800 may include capturing a second image with the camera, e.g., through the selected second optical path.
- the method 800 may include processing the first image and the second image on the mobile device to obtain three-dimensional data from the overlapping field of view provided by the two optical paths.
- FIG. 9 shows a method for using an accessory to capture three-dimensional images with a mobile device.
- an accessory may provide two optical paths for two different cameras of the mobile device, which optical paths may serve to direct the two cameras toward overlapping fields of view of an object.
- images may be captured using the two cameras and provided to a processor for extraction of three-dimensional data.
- the two cameras may advantageously be operated concurrently or substantially concurrently in order to avoid temporally-based changes in a shape or position of the object that might otherwise require additional processing for accurate extraction of three-dimensional data.
- the method 900 may include attaching an accessory to a mobile device.
- the accessory may be removably and replaceably attachable to the mobile device as described above, and the accessory may include an optical train with a first optical path from a first camera of the mobile device to a first field of view and a second optical path from a second camera of the mobile device to a second field of view having an overlapping field of view with the first field of view.
- the method may include capturing a first image with the first camera.
- the method may include capturing a second image with the second camera.
- the second image may be captured substantially concurrently with the first image. That is, the first image and the second image may be captured sufficiently close in time to prevent substantial movement of an object within the overlapping field of view relative to the cameras.
- this may include operating the first camera and the second camera concurrently, or as close to concurrently as possible based upon the hardware and processing capabilities of the mobile device.
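How close in time is "substantially concurrent" can be bounded roughly: an object at depth Z moving at speed v shifts by about f·v·dt/Z pixels between exposures, so keeping apparent motion under a pixel budget requires dt ≤ max_px·Z/(f·v). The function and values below are illustrative assumptions, not figures from the disclosure.

```python
# Largest inter-frame delay keeping apparent motion under max_px pixels.

def max_interframe_delay(max_px, depth_mm, focal_px, speed_mm_s):
    """Capture gap (seconds) bounding image-plane motion at max_px pixels."""
    return max_px * depth_mm / (focal_px * speed_mm_s)

# Sub-pixel motion for a hand-held object 500 mm away drifting at 10 mm/s
# with an 800 px focal length:
print(max_interframe_delay(1.0, 500.0, 800.0, 10.0))  # 0.0625 s
```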
- the method may include processing the first image and the second image on the mobile device to obtain three-dimensional data from the overlapping field of view.
- the methods or processes described above, and steps thereof, may be realized in hardware, software, or any combination of these suitable for a particular application.
- the hardware may include a general-purpose computer and/or dedicated computing device.
- the processes may be realized in one or more microprocessors, microcontrollers, embedded microcontrollers, programmable digital signal processors, or other programmable device, along with internal and/or external memory.
- the processes may also, or instead, be embodied in an application specific integrated circuit, a programmable gate array, programmable array logic, or any other device or combination of devices that may be configured to process electronic signals.
- one or more of the processes may be realized as computer executable code created using a structured programming language such as C, an object oriented programming language such as C++, or any other high-level or low-level programming language (including assembly languages, hardware description languages, and database programming languages and technologies) that may be stored, compiled or interpreted to run on one of the above devices, as well as heterogeneous combinations of processors, processor architectures, or combinations of different hardware and software.
- each method described above and combinations thereof may be embodied in computer executable code that, when executing on one or more computing devices, performs the steps thereof.
- the methods may be embodied in systems that perform the steps thereof, and may be distributed across devices in a number of ways, or all of the functionality may be integrated into a dedicated, standalone device or other hardware.
- means for performing the steps associated with the processes described above may include any of the hardware and/or software described above. All such permutations and combinations are intended to fall within the scope of the present disclosure.
- performing the step of X includes any suitable method for causing another party such as a remote user, a remote processing resource (e.g., a server or cloud computer) or a machine to perform the step of X.
- performing steps X, Y and Z may include any method of directing or controlling any combination of such other individuals or resources to perform steps X, Y and Z to obtain the benefit of such steps.
Abstract
A variety of techniques are disclosed for enabling three-dimensional scanning with mobile devices. In general, an accessory provides additional optics, image sensors, lighting and/or processing to complement preexisting device hardware in support of a variety of three-dimensional imaging techniques.
Description
- This application is a continuation of U.S. application Ser. No. 13/736,210 filed on Jan. 8, 2013, which claims the benefit of U.S. Prov. App. No. 61/680,989 filed on Aug. 8, 2012 and U.S. Prov. App. No. 61/719,874 filed on Oct. 29, 2012. The entire content of these applications is incorporated herein by reference.
- This application is related to U.S. application Ser. No. 13/314,337 filed on Dec. 8, 2011, the entire content of which is hereby incorporated by reference.
- There remains a need for accessories to support three-dimensional scanning with mobile devices such as cellular phones, tablets, and laptop computers.
- A variety of techniques are disclosed for enabling three-dimensional scanning with mobile devices. In general, an accessory provides additional optics, image sensors, lighting and/or processing to complement preexisting device hardware in support of a variety of three-dimensional imaging techniques.
- The invention and the following detailed description of certain embodiments thereof may be understood by reference to the following figures:
- FIG. 1 is a block diagram of a three-dimensional printer.
- FIG. 2 shows a networked three-dimensional printing environment.
- FIG. 3 shows a mobile device with an accessory for three-dimensional imaging.
- FIG. 4 is a functional block diagram of an accessory coupled to a mobile device.
- FIG. 5 shows a three-dimensional imaging system with dual optical paths.
- FIG. 6 shows a three-dimensional imaging system with dual optical paths.
- FIG. 7 shows a method for using an accessory to capture three-dimensional images with a mobile device.
- FIG. 8 shows a method for using an accessory to capture three-dimensional images with a mobile device.
- FIG. 9 shows a method for using an accessory to capture three-dimensional images with a mobile device.
- All documents mentioned herein are hereby incorporated in their entirety by reference. References to items in the singular should be understood to include items in the plural, and vice versa, unless explicitly stated otherwise or clear from the text. Grammatical conjunctions are intended to express any and all disjunctive and conjunctive combinations of conjoined clauses, sentences, words, and the like, unless otherwise stated or clear from the context. Thus the term “or” should generally be understood to mean “and/or” and so forth.
- The following description emphasizes three-dimensional printers using fused deposition modeling or similar techniques where a bead of material is extruded in a layered series of two dimensional patterns as “roads,” “paths” or the like to form a three-dimensional object from a digital model. It will be understood, however, that numerous additive fabrication techniques are known in the art including without limitation multijet printing, stereolithography, Digital Light Processor (“DLP”) three-dimensional printing, selective laser sintering, and so forth. Such techniques may benefit from the systems and methods described below, and all such printing technologies are intended to fall within the scope of this disclosure, and within the scope of terms such as “printer”, “three-dimensional printer”, “fabrication system”, and so forth, unless a more specific meaning is explicitly provided or otherwise clear from the context.
- FIG. 1 is a block diagram of a three-dimensional printer. In general, the printer 100 may include a build platform 102, an extruder 106, an x-y-z positioning assembly 108, and a controller 110 that cooperate to fabricate an object 112 within a working volume 114 of the printer 100. - The
build platform 102 may include a surface 116 that is rigid and substantially planar. The surface 116 may provide a fixed, dimensionally and positionally stable platform on which to build the object 112. The build platform 102 may include a thermal element 130 that controls the temperature of the build platform 102 through one or more active devices 132, such as resistive elements that convert electrical current into heat, Peltier effect devices that can create a heating or cooling effect, or any other thermoelectric heating and/or cooling devices. The thermal element 130 may be coupled in a communicating relationship with the controller 110 in order for the controller 110 to controllably impart heat to or remove heat from the surface 116 of the build platform 102. - The
extruder 106 may include a chamber 122 in an interior thereof to receive a build material. The build material may, for example, include acrylonitrile butadiene styrene (“ABS”), high-density polyethylene (“HDPE”), polylactic acid (“PLA”), or any other suitable plastic, thermoplastic, or other material that can usefully be extruded to form a three-dimensional object. The extruder 106 may include an extrusion tip 124 or other opening that includes an exit port with a circular, oval, slotted or other cross-sectional profile that extrudes build material in a desired cross-sectional shape. - The
extruder 106 may include a heater 126 (also referred to as a heating element) to melt thermoplastic or other meltable build materials within the chamber 122 for extrusion through an extrusion tip 124 in liquid form. While illustrated in block form, it will be understood that the heater 126 may include, e.g., coils of resistive wire wrapped about the extruder 106, one or more heating blocks with resistive elements to heat the extruder 106 with applied current, an inductive heater, or any other arrangement of heating elements suitable for creating heat within the chamber 122 sufficient to melt the build material for extrusion. The extruder 106 may also or instead include a motor 128 or the like to push the build material into the chamber 122 and/or through the extrusion tip 124. - In general operation (and by way of example rather than limitation), a build material such as ABS plastic in filament form may be fed into the
chamber 122 from a spool or the like by the motor 128, melted by the heater 126, and extruded from the extrusion tip 124. By controlling a rate of the motor 128, the temperature of the heater 126, and/or other process parameters, the build material may be extruded at a controlled volumetric rate. It will be understood that a variety of techniques may also or instead be employed to deliver build material at a controlled volumetric rate, which may depend upon the type of build material, the volumetric rate desired, and any other factors. All such techniques that might be suitably adapted to delivery of build material for fabrication of a three-dimensional object are intended to fall within the scope of this disclosure. - The
x-y-z positioning assembly 108 may generally be adapted to three-dimensionally position the extruder 106 and the extrusion tip 124 within the working volume 114. Thus by controlling the volumetric rate of delivery for the build material and the x, y, z position of the extrusion tip 124, the object 112 may be fabricated in three dimensions by depositing successive layers of material in two-dimensional patterns derived, for example, from cross-sections of a computer model or other computerized representation of the object 112. A variety of arrangements and techniques are known in the art to achieve controlled linear movement along one or more axes. The x-y-z positioning assembly 108 may, for example, include a number of stepper motors 109 to independently control a position of the extruder 106 within the working volume along each of an x-axis, a y-axis, and a z-axis. More generally, the x-y-z positioning assembly 108 may include without limitation various combinations of stepper motors, encoded DC motors, gears, belts, pulleys, worm gears, threads, and so forth. For example, in one aspect the build platform 102 may be coupled to one or more threaded rods by a threaded nut so that the threaded rods can be rotated to provide z-axis positioning of the build platform 102 relative to the extruder 106. This arrangement may advantageously simplify design and improve accuracy by permitting an x-y positioning mechanism for the extruder 106 to be fixed relative to a build volume. Any such arrangement suitable for controllably positioning the extruder 106 within the working volume 114 may be adapted to use with the printer 100 described herein. - In general, this may include moving the
extruder 106, or moving the build platform 102, or some combination of these. Thus it will be appreciated that any reference to moving an extruder relative to a build platform, working volume, or object, is intended to include movement of the extruder or movement of the build platform, or both, unless a more specific meaning is explicitly provided or otherwise clear from the context. Still more generally, while an x, y, z coordinate system serves as a convenient basis for positioning within three dimensions, any other coordinate system or combination of coordinate systems may also or instead be employed, such as a positional controller and assembly that operates according to cylindrical or spherical coordinates. - The
controller 110 may be electrically or otherwise coupled in a communicating relationship with the build platform 102, the x-y-z positioning assembly 108, and the other various components of the printer 100. In general, the controller 110 is operable to control the components of the printer 100, such as the build platform 102, the x-y-z positioning assembly 108, and any other components of the printer 100 described herein to fabricate the object 112 from the build material. The controller 110 may include any combination of software and/or processing circuitry suitable for controlling the various components of the printer 100 described herein including without limitation microprocessors, microcontrollers, application-specific integrated circuits, programmable gate arrays, and any other digital and/or analog components, as well as combinations of the foregoing, along with inputs and outputs for transceiving control signals, drive signals, power signals, sensor signals, and so forth. In one aspect, this may include circuitry directly and physically associated with the printer 100 such as an on-board processor. In another aspect, this may be a processor associated with a personal computer or other computing device coupled to the printer 100, e.g., through a wired or wireless connection. Similarly, various functions described herein may be allocated between an on-board processor for the printer 100 and a separate computer. All such computing devices and environments are intended to fall within the meaning of the term “controller” or “processor” as used herein, unless a different meaning is explicitly provided or otherwise clear from the context. - A variety of additional sensors and other components may be usefully incorporated into the
printer 100 described above. These other components are generically depicted as other hardware 134 in FIG. 1, for which the positioning and mechanical/electrical interconnections with other elements of the printer 100 will be readily understood and appreciated by one of ordinary skill in the art. The other hardware 134 may include a temperature sensor positioned to sense a temperature of the surface of the build platform 102, the extruder 106, or any other system components. This may, for example, include a thermistor or the like embedded within or attached below the surface of the build platform 102. This may also or instead include an infrared detector or the like directed at the surface 116 of the build platform 102. - In another aspect, the
other hardware 134 may include a sensor to detect a presence of the object 112 at a predetermined location. This may include an optical detector arranged in a beam-breaking configuration to sense the presence of the object 112 at a predetermined location. This may also or instead include an imaging device and image processing circuitry to capture an image of the working volume and to analyze the image to evaluate a position of the object 112. This sensor may be used for example to ensure that the object 112 is removed from the build platform 102 prior to beginning a new build on the working surface 116. Thus the sensor may be used to determine whether an object is present that should not be, or to detect when an object is absent. The feedback from this sensor may be used by the controller 110 to issue processing interrupts or otherwise control operation of the printer 100. - The
other hardware 134 may also or instead include a heating element (instead of or in addition to the thermal element 130) to heat the working volume, such as a radiant heater or forced hot air heater to maintain the object 112 at a fixed, elevated temperature throughout a build, or the other hardware 134 may include a cooling element to cool the working volume. -
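The controlled volumetric extrusion described above follows directly from the motor's linear feed rate and the filament cross-section: Q = feed_rate · π · (d/2)². The sketch below assumes a common 1.75 mm filament diameter, which the disclosure does not specify.

```python
# Volumetric delivery rate of build material from the motor-set feed rate.
# The 1.75 mm filament diameter is an illustrative assumption.

import math

def volumetric_rate(feed_mm_s, filament_diameter_mm=1.75):
    """Volume of build material delivered per second (mm^3/s)."""
    return feed_mm_s * math.pi * (filament_diameter_mm / 2) ** 2

# Delivery rate at a 5 mm/s filament feed:
print(round(volumetric_rate(5.0), 2))  # mm^3/s
```

In practice the controller would adjust the motor rate so this volume matches the bead volume being deposited along the current road.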
FIG. 2 depicts a networked three-dimensional printing environment. In general, the environment 200 may include a data network 202 interconnecting a plurality of participating devices in a communicating relationship. The participating devices may, for example, include any number of three-dimensional printers 204 (also referred to interchangeably herein as “printers”), client devices 206, print servers 208, content sources 210, mobile devices 212, and other resources 216. - The
data network 202 may be any network(s) or internetwork(s) suitable for communicating data and control information among participants in the environment 200. This may include public networks such as the Internet, private networks, telecommunications networks such as the Public Switched Telephone Network or cellular networks using third generation (e.g., 3G or IMT-2000), fourth generation (e.g., LTE (E-UTRA) or WiMax-Advanced (IEEE 802.16m)) and/or other technologies, as well as any of a variety of corporate area or local area networks and other switches, routers, hubs, gateways, and the like that might be used to carry data among participants in the environment 200. - The three-dimensional printers 204 may be any computer-controlled devices for three-dimensional fabrication, including without limitation any of the three-dimensional printers or other fabrication or prototyping devices described above. In general, each such device may include a network interface comprising, e.g., a network interface card, which term is used broadly herein to include any hardware (along with software, firmware, or the like to control operation of same) suitable for establishing and maintaining wired and/or wireless communications. The network interface card may include without limitation wired Ethernet network interface cards (“NICs”), wireless 802.11 networking cards, wireless 802.11 USB devices, or other hardware for wireless local area networking. The network interface may also or instead include cellular network hardware, wide area wireless network hardware or any other hardware for centralized, ad hoc, peer-to-peer, or other radio communications that might be used to carry data. In another aspect, the network interface may include a serial or USB port to directly connect to a computing device such as a desktop computer that, in turn, provides more general network connectivity to the data network 202. - The
printers 204 might be made to fabricate any object, practical or otherwise, that is amenable to fabrication according to each printer's capabilities. This may be a model of a house or a tea cup, as depicted, or any other object such as a bunny, gears or other machine hardware, replications of scanned three-dimensional objects, or fanciful works of art. -
Client devices 206 may be any devices within the environment 200 operated by users to initiate, manage, monitor, or otherwise interact with print jobs at the three-dimensional printers 204. This may include desktop computers, laptop computers, network computers, tablets, or any other computing device that can participate in the environment 200 as contemplated herein. Each client device 206 generally provides a user interface, which may include a graphical user interface, a text or command line interface, a voice-controlled interface, and/or a gesture-based interface to control operation of remote three-dimensional printers 204. The user interface may be maintained by a locally executing application on one of the client devices 206 that receives data and status information from, e.g., the printers 204 and print servers 208 concerning pending or executing print jobs. The user interface may create a suitable display on the client device 206 for user interaction. In other embodiments, the user interface may be remotely served and presented on one of the client devices 206, such as where a print server 208 or one of the three-dimensional printers 204 includes a web server that provides information through one or more web pages or the like that can be displayed within a web browser or similar client executing on one of the client devices 206. In one aspect, the user interface may include a voice controlled interface that receives spoken commands from a user and/or provides spoken feedback to the user. - The
print servers 208 may include data storage, a network interface, and a processor and/or other processing circuitry. In the following description, where the functions or configuration of a print server 208 are described, this is intended to include corresponding functions or configuration (e.g., by programming) of a processor of the print server 208. In general, the print servers 208 (or processors thereof) may perform a variety of processing tasks related to management of networked printing. For example, the print servers 208 may manage print jobs received from one or more of the client devices 206, and provide related supporting functions such as content search and management. A print server 208 may also include a web server that provides web-based access by the client devices 206 to the capabilities of the print server 208. A print server 208 may also communicate periodically with three-dimensional printers 204 in order to obtain status information concerning, e.g., availability of printers and/or the status of particular print jobs, any of which may be subsequently presented to a user through the web server or any other suitable interface. A print server 208 may also maintain a list of available three-dimensional printers 204, and may automatically select one of the three-dimensional printers 204 for a user-submitted print job, or may permit a user to specify a single printer, or a group of preferred printers, for fabricating an object. Where the print server 208 selects the printer automatically, any number of criteria may be used such as geographical proximity, printing capabilities, current print queue, fees (if any) for use of a particular three-dimensional printer 204, and so forth.
Where the user specifies criteria, this may similarly include any relevant aspects of three-dimensional printers 204, and may permit use of absolute criteria (e.g., filters) or preferences, which may be weighted preferences or unweighted preferences, any of which may be used by a print server 208 to allocate a print job to a suitable resource. - In one aspect, the
print server 208 may be configured to support interactive voice control of one of the printers 204. For example, the print server 208 may be configured to receive a voice signal (e.g., in digitized audio form) from a microphone or other audio input of the printer 204, and to process the voice signal to extract relevant content such as a command for the printer. Where the command is recognized as a print command, the voice signal may be further processed to extract additional context or relevant details. For example, the voice signal may be processed to extract an object identifier that specifies an object for printing, e.g., by filename, file metadata, or semantic content. The voice signal may also be processed to extract a dimensional specification, such as a scale or absolute dimension for an object. The print server 208 may then generate suitable control signals for return to the printer 204 to cause the printer 204 to fabricate the object. Where an error or omission is detected, the print server 208 may return a request for clarification to the printer 204, which may render the request in spoken form through a speaker, or within a user interface of the printer 204 or an associated device. - Other user preferences may be usefully stored at the
print server 208 to facilitate autonomous, unsupervised fabrication of content from content sources 210. For example, a print server 208 may store a user's preference on handling objects greater than a build volume of a printer. These preferences may control whether to resize the object, whether to break the object into multiple sub-objects for fabrication, and whether to transmit multiple sub-objects to a single printer or multiple printers. In addition, user preferences or requirements may be stored, such as multi-color printing capability, build material options and capabilities, and so forth. More generally, a print queue (which may be a printer-specific or user-specific queue, and which may be hosted at a printer 204, a server 208, or some combination of these) may be managed by a print server 208 according to one or more criteria from a remote user requesting a print job. The print server 208 may also store user preferences or criteria for filtering content, e.g., for automatic printing or other handling. While this is described below as a feature for autonomous operation of a printer (such as a printer that locally subscribes to a syndicated model source), any criteria that can be used to identify models of potential interest by explicit type (e.g., labeled in model metadata), implicit type (e.g., determined based on analysis of the model), source, and so forth, may be provided to the print server 208 and used to automatically direct new content to one or more user-specified ones of the three-dimensional printers 204. - In the context of voice-controlled printing, the
print server 208 may usefully store user-specific data such as training for a voice recognition model. The print server 208 may also or instead store voice rendering data to use in generating spoken output by the printer 204. This may, for example, include voice type data, voice model data, voice sample data, and so forth. Thus, for example, a user may purchase or otherwise obtain a voice style (e.g., a celebrity voice or other personality) to render spoken commands and maintain the voice style on the print server 208. The print server 208 may also or instead store data characterizing capabilities of the printer 204 so that voice commands received at the print server 208 can be analyzed for suitability, accuracy, and so forth according to the capabilities of the printer 204 from which the voice command was received. More generally, any data or processing for voice interaction that can be usefully stored or executed remotely from the printer 204 may be located at the print server 208. It will be understood that any such data may also or instead be stored on a client device, a printer 204, or some combination of these. - In one aspect, the processor of the print server may be configured to store a plurality of print jobs submitted to the web server in a log and to provide an analysis of print activity based on the log. This may include any type of analysis that might be useful to participants in the
environment 200. For example, the analysis may include tracking of the popularity of particular objects, or of particular content sources. The analysis may include tracking of which three-dimensional printers 204 are most popular or least popular, or related statistics such as the average backlog of pending print jobs at a number of the three-dimensional printers 204. The analysis may include the success of a particular printer in fabricating a particular model, or of a particular printer in completing print jobs generally. More generally, any statistics or data may be obtained, and any analysis may be performed, that might be useful to users (e.g., when requesting prints), content sources (e.g., when choosing new printable objects for publication), providers of fabrication resources (e.g., when setting fees), or network facilitators such as the print servers 208. - A
print server 208 may also maintain a database 209 of content, along with an interface for users at client devices 206 to search the database 209 and request fabrication of objects in the database 209 using any of the three-dimensional printers 204. Thus in one aspect, a print server 208 (or any system including the print server 208) may include a database 209 of three-dimensional models, and the print server 208 may act as a server that provides a search engine for locating a particular three-dimensional model in the database 209. The search engine may be a text-based search engine using keyword text queries, plain language queries, and so forth. The search engine may also or instead include an image-based search engine configured to identify three-dimensional models similar to a two-dimensional or three-dimensional image provided by a user. - In another aspect, the
print server 208 may periodically search for suitable content at remote locations on the data network, which content may be retrieved to the database 209, or have its remote location (e.g., a URL or other network location identifier) stored in the database 209. In another aspect, the print server 208 may provide an interface for submission of objects from remote users, along with any suitable metadata such as a title, tags, creator information, descriptive narrative, pictures, recommended printer settings, and so forth. In one aspect, the database 209 may be manually curated according to any desired standards. In another aspect, printable objects in the database 209 may be manually or automatically annotated according to content type, popularity, editorial commentary, and so forth. - The
print server 208 may more generally provide a variety of management functions. For example, the print server 208 may store a location of a predetermined alternative three-dimensional printer to execute a print job from a remote user in the event of a failure by the one of the plurality of three-dimensional printers 204. In another aspect, the print server 208 may maintain exclusive control over at least one of the plurality of three-dimensional printers 204, such that other users and/or print servers cannot control the printer. In another aspect, the print server 208 may submit a print job to a first available one of the plurality of three-dimensional printers 204. - In another aspect, a
print server 208 may provide an interface for managing subscriptions to sources of content. This may include tools for searching existing subscriptions, locating or specifying new sources, subscribing to sources of content, and so forth. In one aspect, a print server 208 may manage subscriptions and automatically direct new content from these subscriptions to a three-dimensional printer 204 according to any user-specified criteria. Thus while it is contemplated that a three-dimensional printer 204 may autonomously subscribe to sources of content through a network interface and receive new content directly from such sources, it is also contemplated that this feature may be maintained through a remote resource such as a print server 208. - A
print server 208 may maintain print queues for participating three-dimensional printers 204. This approach may advantageously alleviate backlogs at individual printers 204, which may have limited memory capacity for pending print jobs. More generally, a print server 208 may, by communicating with multiple three-dimensional printers 204, obtain a view of utilization of multiple networked resources that permits a more efficient allocation of print jobs than would be possible through simple point-to-point communications among users and printers. Print queues may also be published by a print server 208 so that users can view pending queues for a variety of different three-dimensional printers 204 prior to selecting a resource for a print job. In one aspect, the print queue may be published as a number of print jobs and size of print jobs so that a requester can evaluate likely delays. In another aspect, the print queue may be published as an estimated time until a newly submitted print job can be initiated. - In one aspect, the print queue of one of the
print servers 208 may include one or more print jobs for one of the plurality of three-dimensional printers 204. The print queue may be stored locally at the one of the plurality of three-dimensional printers. In another aspect, the print queue may be allocated between the database 209 and a local memory of the three-dimensional printer 204. In another aspect, the print queue may be stored, for example, in the database 209 of the print server 208. As used here, the term ‘print queue’ is intended to include print data (e.g., the three-dimensional model or tool instructions to fabricate an object) for a number of print jobs (which may be arranged for presentation in order of expected execution), as well as any metadata concerning print jobs. Thus, a portion of the print queue such as the metadata (e.g., size, status, time to completion) may be usefully communicated to a print server 208 for sharing among users while another portion of the print queue such as the model data may be stored at a printer in preparation for execution of a print job. - Print queues may implement various user preferences on prioritization. For example, for a commercial enterprise, longer print jobs may be deferred for after normal hours of operation (e.g., after 5:00 p.m.), while shorter print jobs may be executed first if they can be completed before the end of a business day. In this manner, objects can be identified and fabricated from within the print queue in a manner that permits as many objects as possible to be fabricated before a predetermined closing time. Similarly, commercial providers of fabrication services may charge explicitly for prioritized fabrication, and implement this prioritization by prioritizing print queues in a corresponding fashion.
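The closing-time behavior described above can be sketched as a shortest-job-first selection: because jobs are taken in order of increasing duration, as many objects as possible complete before the predetermined closing time. This is an illustrative sketch only; the job names and durations are hypothetical.

```python
from datetime import datetime, timedelta

def select_jobs_before_closing(jobs, now, closing):
    """Pick as many jobs as possible that can finish before closing time.

    `jobs` is a list of (name, duration) tuples, duration a timedelta.
    Taking jobs shortest-first maximizes the count completed on a single
    printer; jobs that no longer fit are deferred past closing.
    """
    remaining = closing - now
    selected, deferred = [], []
    for name, duration in sorted(jobs, key=lambda j: j[1]):
        if duration <= remaining:
            selected.append(name)
            remaining -= duration
        else:
            deferred.append(name)
    return selected, deferred

jobs = [("bracket", timedelta(hours=1)),
        ("vase", timedelta(hours=4)),
        ("gear", timedelta(minutes=30))]
now = datetime(2013, 1, 9, 14, 0)        # 2:00 p.m.
closing = datetime(2013, 1, 9, 17, 0)    # 5:00 p.m. closing time
selected, deferred = select_jobs_before_closing(jobs, now, closing)
# gear (0.5 h) and bracket (1 h) fit before 5:00 p.m.; the vase is deferred
```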
- In another aspect, a
print server 208 may provide a virtual workspace for a user. In this virtual workspace, a user may search local or remote databases of printable objects, save objects of interest (or links thereto), manage pending prints, specify preferences for receiving status updates (e.g., by electronic mail or SMS text), manage subscriptions to content, search for new subscription sources, and so forth. In one aspect, the virtual workspace may be, or may include, web-based design tools or a web-based design interface that permits a user to create and modify models. In one aspect, the virtual workspace may be deployed on the web, while permitting direct fabrication of a model developed within that environment on a user-specified one of the three-dimensional printers 204, thus enabling a web-based design environment that is directly coupled to one or more fabrication resources. - The
content sources 210 may include any sources of content for fabrication with a three-dimensional printer 204. This may, for example, include databases of objects accessible through a web interface or application programming interface. This may also or instead include individual desktop computers or the like configured as a server for hosted access, or configured to operate as a peer in a peer-to-peer network. This may also or instead include content subscription services, which may be made available in an unrestricted fashion, or may be made available on a paid subscription basis, or on an authenticated basis based upon some other relationship (e.g., purchase of a related product or a ticket to an event). It will be readily appreciated that any number of content providers may serve as content sources 210 as contemplated herein. By way of non-limiting example, the content sources 210 may include destinations such as amusement parks, museums, theaters, performance venues, or the like, any of which may provide content related to users who purchase tickets. The content sources 210 may include manufacturers such as automobile, computer, consumer electronics, or home appliance manufacturers, any of which may provide content related to upgrades, maintenance, repair, or other support of existing products that have been purchased. The content sources 210 may include artists or other creative enterprises that sell various works of interest. The content sources 210 may include engineering or architectural firms that provide marketing or advertising pieces to existing or prospective customers. The content sources 210 may include marketing or advertising firms that provide promotional items for clients. More generally, the content sources 210 may be any individual or enterprise that provides single or serial objects for fabrication by the three-dimensional printers 204 described herein. - One or
more web servers 211 may provide web-based access to and from any of the other participants in the environment 200. While depicted as a separate network entity, it will be readily appreciated that a web server 211 may be logically or physically associated with one of the other devices described herein, and may, for example, provide a user interface for web access to one of the three-dimensional printers 204, one of the print servers 208 (or databases 209 coupled thereto), one of the content sources 210, or any of the other resources 216 described below in a manner that permits user interaction through the data network 202, e.g., from a client device 206 or mobile device 212. - The
mobile devices 212 may be any form of mobile device, such as any wireless, battery-powered device, that might be used to interact with the networked printing environment 200. The mobile devices 212 may, for example, include laptop computers, tablets, thin client network computers, portable digital assistants, messaging devices, cellular phones, smart phones, portable media or entertainment devices, and so forth. In general, mobile devices 212 may be operated by users for a variety of user-oriented functions such as to locate printable objects, to submit objects for printing, to monitor a personally owned printer, and/or to monitor a pending print job. A mobile device 212 may include location awareness technology such as the Global Positioning System (“GPS”), which may obtain information that can be usefully integrated into a printing operation in a variety of ways. For example, a user may select an object for printing and submit a model of the object to a print server, such as any of the print servers described above. The print server may determine a location of the mobile device 212 initiating the print job and locate a closest printer for fabrication of the object. - In another aspect, a printing function may be location-based, using the GPS input (or cellular network triangulation, proximity detection, or any other suitable location detection techniques). For example, a user may be authorized to print a model only when the user is near a location (e.g., within a geo-fenced area or otherwise proximal to a location), or only after a user has visited a location. Thus a user may be provided with printable content based upon locations that the user has visited, or while within a certain venue such as an amusement park, museum, theater, sports arena, hotel, or the like. Similarly, a matrix barcode such as a QR code may be employed for localization.
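The "closest printer" selection described above may be sketched with a great-circle distance over the reported GPS fixes; the printer identifiers and coordinates below are hypothetical.

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometers between two GPS fixes."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = (sin((lat2 - lat1) / 2) ** 2
         + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371.0 * asin(sqrt(a))  # mean Earth radius ~6371 km

def closest_printer(device_fix, printers):
    """Return the printer nearest the mobile device's reported fix.

    `printers` maps a printer id to its (lat, lon) location.
    """
    return min(printers, key=lambda p: haversine_km(*device_fix, *printers[p]))

printers = {"printer-a": (42.3601, -71.0589),   # e.g., Boston
            "printer-b": (40.7128, -74.0060)}   # e.g., New York
# A device reporting a fix near Boston is matched to printer-a.
closest_printer((42.36, -71.06), printers)
```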
- The
other resources 216 may include any other software or hardware resources that may be usefully employed in networked printing applications as contemplated herein. For example, the other resources 216 may include payment processing servers or platforms used to authorize payment for content subscriptions, content purchases, or printing resources. As another example, the other resources 216 may include social networking platforms that may be used, e.g., to share three-dimensional models and/or fabrication results according to a user's social graph. In another aspect, the other resources 216 may include certificate servers or other security resources for third party verification of identity, encryption or decryption of three-dimensional models, and so forth. In another aspect, the other resources 216 may include online tools for three-dimensional design or modeling, as well as databases of objects, surface textures, build supplies, and so forth. In another aspect, the other resources 216 may include a desktop computer or the like co-located (e.g., on the same local area network with, or directly coupled to through a serial or USB cable) with one of the three-dimensional printers 204. In this case, the other resource 216 may provide supplemental functions for the three-dimensional printer 204 in a networked printing context such as maintaining a print queue or operating a web server for remote interaction with the three-dimensional printer 204. Other resources 216 also include supplemental resources such as three-dimensional scanners, cameras, and post-processing/finishing machines or resources. More generally, any resource that might be usefully integrated into a networked printing environment may be one of the resources 216 as contemplated herein. - It will be readily appreciated that the various components of the
networked printing environment 200 described above may be arranged and configured to support networked printing in a variety of ways. For example, in one aspect there is disclosed herein a networked computer with a print server and a web interface to support networked three-dimensional printing. This device may include a print server, a database, and a web server as discussed above. The print server may be coupled through a data network to a plurality of three-dimensional printers and configured to receive status information from one or more sensors for each one of the plurality of three-dimensional printers. The print server may be further configured to manage a print queue for each one of the plurality of three-dimensional printers. The database may be coupled in a communicating relationship with the print server and configured to store print queue data and status information for each one of the plurality of three-dimensional printers. The web server may be configured to provide a user interface over the data network to a remote user, the user interface adapted to present the status information and the print queue data for one or more of the plurality of three-dimensional printers to the user, and adapted to receive a print job from the remote user for one of the plurality of three-dimensional printers. - The three-
dimensional printer 204 described above may be configured to autonomously subscribe to syndicated content sources and periodically receive and print objects from those sources. Thus in one aspect there is disclosed herein a device including any of the three-dimensional printers described above; a network interface; and a processor (which may without limitation include the controller for the printer). The processor may be configured to subscribe to a plurality of sources of content (such as the content sources 210 described above) selected by a user for fabrication by the three-dimensional printer through the network interface. The processor may be further configured to receive one or more three-dimensional models from the plurality of content sources 210, and to select one of the one or more three-dimensional models for fabrication by the three-dimensional printer 204 according to a user preference for prioritization. The user preference may, for example, preferentially prioritize particular content sources 210, or particular types of content (e.g., tools, games, artwork, upgrade parts, or content related to a particular interest of the user). - The memory of a three-
dimensional printer 204 may be configured to store a queue of one or more additional three-dimensional models not selected for immediate fabrication. The processor may be programmed to periodically re-order or otherwise alter the queue according to pre-determined criteria or manual user input. For example, the processor may be configured to evaluate a new three-dimensional model based upon a user preference for prioritization, and to place the new three-dimensional model at a corresponding position in the queue. The processor may also or instead be configured to retrieve content from one of the content sources 210 by providing authorization credentials for the user, which may be stored at the three-dimensional printer or otherwise accessible for presentation to the content source 210. The processor may be configured to retrieve content from at least one of the plurality of content sources 210 by authorizing a payment from the user to a content provider. The processor may be configured to search a second group of sources of content (such as any of the content sources 210 described above) according to one or more search criteria provided by a user. The search criteria may also or instead include demographic information for the user, contextual information for the user, or any other implicit or explicit user information. - In another aspect, there is disclosed herein a system for managing subscriptions to three-dimensional content sources such as any of the
content sources 210 described above. The system may include a web server configured to provide a user interface over a data network, which user interface is adapted to receive user preferences from a user including a subscription to a plurality of sources of a plurality of three-dimensional models, a prioritization of content from the plurality of sources, and an identification of one or more fabrication resources coupled to the data network and suitable for fabricating objects from the plurality of three-dimensional models. The system may also include a database to store the user preferences, and to receive and store the plurality of three-dimensional models as they are issued by the plurality of sources. The system may include a processor (e.g., of a print server 208, or alternatively of a client device 206 interacting with the print server 208) configured to select one of the plurality of three-dimensional models for fabrication based upon the prioritization. The system may include a print server configured to communicate with the one or more fabrication resources through the data network, to determine an availability of the one or more fabrication resources, and to transmit the selected one of the plurality of three-dimensional models to one of the one or more fabrication resources. - In another aspect, there is disclosed herein a network of three-dimensional printing resources comprising a plurality of three-dimensional printers, each one of the plurality of three-dimensional printers including a network interface; a server configured to manage execution of a plurality of print jobs by the plurality of three-dimensional printers; and a data network that couples the server and the plurality of three-dimensional printers in a communicating relationship.
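A minimal sketch of the selection step described above, in which a processor picks the next model to fabricate according to a user's prioritization of subscribed sources (the source names and ranks below are hypothetical):

```python
def select_for_fabrication(models, source_priority):
    """Choose the next model to fabricate from subscribed sources.

    `models` is a list of dicts with 'name' and 'source' keys;
    `source_priority` maps a source to a rank (lower = preferred).
    Models from unknown sources sort last.
    """
    return min(models,
               key=lambda m: source_priority.get(m["source"], float("inf")))

# Hypothetical user preferences: product-support content first.
prefs = {"product-support": 0, "hobbyist": 1, "promotional": 2}
incoming = [{"name": "coupon-token", "source": "promotional"},
            {"name": "replacement-knob", "source": "product-support"}]
select_for_fabrication(incoming, prefs)  # the product-support model wins
```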
- In general as described above, the server may include a web-based user interface configured for a user to submit a new print job to the server and to monitor progress of the new print job. The web-based user interface may permit video monitoring of each one of the plurality of three-dimensional printers, or otherwise provide information useful to a remote user including image-based, simulation-based, text-based, or other information concerning status of a current print. The web-based user interface may include voice input and/or output for network-based voice control of a printer.
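Because only queue metadata (rather than model data) need be published for monitoring, a requester can estimate the delay before a newly submitted job would begin from the published entries alone. A minimal sketch, with hypothetical field names:

```python
def estimated_start(queue):
    """Estimate minutes until a newly submitted job could begin.

    `queue` is published metadata only: a list of dicts with a
    'minutes' field per pending job (no model data is exposed).
    """
    return sum(job["minutes"] for job in queue)

published = [{"id": 17, "minutes": 45, "status": "printing"},
             {"id": 18, "minutes": 90, "status": "pending"}]
estimated_start(published)  # → 135 minutes of backlog
```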
- The fabrication resources may, for example, include any of the three-
dimensional printers 204 described above. One or more of the fabrication resources may be a private fabrication resource secured with a credential-based access system. The user may provide, as a user preference and prior to use of the private fabrication resource, credentials for accessing the private fabrication resource. In another aspect, the one or more fabrication resources may include a commercial fabrication resource. In this case the user may provide an authorization to pay for use of the commercial fabrication resource in the form of a user preference prior to use of the commercial fabrication resource. - Many current three-dimensional printers require significant manufacturing time to fabricate an object. At the same time, certain printers may include a tool or system to enable multiple, sequential object prints without human supervision or intervention, such as a conveyor belt. In this context, prioritizing content may be particularly important to prevent crowding out of limited fabrication resources with low priority content that arrives periodically for autonomous fabrication. As a significant advantage, the systems and methods described herein permit prioritization using a variety of user-specified criteria, and permit use of multiple fabrication resources in appropriate circumstances. Thus prioritizing content as contemplated herein may include any useful form of prioritization. For example, this may include prioritizing the content according to source. The
content sources 210 may have an explicit type that specifies the nature of the source (e.g., commercial or paid content, promotional content, product support content, non-commercial content) or the type of content provided (e.g., automotive, consumer electronics, radio control hobbyist, contest prizes, and so forth). Prioritizing content may include prioritizing the content according to this type. The three-dimensional models themselves may also or instead include a type (e.g., tool, game, home, art, jewelry, replacement part, upgrade part, etc.) or any other metadata, and prioritizing the content may include prioritizing the content according to this type and/or metadata. - In one aspect, the processor may be configured to select two or more of the plurality of three-dimensional models for concurrent fabrication by two or more of the plurality of fabrication resources based upon the prioritization when a priority of the two or more of the plurality of three-dimensional models exceeds a predetermined threshold. That is, where particular models individually have a priority above the predetermined threshold, multiple fabrication resources may be located and employed to fabricate these models concurrently. The predetermined threshold may be evaluated for each model individually, or for all of the models collectively such as on an aggregate or average basis.
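The threshold test described above can be sketched as follows: models whose individual priority exceeds the predetermined threshold are paired with distinct fabrication resources for concurrent fabrication, while the remainder are left for ordinary queueing (the names and priorities below are hypothetical):

```python
def assign_concurrent(models, printers, threshold):
    """Pair high-priority models with distinct printers for concurrent runs.

    Models whose individual 'priority' exceeds `threshold` are dispatched
    one per printer (extras beyond the printer count are dropped by zip);
    the rest are returned for ordinary queueing.
    """
    urgent = [m for m in models if m["priority"] > threshold]
    assignments = list(zip(urgent, printers))   # one model per printer
    leftover = [m for m in models if m["priority"] <= threshold]
    return assignments, leftover

models = [{"name": "splint", "priority": 9},
          {"name": "chess-piece", "priority": 2},
          {"name": "fixture", "priority": 8}]
assignments, leftover = assign_concurrent(
    models, ["printer-a", "printer-b"], threshold=5)
# splint and fixture run concurrently on separate printers
```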
- In one aspect, the processor may be configured to adjust prioritization based upon a history of fabrication when a number of objects fabricated from one of the plurality of sources exceeds a predetermined threshold. Thus, for example, a user may limit the number of objects fabricated from a particular source, giving subsequent priority to content from other sources regardless of an objectively determined priority for a new object from the particular source. This prevents a single source from overwhelming a single fabrication resource, such as a personal three-dimensional printer operated by the user, in a manner that crowds out other content from other sources of possible interest. At the same time, this may enable
content sources 210 to publish on any convenient schedule, without regard to whether and how subscribers will be able to fabricate objects. - In another aspect, the processor may be configured to identify one or more additional sources of content based upon a similarity to one of the plurality of sources of content. For example, where a
content source 210 is an automotive manufacturer, the processor may perform a search for other automotive manufacturers, related parts suppliers, mechanics, and so forth. The processor may also or instead be configured to identify one or more additional sources of content based upon a social graph of the user. This may, for example, include analyzing a social graph of relationships from the user to identify groups with common interests, shared professions, a shared history of schools or places of employment, or a common current or previous residence location, any of which may be used to locate other sources of content that may be of interest to the user. -
FIG. 3 shows a mobile device with an accessory for three-dimensional imaging. - The
mobile device 302 may include any suitable mobile device such as a cellular phone, media player, tablet, laptop computer, or the like. In general, the mobile device may include a processor 304 such as a microprocessor, microcontroller, or other processing circuitry that controls operation of the mobile device, provides a user interface, and so forth. The mobile device 302 may include a first camera 306 and a second camera 308 operable by the mobile device 302 to capture still images or video, or to support live video-based communications. In one embodiment, the first camera 306 may be a forward facing camera and the second camera 308 may be a rear facing camera (or vice versa). In this conventional configuration, a user can take pictures of objects in front of the user with the first camera 306 facing away from the user, or the user can take a picture of himself or herself, or of other items facing toward the user from the mobile device 302, using the second camera 308. - The
processor 304 may include processing circuitry configured to obtain three-dimensional data from one or more images obtained by the first camera 306 and/or the second camera 308. A wide array of image-based techniques for three-dimensional reconstruction are known in the art, and may be suitably adapted for use with the systems and methods contemplated herein. For example, the processor 304 may apply shape-from-motion techniques to a sequence of images captured from either or both of the first camera 306 and the second camera 308. In another aspect, the first camera 306 and the second camera 308 may be controlled to capture images concurrently, or substantially concurrently, and the two images from offset poses may be processed as a stereoscopic image pair to extract three-dimensional features. Similarly, the housing may include a structured light source or the like that illuminates an object with features that can be recognized when projected onto the object and processed to recover three-dimensional data. These or any other suitable techniques may be usefully employed with the mobile device 302 and accessory as contemplated herein. Certain variations are described below that employ different arrangements of hardware and processing, as well as various types of communication between the mobile device 302 and components within the housing 310 of the accessory. For example, the housing 310 may include an additional camera to complement a camera of the mobile device 302. In another aspect, the housing 310 may include a structured light source or other supplemental illumination source for improved imaging. In another aspect, the housing 310 may include independent processing circuitry for three-dimensional imaging, which may receive images from a camera of the mobile device 302 and process such images either alone or in combination with images from a camera of the housing 310 to obtain three-dimensional data. All such variations are intended to fall within the scope of this disclosure.
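For the stereoscopic case described above, depth may be recovered from the disparity of a feature matched between the two offset cameras. A minimal sketch under idealized assumptions (rectified images from parallel pinhole cameras; the focal length, baseline, and disparity values are illustrative only):

```python
def depth_from_disparity(focal_px, baseline_mm, disparity_px):
    """Depth of a matched feature from a rectified stereo pair.

    For two parallel cameras, Z = f * B / d, where f is the focal length
    in pixels, B the distance between the camera centers, and d the
    horizontal disparity of the feature between the two images.
    """
    if disparity_px <= 0:
        raise ValueError("feature must have positive disparity")
    return focal_px * baseline_mm / disparity_px

# e.g., f = 1000 px, cameras 60 mm apart, 20 px disparity -> 3000 mm away
depth_from_disparity(1000, 60.0, 20.0)  # → 3000.0 (mm)
```

Nearby features shift more between the two views than distant ones, which is why a larger disparity maps to a smaller depth.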
- An accessory may include a
housing 310 with a mechanical interface 312 configured to removably and replaceably attach to a predetermined mobile computing device such as the mobile device 302 in a predetermined orientation. While this mechanical interface 312 is illustrated in the cross-section of FIG. 3 as a flanged edge that encloses sides of the mobile device 302 along its perimeter, it will be appreciated that mobile devices 302 may have a variety of shapes and sizes, and a variety of mechanical interfaces may readily be devised to removably and replaceably secure the housing 310 to the mobile device 302 such that the mobile device 302 and the housing 310 are in a predetermined orientation relative to one another. For example, the housing 310 may be fashioned of a flexible material that permits the housing 310 to be elastically bent around the edges of the mobile device 302, or the mechanical interface 312 may include hinged, spring-loaded, or sliding latches that are manually secured about the edges of the mobile device 302. - However attached, a
lens 314 on the housing 310 may as a result be fixed in a predetermined location and orientation relative to a lens of the first camera 306 in order to provide a field of view from a predetermined pose relative to the first camera 306 of the mobile device 302 when the mobile device 302 is positioned within the housing 310. An optical train 316 may be further provided that optically couples the second camera 308 of the mobile device 302 to the lens 314. In this configuration, the first camera 306 and the second camera 308 both capture forward-facing images from offset poses, thus providing a stereoscopic perspective on a field of view for the combined device 320. The optical train may include any of a variety of optical components such as mirrors, fiber optics, intermediate lenses, and so forth to suitably couple the lens 314 to the second camera 308 for image acquisition. -
FIG. 4 is a functional block diagram of an accessory coupled to a mobile device. In general, the accessory 402 and the mobile device 404 may share various components for a three-dimensional imaging system, and may further share processing resources and/or be coordinated through a communications interface to cooperate in a three-dimensional imaging process. - The
accessory 402, or the housing of the accessory 402 (housing and accessory being used interchangeably herein, unless a different meaning is explicitly provided or otherwise clear from the context), may include a communication interface 406 configured for data communication between the accessory 402 and the mobile device 404, with a complementary communication interface 408 on the mobile device 404. In one aspect, the communication interface 406 may include hardware and/or software for any suitable wireless communication interface using, e.g., any 802.11 wireless protocol, Bluetooth, or any standardized or proprietary short-range wireless communications protocol based upon, e.g., a radio frequency, optical, acoustic, or other suitable communication medium. In another aspect, the communication interface may include a wired communication interface that couples to a data port of the predetermined mobile computing device when the housing is attached to the predetermined mobile computing device. Thus in one aspect, the communication interface 408 of the mobile device may include a data port configured for wired data communications. Contemporary mobile devices include numerous suitable physical ports including without limitation standardized ports such as USB connectors, micro-USB connectors, two or three ring plugs, and so forth, any of which may be adapted for use as a data port as contemplated herein. Similarly, many devices include proprietary arrangements of plugs, contacts, and the like for docking stations and recharging that may be adapted for use as a physical data port. - In one aspect, the
communication interface 406 of the accessory 402 may include a sensor to detect an action of the mobile computing device. This may include a data input such as a trigger, dedicated pin, or the like in the communication interface 406. In another aspect, this may include a sensor independent of the communications circuitry that couples the accessory 402 to the mobile device 404. For example, the sensor may include a sensor that detects a sound, a vibration, an illumination, or the like. In this manner, the mobile device 404 may signal the accessory 402 independent of a data communication link using any action that is (a) within the capabilities of the mobile device 404, and (b) detectable by the accessory 402. Thus, for example, when a picture is taken with the mobile device camera, a processor of the mobile device may transmit a signal through the communication interfaces 408, 406, or with a vibration or a beep, to concurrently capture a picture with a camera of the accessory 402, or to otherwise operate a shutter, illumination source, or the like concurrently with the image capture by the mobile device. Similarly, the action itself may include an autofocus, zoom, light meter reading, or other action related to image capture that can include a corresponding data signal to the accessory. - The
accessory 402 may also or instead include a camera 412 separate from the mobile device 404. As noted above, the accessory 402 may support three-dimensional processing in one respect by providing a supplemental camera to capture an image concurrently with or otherwise in addition to one or more images from the mobile device 404 for use in three-dimensional processing. Thus in one aspect a system 400 described herein includes a camera 412 within a housing of an accessory 402. An image from the camera 412 may be transmitted to the mobile device 404 through the communication interfaces 406, 408 for the mobile device 404 to perform three-dimensional processing tasks with a processor 410 or other processing circuitry of the mobile device 404. - The accessory may also or instead include a
processor 414 or other processing circuitry to support three-dimensional imaging/processing. For example, the processor 414 of the accessory 402 may receive image data from the mobile device 404, e.g., through the communication interface 406, and process the image data along with data obtained from the camera 412 of the accessory to obtain three-dimensional data. The processor 414 may also control operation of the camera 412, and/or may control operation of a camera of the mobile device 404, e.g., by communication with the mobile device 404 through the communication interfaces 406, 408. Similarly, the processor 410 of the mobile device 404 may be configured to receive a first image from the camera and a second image through the communication interface from the second camera, and further configured to process the first image and the second image to obtain three-dimensional data from an overlapping field of view of the camera and the second camera. - In one aspect it will be understood that processing circuitry for three-dimensional processing may be contained within the
mobile device 404, such as the processing circuitry 410 depicted in FIG. 4, which may be programmed for appropriate three-dimensional processing tasks, or within the accessory 402 (e.g., the processor 414), or some combination of these. In one aspect, a system 400 contemplated herein includes processing circuitry on a mobile device configured to obtain substantially concurrent images from a camera and a second camera of the mobile device. The system 400 may also or instead include processing circuitry on the mobile device configured to process such substantially concurrent images to obtain three-dimensional data from an overlapping region of the field of view of the first camera (e.g., through the optical train) and the second field of view of the second camera. - The
processor 414 of the accessory 402 may include, or be associated with, a memory that, among other things, stores a unique identifier for the accessory 402. This may be used to identify a user of the accessory 402 so that, for example, when a three-dimensional image is acquired using the accessory 402 and a mobile device 404, the three-dimensional image may be transmitted to a print server or other networked three-dimensional printing resource such as any of those described above, which may in turn automatically associate the three-dimensional image with a particular user. This may be particularly useful where, for example, the mobile device 404 includes a data network connection for Internet access to such a remote resource. In this manner, a user of the accessory may have three-dimensional images automatically uploaded to the remote resource where they can be available for printing or other manipulation by the user. In one aspect, a user interface may be provided on the mobile device 404 or on the accessory 402 for a user to authorize transmission of a captured three-dimensional image with the unique identifier to a remote resource for storage and subsequent retrieval/use. In another aspect, the user interface (again, on either the accessory 402 or the mobile device 404) may provide a “print this now” button or the like so that a physical reproduction of the three-dimensional image, or a further-processed version of the image, can be immediately queued for fabrication. The other networked printing resources, systems and methods described above may be used in various combinations to further process and/or reproduce such images on a three-dimensional printer, either automatically or under user control. - The components of the
accessory 402 may be powered by a power source for the mobile device 404, with power transferred to the housing 402 through the same electromechanical interface, e.g., a USB or plug connector, that supports data communications. The housing 402 may also or instead include a power source 416 independent of the mobile device for autonomous operation. This power source 416, which may be a battery or the like, may support operation of the camera 412 and the processing circuitry 414 of the accessory 402, and may also provide supplemental power to the mobile device 404, which may be particularly useful, for example, where data acquisition and three-dimensional processing would otherwise tend to tax a power supply of a mobile device to premature depletion. - The
accessory 402 may also include any other hardware 418 complementary to the intended use(s) of the accessory 402. For example, this may include memory such as a removable storage device (e.g., a memory card, USB drive, or the like) or internal memory for storing image data and/or processed three-dimensional data. Where three-dimensional data is captured for a specific use, the accessory 402 may also include processing circuitry adapted to convert acquired three-dimensional data into a suitable form. For example, the processing circuitry may convert raw point cloud or polygonal data into an STL format for use by a three-dimensional printer, or into a CAD file of any suitable format for further processing. In another aspect, the other hardware 418 may include local or cellular wireless communications capabilities for connecting the accessory 402 to remote resources such as a three-dimensional printer, print server, desktop computer, or other device or combination of devices useful for processing and management of printable content as contemplated herein. -
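The conversion of polygonal data to an STL format mentioned above can be sketched as follows. The function name and triangle data are illustrative assumptions; only the ASCII STL grammar (solid, facet normal, outer loop, vertex) comes from the format itself:

```python
# Hypothetical sketch: serializing polygonal data (triangles) to ASCII STL
# for a three-dimensional printer. Names and sample data are illustrative.

def triangles_to_stl(name, triangles):
    """Render a list of triangles (each three (x, y, z) vertices) as ASCII STL."""
    lines = [f"solid {name}"]
    for v0, v1, v2 in triangles:
        # Compute the facet normal from the cross product of two edges.
        ux, uy, uz = (v1[i] - v0[i] for i in range(3))
        vx, vy, vz = (v2[i] - v0[i] for i in range(3))
        n = (uy * vz - uz * vy, uz * vx - ux * vz, ux * vy - uy * vx)
        lines.append(f"  facet normal {n[0]} {n[1]} {n[2]}")
        lines.append("    outer loop")
        for v in (v0, v1, v2):
            lines.append(f"      vertex {v[0]} {v[1]} {v[2]}")
        lines.append("    endloop")
        lines.append("  endfacet")
    lines.append(f"endsolid {name}")
    return "\n".join(lines)

# One right triangle in the z = 0 plane; its facet normal is (0, 0, 1).
stl = triangles_to_stl("scan", [((0, 0, 0), (1, 0, 0), (0, 1, 0))])
print(stl.splitlines()[0])
```

A production converter would additionally normalize the normals and emit binary STL for large scans, but the structure above is the whole of the ASCII format.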
FIG. 5 shows a three-dimensional imaging system with dual optical paths. An accessory 502 coupled to a mobile device 504 may in general include an optical train to direct optical paths to cameras of the accessory 502 and/or mobile device 504. For example, a first optical path 506 within the accessory 502 may direct an image from a lens of the accessory 502 to a first camera 508 of the mobile device 504. A second optical path 510 within the accessory 502 may direct an image from another lens or opening of the accessory 502 to a second camera 512 of the mobile device 504. While the first optical path 506 provides an optical coupling to a first field of view (indicated generally by an arrow 520), the second optical path 510 may provide an optical coupling to a second field of view (indicated generally by a second arrow 522) different from the first field of view. In this manner, stereoscopic imaging or other three-dimensional imaging techniques based upon image differentiation may be employed with multiple cameras of the mobile device 504. Each optical path 506, 510 thus couples optics of the accessory 502 to a corresponding camera 508, 512 of the mobile device 504. - It will be appreciated that the first
optical path 506 may be readily omitted where the camera 508 has a field of view that can be overlapped with the second field of view. Alternatively, even in this configuration, the second optical path 510 may be included to provide supplemental optics such as focusing lenses, scaling lenses, a controllable shutter or aperture, and so forth. - The
accessory 502 may optionally include a supplemental light source 528 positioned to illuminate the field of view and/or the second field of view. The supplemental light source 528 may be a strobe, flash, high-intensity light, or other light source useful for photographic illumination. The supplemental light source 528 may also or instead include a structured light source that can provide illumination using predetermined patterns of light that can be imaged and processed to derive three-dimensional data. - In one aspect, the supplemental
light source 528 may serve as an illumination source to illuminate a field of view of one of the cameras 508, 512. A fixture 530, such as a lens, mirrors, or other mechanical and/or optical beam steering elements, may be provided to move the illumination source in a predetermined pattern. The illumination source may, for example, include a laser light source, a light emitting diode, an incandescent light source, or combinations of the foregoing. In another aspect, the supplemental light source 528 may include a plurality of illumination sources coupled to the housing, each one of the plurality of illumination sources having a different pose relative to the predetermined mobile computing device. In this manner, any reconstruction technique based upon directional lighting and/or different patterns of light may be usefully implemented using a number of separately controllable illumination sources coupled to the housing of the accessory 502. -
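Because the housing fixes the illumination source at a known pose relative to the camera, structured light enables triangulation. A simplified 2D sketch, under the assumptions that the baseline between the light source and camera is known and both angles are measured from the baseline:

```python
# Illustrative 2D triangulation sketch for a structured light source at a
# known pose relative to the camera. All values are illustrative assumptions.
import math

def depth_by_triangulation(baseline_m: float, source_angle_rad: float,
                           camera_angle_rad: float) -> float:
    """Depth of the illuminated point, with both ray angles measured from
    the baseline joining the light source and the camera."""
    # The two rays meet where b = z * (cot(source) + cot(camera)).
    return baseline_m / (1 / math.tan(source_angle_rad) + 1 / math.tan(camera_angle_rad))

# Symmetric 45-degree rays over a 10 cm baseline intersect at 5 cm depth.
print(depth_by_triangulation(0.10, math.pi / 4, math.pi / 4))
```

With a laser line swept by the fixture 530, the same relation applied per illuminated pixel yields a depth profile along the line.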
FIG. 6 shows a three-dimensional imaging system with dual optical paths. In the embodiment of FIG. 6, a first optical path 602 of an accessory 600 optically couples a camera 604 of a mobile device 605 to a first lens 606, and a second optical path 608 optically couples the camera 604 to a second lens 610 that provides a pose that is offset from the first lens 606. An optical switch 612, such as a moveable mirror, a surface with controllable reflectivity, controllable mirrors and apertures, or any other hardware that can controllably select between the optical paths 602, 608, may be provided so that the accessory 600 can controllably direct the camera 604 toward the first lens 606 or the second lens 610. A processor on the mobile device may, for example, be configured (e.g., by programming) to control the camera 604 and the optical switch 612 to capture temporally adjacent images from a field of view of the first lens 606 and a second field of view of the second lens 610 with the camera 604. In another aspect, the processor may be configured to process the temporally adjacent images to obtain three-dimensional data from an overlapping region of the field of view and the second field of view. - While dual optical path systems are described, it will be understood that any number of additional paths and supporting hardware/software may be used to capture additional views of an object, such as to resolve spatial ambiguities, address occlusions, and otherwise improve three-dimensional processing as contemplated herein.
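A minimal sketch of the single-camera, dual-path scheme of FIG. 6: a stand-in optical switch selects which lens feeds the camera, and two temporally adjacent captures provide the offset views. All class and function names are hypothetical:

```python
# Illustrative sketch (all names hypothetical) of the FIG. 6 scheme: an
# optical switch selects which optical path feeds the single camera, and
# two temporally adjacent captures provide offset views of the scene.

class OpticalSwitch:
    """Stand-in for a moveable mirror or other controllable path selector."""
    def __init__(self):
        self.path = 1

    def select(self, path: int) -> None:
        if path not in (1, 2):
            raise ValueError("this accessory has only two optical paths")
        self.path = path

def capture_pair(camera, switch):
    """Capture temporally adjacent images, one through each optical path."""
    switch.select(1)              # e.g., couple the camera to the first lens
    first = camera(switch.path)
    switch.select(2)              # then to the second, offset lens
    second = camera(switch.path)
    return first, second

# A stand-in camera that records which optical path produced each image.
images = capture_pair(lambda path: f"image-through-path-{path}", OpticalSwitch())
print(images)
```

The same sequencing generalizes to any number of additional paths by extending the switch's selectable positions.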
-
FIG. 7 shows a method for using an accessory to capture three-dimensional images with a mobile device. - As shown in
step 702, the method 700 may begin with attaching an accessory such as any of the accessories described above to a mobile device. The accessory may be removably and replaceably attachable to the mobile device, and may include a camera and a communication interface for communications with the mobile device. - As shown in step 704, the method 700 may include capturing a first image with the camera. - As shown in step 706, the method 700 may include capturing a second image substantially concurrently with the first image using a second camera of the mobile device, wherein the second camera has an overlapping field of view with the camera. - As shown in step 708, the method 700 may include transmitting the first image to the mobile device through the communication interface. In another aspect, this step may include transmitting the second image to the accessory, where subsequent three-dimensional processing is performed on a processor of the accessory. - As shown in step 710, the method may include processing the first image and the second image on the mobile device to obtain three-dimensional data from the overlapping field of view.
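The processing of step 710 can be illustrated with a toy correspondence matcher. The row data, window size, and search range below are illustrative assumptions; real pipelines operate on full rectified images:

```python
# Hypothetical sketch: finding the disparity between two overlapping images
# by sum-of-absolute-differences matching on one scanline. Disparity is the
# raw ingredient from which depth (three-dimensional data) is computed.

def best_disparity(left_row, right_row, window: int = 3, max_d: int = 8) -> int:
    """Pixel shift that best aligns a patch of the left row with the right row."""
    center = len(left_row) // 2
    patch = left_row[center:center + window]
    best_d, best_cost = 0, float("inf")
    for d in range(max_d):
        start = center - d
        if start < 0:
            break
        cost = sum(abs(a - b) for a, b in zip(patch, right_row[start:start + window]))
        if cost < best_cost:
            best_d, best_cost = d, cost
    return best_d

# A bright feature at index 8 in the left row appears shifted by 3 in the right.
left  = [0] * 8 + [9, 9, 9] + [0] * 5
right = [0] * 5 + [9, 9, 9] + [0] * 8
print(best_disparity(left, right))  # 3
```

Repeating this per pixel and converting each disparity to depth (using the known baseline between the accessory camera and the mobile device camera) yields the three-dimensional data of step 710.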
-
FIG. 8 shows a method for using an accessory to capture three-dimensional images with a mobile device. As described below, an accessory may provide multiple optical paths, along with an optical switch that can be controlled to selectively expose a camera of a mobile device to different poses relative to an object in a field of view. The two (or more) resulting images may be processed to extract three-dimensional data. - As shown in
step 802, the method 800 may include attaching an accessory to a mobile device having a camera. As described above, the accessory may be removably and replaceably attachable to the mobile device, and the accessory may include an optical train with a first optical path for the camera to a first field of view and a second optical path for the camera to a second field of view having an overlapping field of view with the first field of view. The accessory may include an optical switch configured to selectively switch between the respective optical paths. - As shown in step 804, the method 800 may include selecting the first optical path, such as by controlling the optical switch accordingly with a control signal from processing circuitry of the accessory. - As shown in step 806, the method 800 may include capturing a first image with the camera, such as through the selected first optical path. - As shown in step 808, the method 800 may include selecting the second optical path, such as by controlling the optical switch accordingly with a control signal from the processing circuitry of the accessory. - As shown in step 810, the method 800 may include capturing a second image with the camera, e.g., through the selected second optical path. - As shown in step 812, the method 800 may include processing the first image and the second image on the mobile device to obtain three-dimensional data from the overlapping field of view provided by the two optical paths. -
FIG. 9 shows a method for using an accessory to capture three-dimensional images with a mobile device. As described below, an accessory may provide two optical paths for two different cameras of the mobile device, which optical paths may serve to direct the two cameras toward overlapping fields of view of an object. In this manner, images may be captured using the two cameras and provided to a processor for extraction of three-dimensional data. The two cameras may advantageously be operated concurrently or substantially concurrently in order to avoid temporally-based changes in a shape or position of the object that might otherwise require additional processing for accurate extraction of three-dimensional data. - As shown in
step 902, the method 900 may include attaching an accessory to a mobile device. The accessory may be removably and replaceably attachable to the mobile device as described above, and the accessory may include an optical train with a first optical path from a first camera of the mobile device to a first field of view and a second optical path from a second camera of the mobile device to a second field of view having an overlapping field of view with the first field of view. - As shown in
step 904, the method may include capturing a first image with the first camera. - As shown in
step 906, the method may include capturing a second image with the second camera. The second image may be captured substantially concurrently with the first image. That is, the first image and the second image may be captured sufficiently close in time to prevent substantial movement of an object within the overlapping field of view relative to the cameras. In one aspect, this may include operating the first camera and the second camera concurrently, or as close to concurrently as possible based upon the hardware and processing capabilities of the mobile device. - As shown in
step 908, the method may include processing the first image and the second image on the mobile device to obtain three-dimensional data from the overlapping field of view. - The methods or processes described above, and steps thereof, may be realized in hardware, software, or any combination of these suitable for a particular application. The hardware may include a general-purpose computer and/or dedicated computing device. The processes may be realized in one or more microprocessors, microcontrollers, embedded microcontrollers, programmable digital signal processors, or other programmable devices, along with internal and/or external memory. The processes may also, or instead, be embodied in an application-specific integrated circuit, a programmable gate array, programmable array logic, or any other device or combination of devices that may be configured to process electronic signals. It will further be appreciated that one or more of the processes may be realized as computer executable code created using a structured programming language such as C, an object-oriented programming language such as C++, or any other high-level or low-level programming language (including assembly languages, hardware description languages, and database programming languages and technologies) that may be stored, compiled or interpreted to run on one of the above devices, as well as heterogeneous combinations of processors, processor architectures, or combinations of different hardware and software.
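As one concrete instance of such computer executable code, the "substantially concurrent" capture criterion used throughout method 900 might be sketched as a timestamp test. The 50 ms threshold is an illustrative assumption; the appropriate bound depends on scene motion and camera geometry:

```python
# Hypothetical sketch: two capture timestamps are accepted for pairing only
# if an object could not have moved appreciably between them. The 50 ms
# default skew tolerance is an illustrative assumption, not a specification.

def substantially_concurrent(t_first_s: float, t_second_s: float,
                             max_skew_s: float = 0.05) -> bool:
    """True if the two captures are close enough in time to pair for 3D processing."""
    return abs(t_first_s - t_second_s) <= max_skew_s

print(substantially_concurrent(10.000, 10.030))  # captures 30 ms apart
print(substantially_concurrent(10.000, 10.200))  # captures 200 ms apart
```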
- Thus, in one aspect, each method described above and combinations thereof may be embodied in computer executable code that, when executing on one or more computing devices, performs the steps thereof. In another aspect, the methods may be embodied in systems that perform the steps thereof, and may be distributed across devices in a number of ways, or all of the functionality may be integrated into a dedicated, standalone device or other hardware. In another aspect, means for performing the steps associated with the processes described above may include any of the hardware and/or software described above. All such permutations and combinations are intended to fall within the scope of the present disclosure.
- It should further be appreciated that the methods above are provided by way of example. Absent an explicit indication to the contrary, the disclosed steps may be modified, supplemented, omitted, and/or re-ordered without departing from the scope of this disclosure.
- The method steps of the invention(s) described herein are intended to include any suitable method of causing such method steps to be performed, consistent with the patentability of the following claims, unless a different meaning is expressly provided or otherwise clear from the context. So for example performing the step of X includes any suitable method for causing another party such as a remote user, a remote processing resource (e.g., a server or cloud computer) or a machine to perform the step of X. Similarly, performing steps X, Y and Z may include any method of directing or controlling any combination of such other individuals or resources to perform steps X, Y and Z to obtain the benefit of such steps.
- While particular embodiments of the present invention have been shown and described, it will be apparent to those skilled in the art that various changes and modifications in form and details may be made therein without departing from the spirit and scope of this disclosure and are intended to form a part of the invention as defined by the following claims, which are to be interpreted in the broadest sense allowable by law.
Claims (20)
1. A system for three-dimensional imaging with a mobile device, the system comprising:
a housing with a mechanical interface configured to removably and replaceably attach to a predetermined mobile computing device in a predetermined orientation; and an illumination source coupled to the housing that illuminates a field of view of a camera of the predetermined mobile computing device from a predetermined pose.
2. The system of claim 1 wherein the illumination source is a structured light source that projects a predetermined pattern of light from the predetermined pose.
3. The system of claim 2 wherein the predetermined pattern of light includes one or more lines.
4. The system of claim 2 wherein the predetermined pattern of light includes one or more shapes.
5. The system of claim 1 further comprising a fixture on the housing to move the illumination source in a predetermined pattern.
6. The system of claim 1 wherein the illumination source includes a laser light source.
7. The system of claim 1 wherein the illumination source includes a light emitting diode.
8. The system of claim 1 further comprising a plurality of illumination sources coupled to the housing, each one of the plurality of illumination sources having a different pose relative to the predetermined mobile computing device.
9. The system of claim 1 further comprising a communication interface in the housing configured for data communications between the housing and the predetermined mobile computing device.
10. The system of claim 9 wherein the communication interface includes a wired communication interface that couples to a data port of the predetermined mobile computing device when the housing is attached to the predetermined mobile computing device.
11. The system of claim 9 wherein the communication interface includes a wireless communication interface.
12. The system of claim 9 wherein the communication interface includes a sensor to detect an action of the predetermined mobile computing device.
13. The system of claim 9 further comprising processing circuitry to obtain three-dimensional data from one or more images acquired by the camera of the predetermined mobile computing device.
14. The system of claim 13 wherein the processing circuitry is within the housing.
15. The system of claim 13 wherein the processing circuitry is within the predetermined mobile computing device.
16. The system of claim 1 wherein the housing includes a power source independent from the predetermined mobile computing device.
17. The system of claim 1 further comprising an optical train in the housing, wherein the optical train includes a first optical path that optically couples the camera to the field of view and a second optical path that optically couples a second camera of the predetermined mobile computing device to a second field of view from a different pose than the field of view.
18. The system of claim 1 further comprising an optical train in the housing, wherein the optical train includes a first optical path that optically couples the camera to the field of view and a second optical path that optically couples the camera to a second field of view from a different pose than the field of view, the optical train further including an optical switch that selectively couples the camera to the first optical path and the second optical path.
19. The system of claim 1 further comprising a second camera in the housing wherein the housing includes processing circuitry to control the second camera in response to a control signal received through a communication interface to the predetermined mobile computing device.
20. The system of claim 19 wherein the predetermined mobile computing device includes processing circuitry configured to receive a first image from the camera and a second image through the communication interface from the second camera, and further configured to process the first image and the second image to obtain three-dimensional data from an overlapping field of view of the camera and the second camera.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US13/737,579 US20140043442A1 (en) | 2012-08-08 | 2013-01-09 | Mobile device accessory for three-dimensional scanning |
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201261680989P | 2012-08-08 | 2012-08-08 | |
US201261719874P | 2012-10-29 | 2012-10-29 | |
US13/736,210 US20140043441A1 (en) | 2012-08-08 | 2013-01-08 | Mobile device accessory for three-dimensional scanning |
US13/737,579 US20140043442A1 (en) | 2012-08-08 | 2013-01-09 | Mobile device accessory for three-dimensional scanning |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/736,210 Continuation US20140043441A1 (en) | 2012-08-08 | 2013-01-08 | Mobile device accessory for three-dimensional scanning |
Publications (1)
Publication Number | Publication Date |
---|---|
US20140043442A1 true US20140043442A1 (en) | 2014-02-13 |
Family
ID=50065905
Family Applications (3)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/691,230 Active 2033-03-11 US9473760B2 (en) | 2012-08-08 | 2012-11-30 | Displays for three-dimensional printers |
US13/736,210 Abandoned US20140043441A1 (en) | 2012-08-08 | 2013-01-08 | Mobile device accessory for three-dimensional scanning |
US13/737,579 Abandoned US20140043442A1 (en) | 2012-08-08 | 2013-01-09 | Mobile device accessory for three-dimensional scanning |
Family Applications Before (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/691,230 Active 2033-03-11 US9473760B2 (en) | 2012-08-08 | 2012-11-30 | Displays for three-dimensional printers |
US13/736,210 Abandoned US20140043441A1 (en) | 2012-08-08 | 2013-01-08 | Mobile device accessory for three-dimensional scanning |
Country Status (1)
Country | Link |
---|---|
US (3) | US9473760B2 (en) |
Cited By (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103916601A (en) * | 2014-04-10 | 2014-07-09 | 深圳先进技术研究院 | Three-dimensional scanning device based on mobile device and three-dimensional reconstruction method of three-dimensional scanning device |
USD717304S1 (en) * | 2012-03-26 | 2014-11-11 | Patientsafe Solutions, Inc. | Scanning jacket for a handheld device |
USD719167S1 (en) * | 2014-08-22 | 2014-12-09 | Patientsafe Solutions, Inc. | Scanning jacket for a handheld personal digital assistant (PDA) device |
USD719166S1 (en) * | 2014-08-22 | 2014-12-09 | Patientsafe Solutions, Inc. | Scanning jacket for a handheld phone device |
USD761261S1 (en) * | 2015-06-09 | 2016-07-12 | Teco Image Systems Co., Ltd | Handheld scanner |
CN106626393A (en) * | 2016-12-27 | 2017-05-10 | 张海涛 | 3D (Three-dimensional) printing mobile scanning device |
US9989999B2 (en) | 2011-03-31 | 2018-06-05 | Patientsafe Solutions, Inc. | Method of scanning codes and processing data with handheld scanning jacket |
Families Citing this family (53)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP6071482B2 (en) * | 2012-11-29 | 2017-02-01 | キヤノン株式会社 | Information processing apparatus, information processing system, control method therefor, and program |
US20140232035A1 (en) * | 2013-02-19 | 2014-08-21 | Hemant Bheda | Reinforced fused-deposition modeling |
USD754763S1 (en) * | 2013-03-15 | 2016-04-26 | Arburg Gmbh + Co. Kg | Device for producing three-dimensional articles |
EP3008453A4 (en) * | 2013-06-10 | 2017-01-18 | Relevant Play, LLC. | Systems and methods for infrared detection |
US9814037B2 (en) | 2013-06-28 | 2017-11-07 | Intel Corporation | Method for efficient channel estimation and beamforming in FDD system by exploiting uplink-downlink correspondence |
US10016940B2 (en) * | 2013-08-23 | 2018-07-10 | Xyzprinting, Inc. | Three-dimensional printing apparatus |
USD745903S1 (en) * | 2013-10-10 | 2015-12-22 | Michael Daniel Armani | Three-dimensional printer frame |
WO2015057886A1 (en) | 2013-10-15 | 2015-04-23 | Wolf And Associates, Inc. | Three-dimensional printer systems and methods |
US9931776B2 (en) * | 2013-12-12 | 2018-04-03 | United Technologies Corporation | Methods for manufacturing fiber-reinforced polymeric components |
US20150172773A1 (en) * | 2013-12-18 | 2015-06-18 | United Video Properties, Inc. | Systems and methods for selectively printing three-dimensional objects within media assets |
USD776174S1 (en) * | 2014-01-02 | 2017-01-10 | 3D Systems, Inc. | Three-dimensional printer frame |
USD733196S1 (en) | 2014-02-03 | 2015-06-30 | Wolf And Associates, Inc. | 3D printer enclosure |
US10512552B2 (en) | 2014-03-25 | 2019-12-24 | Biobots, Inc. | Methods, devices, and systems for the fabrication of materials and tissues utilizing electromagnetic radiation |
DE102014004692A1 (en) * | 2014-03-31 | 2015-10-15 | Voxeljet Ag | Method and apparatus for 3D printing with conditioned process control |
JP6416495B2 (en) * | 2014-04-28 | 2018-10-31 | ローランドディー.ジー.株式会社 | 3D modeling apparatus and 3D modeling method |
NL2013141B1 (en) * | 2014-07-07 | 2016-09-20 | Dingify B V | 3D printer, system comprising such 3D printer and method for operating both. |
JP1523962S (en) | 2014-10-21 | 2015-05-18 | ||
JP1523475S (en) * | 2014-10-21 | 2015-05-18 | ||
USD771164S1 (en) * | 2014-11-21 | 2016-11-08 | Sina Noorazar | Printer for printing 3-dimensional objects |
USD757132S1 (en) * | 2015-01-04 | 2016-05-24 | Xyzprinting, Inc. | 3D printer |
CN105815424B (en) * | 2015-01-05 | 2020-02-07 | 三纬国际立体列印科技股份有限公司 | Three-dimensional printing device |
USD768214S1 (en) * | 2015-01-30 | 2016-10-04 | Sti Co., Ltd. | 3D printer |
USD763330S1 (en) * | 2015-02-27 | 2016-08-09 | Natural Machines, Inc. | Three dimensional printer |
USD760306S1 (en) * | 2015-03-20 | 2016-06-28 | Wolf & Associates, Inc. | 3D printer enclosure |
USD760825S1 (en) * | 2015-03-25 | 2016-07-05 | Biobots, Inc. | Bioprinter |
US10265911B1 (en) * | 2015-05-13 | 2019-04-23 | Marvell International Ltd. | Image-based monitoring and feedback system for three-dimensional printing |
USD765745S1 (en) * | 2015-06-01 | 2016-09-06 | Well Smart Electronics Limited | 3D printer |
KR101953085B1 (en) * | 2015-06-23 | 2019-02-28 | 캐논코리아비즈니스솔루션 주식회사 | Three-dimensional Printer with apparatus which create digital hologram |
USD766998S1 (en) * | 2015-07-28 | 2016-09-20 | Xyzprinting, Inc. | 3D printing apparatus |
USD765154S1 (en) * | 2015-10-27 | 2016-08-30 | Nano-Dimensions Technologies LTD. | 3D printer |
USD777809S1 (en) * | 2016-01-05 | 2017-01-31 | Xyzprinting, Inc. | 3D printer |
USD777227S1 (en) * | 2016-01-05 | 2017-01-24 | Xyzprinting, Inc. | Stereolithography machine |
USD791203S1 (en) * | 2016-03-15 | 2017-07-04 | Shenzhen Longer 3D Technology Co., Ltd. | 3D printer |
US10994480B2 (en) | 2016-06-08 | 2021-05-04 | Wolf & Associates, Inc. | Three-dimensional printer systems and methods |
EP3475873A4 (en) | 2016-06-24 | 2020-02-19 | Relevant Play, LLC. | Authenticable digital code and associated systems and methods |
JP6827741B2 (en) * | 2016-08-31 | 2021-02-10 | キヤノン株式会社 | Information processing equipment, control methods, and programs |
DE102016119868A1 (en) * | 2016-10-18 | 2018-04-19 | Reifenhäuser GmbH & Co. KG Maschinenfabrik | Plastic extrusion line with mobile user interface and method for operating this plastic extrusion line |
CN109791299B (en) * | 2017-01-25 | 2021-07-02 | Hewlett-Packard Development Company, L.P. | Device for transmitting light, accessory for transmitting light and manufacturing method thereof |
WO2018165155A1 (en) * | 2017-03-09 | 2018-09-13 | Walmart Apollo, Llc | System and methods for three dimensional printing with blockchain controls |
US10300651B2 (en) | 2017-04-11 | 2019-05-28 | Innosun Llc | Portable 3D printer |
USD826296S1 (en) | 2017-04-11 | 2018-08-21 | Innosun Llc | Printer for printing three-dimensional objects |
WO2019022701A1 (en) | 2017-07-24 | 2019-01-31 | Hewlett-Packard Development Company, L.P. | Imaging with job ticket |
US10828723B2 (en) | 2017-11-13 | 2020-11-10 | General Electric Company | Process monitoring for mobile large scale additive manufacturing using foil-based build materials |
US10894299B2 (en) | 2017-11-13 | 2021-01-19 | General Electric Company | Fixed bed large scale additive manufacturing using foil-based build materials |
US11364564B2 (en) | 2017-11-13 | 2022-06-21 | General Electric Company | Mobile large scale additive manufacturing using foil-based build materials |
US10828724B2 (en) | 2017-11-13 | 2020-11-10 | General Electric Company | Foil part vectorization for mobile large scale additive manufacturing using foil-based build materials |
US10747477B2 (en) * | 2017-11-17 | 2020-08-18 | Canon Kabushiki Kaisha | Print control system that transmit to a registered printing apparatus, a change instruction for changing a setting of the power of the registered printing apparatus, and related method |
USD864262S1 (en) | 2018-04-24 | 2019-10-22 | Innosun Llc | Three-dimensional (3d) printing system and associated pen |
WO2020005211A1 (en) | 2018-06-26 | 2020-01-02 | Hewlett-Packard Development Company, L.P. | Generating downscaled images |
USD871463S1 (en) * | 2018-08-02 | 2019-12-31 | Jiangsu Wiiboox Technology Co., Ltd. | 3D printer |
JP2020087347A (en) * | 2018-11-30 | 2020-06-04 | Ricoh Co., Ltd. | Voice operation system, voice operation method, and voice operation program |
US20220080655A1 (en) * | 2018-12-20 | 2022-03-17 | Jabil Inc. | Apparatus, system and method of combining additive manufacturing print types |
JP2022123699A (en) * | 2021-02-12 | 2022-08-24 | Toshiba Tec Corporation | Information processing terminal and program |
Citations (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5585789A (en) * | 1992-05-11 | 1996-12-17 | Sharp Kabushiki Kaisha | Data communication apparatus |
US5633705A (en) * | 1994-05-26 | 1997-05-27 | Mitsubishi Denki Kabushiki Kaisha | Obstacle detecting system for a motor vehicle |
US5778268A (en) * | 1996-07-12 | 1998-07-07 | Inaba; Minoru | Stereo camera |
US5870220A (en) * | 1996-07-12 | 1999-02-09 | Real-Time Geometry Corporation | Portable 3-D scanning system and method for rapid shape digitizing and adaptive mesh generation |
US6979093B2 (en) * | 2003-12-16 | 2005-12-27 | Wen-Feng Tsay | Accessory illuminating device of mobile phone |
US20060210146A1 (en) * | 2005-01-07 | 2006-09-21 | Jin Gu | Creating 3D images of objects by illuminating with infrared patterns |
US20070147827A1 (en) * | 2005-12-28 | 2007-06-28 | Arnold Sheynman | Methods and apparatus for wireless stereo video streaming |
US20090181729A1 (en) * | 2008-01-04 | 2009-07-16 | Griffin Jr Paul P | Device case with optical lenses |
US20100255876A1 (en) * | 2009-04-03 | 2010-10-07 | Ubiquity Holdings | Medical scan clip on |
US20120026298A1 (en) * | 2010-03-24 | 2012-02-02 | Filo Andrew S | Apparatus and method for producing images for stereoscopic viewing |
US20120236424A1 (en) * | 2011-03-16 | 2012-09-20 | Lumos Technology Co., Ltd. | Lens connection module and connection adapter for same |
US20120270600A1 (en) * | 2011-04-25 | 2012-10-25 | Steve Terry Zelson | Case for portable electronic device |
US20130293684A1 (en) * | 2011-04-15 | 2013-11-07 | Faro Technologies, Inc. | Three-dimensional coordinate scanner and method of operation |
US8593745B2 (en) * | 2011-03-18 | 2013-11-26 | Premier Systems Usa, Inc. | Lenses for communication devices |
US8678601B2 (en) * | 2011-11-18 | 2014-03-25 | Robert Lee Morris | Collapsible light modifier for portable flash |
Family Cites Families (31)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5776409A (en) * | 1988-04-18 | 1998-07-07 | 3D Systems, Inc. | Thermal stereolithography using slice techniques |
US5545367A (en) * | 1992-04-15 | 1996-08-13 | Soane Technologies, Inc. | Rapid prototype three dimensional stereolithography |
US6896839B2 (en) * | 2001-02-07 | 2005-05-24 | Minolta Co., Ltd. | Three-dimensional molding apparatus and three-dimensional molding method |
DE20106887U1 (en) * | 2001-04-20 | 2001-09-06 | Envision Technologies Gmbh | Device for producing a three-dimensional object |
SE524420C2 (en) * | 2002-12-19 | 2004-08-10 | Arcam Ab | Apparatus and method for making a three-dimensional product |
US20050012246A1 (en) * | 2003-06-19 | 2005-01-20 | Kazutora Yoshino | High resolution and rapid three dimensional object generator |
US20050074596A1 (en) * | 2003-10-06 | 2005-04-07 | Nielsen Jeffrey A. | Method and system for using porous structures in solid freeform fabrication |
US7614866B2 (en) * | 2007-01-17 | 2009-11-10 | 3D Systems, Inc. | Solid imaging apparatus and method |
ATE553910T1 (en) * | 2007-07-04 | 2012-05-15 | Envisiontec Gmbh | METHOD AND DEVICE FOR PRODUCING A THREE-DIMENSIONAL OBJECT |
JP5293993B2 (en) * | 2008-01-09 | 2013-09-18 | Sony Corporation | Stereolithography apparatus and stereolithography method |
US8326024B2 (en) * | 2009-04-14 | 2012-12-04 | Global Filtration Systems | Method of reducing the force required to separate a solidified object from a substrate |
US8922625B2 (en) * | 2009-11-19 | 2014-12-30 | Lg Electronics Inc. | Mobile terminal and controlling method thereof |
JP5593736B2 (en) * | 2010-03-02 | 2014-09-24 | Seiko Epson Corporation | Modeling method and modeling apparatus |
JP2011241450A (en) * | 2010-05-19 | 2011-12-01 | Keijiro Yamamoto | Layered manufacturing method and layered manufacturing apparatus |
US9022769B2 (en) * | 2010-07-22 | 2015-05-05 | Stratasys, Inc. | Multiple-zone liquefier assembly for extrusion-based additive manufacturing systems |
US8414280B2 (en) * | 2010-08-18 | 2013-04-09 | Makerbot Industries, Llc | Networked three-dimensional printing |
US20120092724A1 (en) * | 2010-08-18 | 2012-04-19 | Pettis Nathaniel B | Networked three-dimensional printing |
US8668859B2 (en) * | 2010-08-18 | 2014-03-11 | Makerbot Industries, Llc | Automated 3D build processes |
WO2012088257A1 (en) * | 2010-12-22 | 2012-06-28 | Stratasys, Inc. | Print head assembly and print head for use in fused deposition modeling system |
US8274552B2 (en) * | 2010-12-27 | 2012-09-25 | 3Dmedia Corporation | Primary and auxiliary image capture devices for image processing and related methods |
WO2012143923A2 (en) * | 2011-04-17 | 2012-10-26 | Objet Ltd. | System and method for additive manufacturing of an object |
TWI448732B (en) * | 2012-05-03 | 2014-08-11 | Young Optics Inc | Three-dimensional printing apparatus |
US9463598B2 (en) * | 2012-05-22 | 2016-10-11 | Makerbot Industries, Llc | In-filling for additive manufacturing |
US9421716B2 (en) * | 2012-08-08 | 2016-08-23 | Makerbot Industries, Llc | Photo booth for three-dimensional images |
US20140120196A1 (en) * | 2012-10-29 | 2014-05-01 | Makerbot Industries, Llc | Quick-release extruder |
US9332167B1 (en) * | 2012-11-20 | 2016-05-03 | Amazon Technologies, Inc. | Multi-directional camera module for an electronic device |
WO2014144482A1 (en) * | 2013-03-15 | 2014-09-18 | Matterfab Corp. | Apparatus and methods for manufacturing |
US20150102531A1 (en) * | 2013-10-11 | 2015-04-16 | Global Filtration Systems, A Dba Of Gulf Filtration Systems Inc. | Apparatus and method for forming three-dimensional objects using a curved build platform |
US20150103146A1 (en) * | 2013-10-16 | 2015-04-16 | Qualcomm Incorporated | Conversion of at least one non-stereo camera into a stereo camera |
JP5971266B2 (en) * | 2014-01-22 | 2016-08-17 | Toyota Motor Corporation | Stereolithography apparatus and stereolithography method |
US9527244B2 (en) * | 2014-02-10 | 2016-12-27 | Global Filtration Systems | Apparatus and method for forming three-dimensional objects from solidifiable paste |
2012
- 2012-11-30 US US13/691,230 patent/US9473760B2/en active Active

2013
- 2013-01-08 US US13/736,210 patent/US20140043441A1/en not_active Abandoned
- 2013-01-09 US US13/737,579 patent/US20140043442A1/en not_active Abandoned
Cited By (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9989999B2 (en) | 2011-03-31 | 2018-06-05 | Patientsafe Solutions, Inc. | Method of scanning codes and processing data with handheld scanning jacket |
USD717304S1 (en) * | 2012-03-26 | 2014-11-11 | Patientsafe Solutions, Inc. | Scanning jacket for a handheld device |
CN103916601A (en) * | 2014-04-10 | 2014-07-09 | 深圳先进技术研究院 | Three-dimensional scanning device based on mobile device and three-dimensional reconstruction method of three-dimensional scanning device |
USD719167S1 (en) * | 2014-08-22 | 2014-12-09 | Patientsafe Solutions, Inc. | Scanning jacket for a handheld personal digital assistant (PDA) device |
USD719166S1 (en) * | 2014-08-22 | 2014-12-09 | Patientsafe Solutions, Inc. | Scanning jacket for a handheld phone device |
USD761261S1 (en) * | 2015-06-09 | 2016-07-12 | Teco Image Systems Co., Ltd | Handheld scanner |
CN106626393A (en) * | 2016-12-27 | 2017-05-10 | 张海涛 | 3D (Three-dimensional) printing mobile scanning device |
Also Published As
Publication number | Publication date |
---|---|
US20140043441A1 (en) | 2014-02-13 |
US9473760B2 (en) | 2016-10-18 |
US20140043630A1 (en) | 2014-02-13 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20140043442A1 (en) | Mobile device accessory for three-dimensional scanning | |
US10150248B2 (en) | Downloadable three-dimensional models | |
US11167464B2 (en) | Tagged build material for three-dimensional printing | |
US11599685B2 (en) | Detection and use of printer configuration information | |
US9421716B2 (en) | Photo booth for three-dimensional images | |
US10908849B2 (en) | Networked three-dimensional printing | |
US9626142B2 (en) | Automated model selection |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment | Owner name: MAKERBOT INDUSTRIES, LLC, NEW YORK. Free format text: ASSIGNMENT OF ASSIGNORS INTEREST; ASSIGNORS: BUSER, ANTHONY JAMES; DOUGLAS, ARIEL; PAX, CHARLES E.; SIGNING DATES FROM 20130116 TO 20130129; REEL/FRAME: 032414/0227 |
AS | Assignment | Owner name: MAKERBOT INDUSTRIES, LLC, NEW YORK. Free format text: ASSIGNMENT OF ASSIGNORS INTEREST; ASSIGNOR: BORENSTEIN, GREGORY ALAN; REEL/FRAME: 032413/0890. Effective date: 20120816 |
STCB | Information on status: application discontinuation | Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |