GB2520255A - Method and apparatus for 3D Printing - Google Patents

Method and apparatus for 3D Printing

Info

Publication number
GB2520255A
GB2520255A (application GB1319944.3A)
Authority
GB
United Kingdom
Prior art keywords
model
customer
scanner
controller
printing
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
GB1319944.3A
Other versions
GB201319944D0 (en)
Inventor
Philip Alexander Stout
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Asda Stores Ltd
Original Assignee
Asda Stores Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Asda Stores Ltd filed Critical Asda Stores Ltd
Priority to GB1319944.3A priority Critical patent/GB2520255A/en
Publication of GB201319944D0 publication Critical patent/GB201319944D0/en
Publication of GB2520255A publication Critical patent/GB2520255A/en
Withdrawn legal-status Critical Current

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01BMEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00Measuring arrangements characterised by the use of optical techniques
    • G01B11/24Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
    • G01B11/25Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object
    • G01B11/2518Projection by scanning of the object
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B29WORKING OF PLASTICS; WORKING OF SUBSTANCES IN A PLASTIC STATE IN GENERAL
    • B29CSHAPING OR JOINING OF PLASTICS; SHAPING OF MATERIAL IN A PLASTIC STATE, NOT OTHERWISE PROVIDED FOR; AFTER-TREATMENT OF THE SHAPED PRODUCTS, e.g. REPAIRING
    • B29C48/00Extrusion moulding, i.e. expressing the moulding material through a die or nozzle which imparts the desired form; Apparatus therefor
    • B29C48/03Extrusion moulding, i.e. expressing the moulding material through a die or nozzle which imparts the desired form; Apparatus therefor characterised by the shape of the extruded material at extrusion
    • B29C48/09Articles with cross-sections having partially or fully enclosed cavities, e.g. pipes or channels
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B29WORKING OF PLASTICS; WORKING OF SUBSTANCES IN A PLASTIC STATE IN GENERAL
    • B29CSHAPING OR JOINING OF PLASTICS; SHAPING OF MATERIAL IN A PLASTIC STATE, NOT OTHERWISE PROVIDED FOR; AFTER-TREATMENT OF THE SHAPED PRODUCTS, e.g. REPAIRING
    • B29C64/00Additive manufacturing, i.e. manufacturing of three-dimensional [3D] objects by additive deposition, additive agglomeration or additive layering, e.g. by 3D printing, stereolithography or selective laser sintering
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B29WORKING OF PLASTICS; WORKING OF SUBSTANCES IN A PLASTIC STATE IN GENERAL
    • B29CSHAPING OR JOINING OF PLASTICS; SHAPING OF MATERIAL IN A PLASTIC STATE, NOT OTHERWISE PROVIDED FOR; AFTER-TREATMENT OF THE SHAPED PRODUCTS, e.g. REPAIRING
    • B29C64/00Additive manufacturing, i.e. manufacturing of three-dimensional [3D] objects by additive deposition, additive agglomeration or additive layering, e.g. by 3D printing, stereolithography or selective laser sintering
    • B29C64/30Auxiliary operations or equipment
    • B29C64/386Data acquisition or data processing for additive manufacturing
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01BMEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00Measuring arrangements characterised by the use of optical techniques
    • G01B11/24Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
    • G01B11/25Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object
    • G01B11/2509Color coding
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01BMEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00Measuring arrangements characterised by the use of optical techniques
    • G01B11/24Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
    • G01B11/25Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object
    • G01B11/254Projection of a pattern, viewing through a pattern, e.g. moiré

Abstract

A 3D printing system 100 adapted for use in a retail store includes a scanner 110, a controller 120 and a 3D printer device 130. The scanner 110 is configured to scan a customer 10 in the retail store. The controller unit 120 is configured to obtain a 3D model 121 of the customer 10 from the scan by the scanner 110 and to provide a printing control file. The 3D printer is configured to produce a figurine 20 of the customer using the printing control file from the controller 120.

Description

METHOD AND APPARATUS FOR 3D PRINTING
BACKGROUND
Technical Field
[01] The present invention relates in general to the field of on-demand three-dimensional printing.
Description of Related Art
[02] It is possible to provide printing in three dimensions using equipment known as a 3D printer. This type of printer has been developed to produce and manufacture bespoke objects, such as prototypes or samples, through a relatively low-speed computer-controlled additive printing process. Typically, the additive printing process gradually applies successive layers of material by an appropriate delivery system to build up the intended object. The printer is usually driven by a suitable controller which follows commands from an appropriate graphical model of the object. The material is most often an extruded thermoplastic material for fusion deposition modelling, or a photopolymer for stereolithography or digital-light printing processes.
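The layer-by-layer idea described above can be sketched in a few lines. This is an illustrative toy (not from the patent): an object of a given height is sliced into horizontal layers of fixed thickness, and the printer deposits material at each layer height in turn from the bottom up. The function name and the 0.25 mm layer thickness are assumptions for the example.

```python
import math

# Illustrative sketch (not the patent's method) of additive layering:
# slice an object into horizontal layers of fixed thickness, bottom up.

def slice_layers(object_height_mm, layer_thickness_mm=0.1):
    """Heights at which each successive layer is deposited."""
    count = math.ceil(object_height_mm / layer_thickness_mm)
    return [round(i * layer_thickness_mm, 6) for i in range(count)]

layers = slice_layers(1.0, 0.25)  # a 1 mm object in 0.25 mm layers
```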
[03] There is a desire to provide an improved 3D printing system.
SUMMARY OF THE INVENTION
[04] According to the present invention there is provided a 3D printing system, a method and a computer-readable storage medium as set forth in the appended claims. Other, optional, features of the invention will be apparent from the dependent claims, and the description which follows.
[05] There now follows a summary of various aspects and advantages according to embodiments of the invention. This summary is provided as an introduction to assist those skilled in the art to more rapidly assimilate the detailed discussion herein and does not and is not intended in any way to limit the scope of the claims that are appended hereto.
[06] In one aspect of the present invention there is provided a 3D printing system adapted for use in a retail store, comprising: a scanner configured to scan a customer in the retail store; a controller unit configured to obtain a 3D model of the customer from the scanner and to provide a printing control file; and a 3D printer configured to produce a figurine of the customer using the printing control file from the controller.
[07] In one example, the 3D scanner is configured to obtain a plurality of 2D images of the customer. The scanner may project structured light on to the customer and capture images of the customer illuminated by the structured light. The scanner may be a handheld or portable scanner device. The scanner may be coupled to the controller by any suitable communications link.
[08] In one example, the system further includes a scanning booth arranged to locate the customer during the scanning by the scanner. The scanning booth provides a controlled visual environment in which scanning of the customer may take place.
[09] In one example, the controller is configured to derive a 3D model of the customer. The model may be derived from the images obtained by the scanner and stored in a persistent data storage medium, such as a hard drive. The 3D model may be derived in real time and a visual representation of the 3D model may be displayed during the scanning as visual feedback for the customer and for an operator of the system.
[10] In one example, the controller is configured to create a triangulated surface mesh model of the customer. The model may be registered or oriented within a coordinate system held or defined by the controller.
[11] In one example the controller is configured to perform a plurality of post-capture processing steps on the model.
[12] The controller may be arranged to perform integrity checking of the model. The integrity checking may include creating a watertight version of the model, in which any apertures in the model are filled by solid surface elements.
[13] The controller may be arranged to manipulate, separate or remove components of the observed images. The controller may separate the customer from their observed surroundings, by separating the model of the customer away from a floor on which the customer was standing when observed by the scanner. Thus, the controller selectively includes only desired parts of the observed images when constructing the 3D model of the customer.
[14] The controller may be arranged to provide a sculpting interface which enables manipulation of the 3D model of the customer. The interface may enable manual inputs by a sculptor which change the 3D model, such as refining an appearance of the customer in the model. The sculpting interface is suitably arranged to allow manual refinement of hair or other fine features of the model.
[15] The controller may hold one or more 2D observations of the customer and may display the 2D observations on a display screen alongside the sculpting interface. The system may include a 2D camera which is configured to capture the 2D observations simultaneously with scanning of the customer by the scanner. Alternately, the 2D observations may be captured during the scanning by the scanner itself.
[16] The controller may be arranged to apply a texture to the 3D model. The 3D model may initially be colourless, having a default surface texture (e.g. plain white). The controller may apply a coloured surface texture to the 3D model. The surface texture may be applied after one or more of the processing stages described herein. The surface texture may be applied after the integrity checking, floor removal, and sculpting processes. A surface texture file may be created by the scanner as part of the process of scanning the customer. The controller may project the surface texture file onto the colourless 3D model to create or define a coloured 3D model having both shape and texture. The shape may be defined by the triangulated mesh. The surface colour may be defined by one or more image files projected onto the triangulated mesh.
[17] The controller may further provide a finishing interface for final manual finishing of the 3D model. Providing the finishing interface may allow operator inputs to manipulate the mesh and/or the surface texture of the model. The finishing interface may include displaying the 2D observations of the customer for a visual reference by the operator while performing the finishing operations.
[18] In one example, the 3D printer is a ceramic printer. The ceramic printer may print the figurine by additively depositing a ceramic material, such as gypsum (calcium sulfate dihydrate) mixed with a suitable binding agent. Optionally, a colouring dye is added at selected regions, to provide a coloured figurine. The printed figurine may be treated or coated, such as by a sealant, ready to be delivered to the customer.
[19] In one example, a tangible non-transient computer-readable storage medium is provided having recorded thereon instructions which, when implemented by a computer device, cause the computer device to be arranged as set forth herein and/or which cause the computer device to perform any of the methods as set forth herein.
BRIEF DESCRIPTION OF THE DRAWINGS
[20] For a better understanding of the invention, and to show how example embodiments may be carried into effect, reference will now be made to the accompanying drawings in which:
[21] Figure 1 is a schematic view of an example 3D printing system;
[22] Figure 2 is another schematic view of an example 3D printing system; and
[23] Figure 3 is a schematic view of an example method.
DETAILED DESCRIPTION OF THE EXAMPLE EMBODIMENTS
[24] At least some of the following example embodiments provide an improved 3D printing system. Many other advantages and improvements will be discussed in more detail herein.
[25] Figure 1 is a schematic view of an example 3D printing system 100. The system includes a scanner 110, a controller 120, and a 3D printer 130. The 3D printing system 100 is adapted for use in a retail store.
[26] The scanner 110 is arranged to scan a customer 10 while the customer is present in the retail store. The scanner 110 may be a scanner arranged to move around the customer and obtain views from each direction so as to progressively capture a visual representation of the customer. In one example, the scanner 110 is a handheld device which is carried and manipulated by an operator to perform the scanning. The scanner 110 suitably captures images which provide both geometry (shape) and colour (texture). The scanning process may be completed in under 5 minutes, and often in only 2 or 3 minutes. The handheld scanner 110 may easily capture areas of important detail, undercuts, and occluded areas, by appropriate positioning of the scanner 110 during the scanning process.
[27] The controller 120 is suitably a computer device having a display screen and user inputs such as a keyboard, mouse, etc. The controller 120 is operatively coupled to the scanner 110 such as by a USB cable. The controller 120 suitably constructs a three-dimensional model of the customer in real time as the customer is scanned by the scanner 110. The 3D model 121 may be displayed on the display screen, providing visual feedback for the customer and for the operator of the system who performs the scanning.
[28] In one example, the system further comprises a scanning booth 140 which provides a controlled visual environment (e.g. a plain white background) in which the customer may be easily scanned by the scanner 110. However, one advantage of the example system is that the physical restrictions applied to the scanning booth 140 are relatively minimal and lightweight. That is, the scanning booth 140 in the example embodiment simply needs to provide an environment in which there are minimal distracting visual artefacts in the vicinity of the customer 10 during the scanning process.
[29] The controller 120 is arranged to construct a 3D model 121 of the customer 10. The 3D model is constructed using images as obtained by the scanner 110 during the scanning stage.
Advantageously, the handheld scanner unit 110 produces a model which is relatively complete, having relatively few gaps or holes that need to be filled. Suitably the scanner 110 projects structured light onto the customer 10 and captures images of the customer when they are illuminated by the structured light. By examining the captured images and in particular by examining distortions of the structured light when projected onto the customer, an accurate 3D model of the customer may be obtained.
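The structured-light principle described above can be illustrated with the standard triangulation relation: the depth of a surface point follows from how far a projected stripe appears shifted (the disparity) in the camera image, given the projector-camera baseline and the camera focal length. This is a hypothetical one-point sketch, not the scanner's actual algorithm; all names and values are illustrative.

```python
# Hypothetical sketch of structured-light depth recovery: the lateral
# shift of a projected stripe in the camera image (disparity, in pixels)
# gives depth via z = (baseline * focal_length) / disparity.

def stripe_depth(expected_x, observed_x, baseline, focal_length):
    """Depth of a surface point from the observed stripe shift."""
    disparity = observed_x - expected_x
    if disparity <= 0:
        raise ValueError("point at or beyond the reference plane")
    return (baseline * focal_length) / disparity

# A stripe shifted by 40 px, 100 mm baseline, 800 px focal length:
depth_mm = stripe_depth(expected_x=300, observed_x=340,
                        baseline=100.0, focal_length=800.0)
```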
[30] In one example, the scanner 110 obtains the images progressively by moving around the customer in various directions and observing the customer from each angle. These images are then composited by the controller 120 to form the 3D model 121. The 3D model 121 is suitably a triangulated surface mesh model of the customer. The 3D model may be held within a co-ordinate system defined by the controller 120, i.e. within a virtual physical environment defined by the controller.
[31] The controller 120 prepares a printing control file which is output to the printer 130. The printer 130 performs a three-dimensional printing process using the printing control file from the controller 120. The 3D printer 130 produces a three-dimensional figurine 20 which, in the example embodiments, is a high quality full colour 3D representation of the customer. In one example, the figurine 20 may be about 20 cm in height. In one example, the 3D printer 130 is a ceramic printer which creates the figurine 20 in ceramic materials, giving a high quality and durable product.
[32] Figure 2 illustrates the controller 120 in more detail. The controller may comprise a plurality of modules which are configured to perform a plurality of processing steps.
[33] A first module 120a receives the scanned images from the scanner 110 and constructs an initial version of the 3D model 121. The controller 120 may then perform one or more post-capture processing steps on the model 121 using the subsequent modules.
[34] The first module 120a may perform a global registration, which is an algorithm that converts all one-frame surfaces into a single co-ordinate system using information on the mutual position of each pair of surfaces. An accuracy value is given indicating the quality of the scan. At this point it is possible to assess if the person has moved or if the scan was poor, and to repeat the scanning process for this part of the customer. As a result, the scanning process is performed with a high quality initial capture.
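The global registration step above chains pairwise transforms so that every one-frame surface lands in a single coordinate system. The sketch below is a simplified illustration under stated assumptions: it uses 2D rigid transforms (angle plus translation) rather than the 4x4 matrices a real 3D scanner would use, and the function names are hypothetical.

```python
import math

# Illustrative global registration: frame i relates to frame i-1 by a
# pairwise rigid transform; chaining these expresses every frame in the
# coordinate system of frame 0. 2D transforms are used for brevity.

def compose(t1, t2):
    """Compose two 2D rigid transforms (angle, tx, ty): apply t2 after t1."""
    a1, x1, y1 = t1
    a2, x2, y2 = t2
    c, s = math.cos(a1), math.sin(a1)
    return (a1 + a2, x1 + c * x2 - s * y2, y1 + s * x2 + c * y2)

def to_global(pairwise):
    """Cumulative transform of each frame relative to frame 0."""
    out = [(0.0, 0.0, 0.0)]           # frame 0 is the reference
    for t in pairwise:
        out.append(compose(out[-1], t))
    return out

# Two successive quarter-turns, each stepping 1 unit forward:
frames = to_global([(math.pi / 2, 1.0, 0.0), (math.pi / 2, 1.0, 0.0)])
```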
[35] The first module 120a may also perform a conversion process by converting the millions of data points collected by the scanning of the customer into a triangulated surface mesh. At this point the project may be saved and the 3D model is recorded to a persistent storage (e.g. a hard drive). The 3D model is suitably held in a format capable of being retrieved from the storage at a later time.
[36] The second module 120b is suitably arranged to import the newly generated 3D model. The second module 120b may perform an integrity check on the model. This check is an analysis of the model that looks for imperfections that might cause the file to fail when it is later put into the 3D printer. These imperfections include: self-intersecting triangles, highly creased edges, spikes, small components, small tunnels, small holes, and non-manifold edges. The second module corrects these imperfections by modifying these local areas, thus creating good geometry suitable for printing. If this stage were not performed, the 3D printer 130 might, for example, fail to import the file, build the model incorrectly with significant weaknesses, or be unable to print these areas and stop half way through. This check is advantageous to reduce the number of failures or poor models that are put into the 3D printing machine 130.
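One of the imperfections listed above, holes and non-manifold edges, can be detected with a simple edge count: in a watertight, manifold triangle mesh every edge is shared by exactly two triangles. The sketch below is a minimal illustration of that single check, not the patent's implementation.

```python
from collections import Counter

# Minimal integrity-check sketch: count how many triangles use each edge.
# An edge used once borders a hole; an edge used by three or more
# triangles is non-manifold. Watertight manifold meshes have neither.

def edge_defects(triangles):
    edges = Counter()
    for a, b, c in triangles:
        for e in ((a, b), (b, c), (c, a)):
            edges[tuple(sorted(e))] += 1
    holes = [e for e, n in edges.items() if n == 1]
    non_manifold = [e for e, n in edges.items() if n > 2]
    return holes, non_manifold

# A single triangle is all boundary: every edge borders a hole.
holes, bad = edge_defects([(0, 1, 2)])
```

A closed tetrahedron, by contrast, reports no defects, since each of its six edges is shared by exactly two faces.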
[37] In the scanning process, the scanner 110 may also image part of the floor while a person is being scanned. Inside the second module 120b, it is possible to select this data and effectively cut away the floor leaving just the person's shoes.
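Cutting away the floor, as described above, can be sketched as a height filter: triangles whose vertices all sit above a small tolerance over the detected floor plane are kept, and scanned floor patches are discarded. The function name and the 5 mm tolerance are assumptions for illustration.

```python
# Hypothetical floor-removal sketch: keep only triangles lying wholly
# above the floor plane plus a small tolerance, leaving the customer
# (down to the soles of their shoes) and discarding floor patches.

FLOOR_TOLERANCE = 0.005  # 5 mm above the floor plane (assumed value)

def remove_floor(vertices, triangles, floor_z=0.0):
    keep = []
    for tri in triangles:
        if all(vertices[i][2] > floor_z + FLOOR_TOLERANCE for i in tri):
            keep.append(tri)
    return keep

verts = [(0, 0, 0.0), (1, 0, 0.0), (0, 1, 0.0),   # floor patch
         (0, 0, 0.2), (1, 0, 0.3), (0, 1, 0.4)]   # part of the customer
kept = remove_floor(verts, [(0, 1, 2), (3, 4, 5)])
```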
[38] The second module 120b is also used to scale a person as appropriate to a figurine of about 6, 7 or 8 inches in height (e.g. 15, 18 or 20 cm). The overall height of a person can be measured and then a relative scaling factor can be applied to reduce the model to the correct size for 3D printing. If a group of customers has been scanned (e.g. each member of a family) and they would like their figurines to be sized correctly in proportion to each other, again this calculation can be made and the figurines scaled relatively by the second module 120b.
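The relative scaling described above reduces a measured person to figurine size, and sizes a scanned group by one shared factor so the figurines stay in proportion. The sketch below illustrates that arithmetic; the function names and the choice of sizing the group from its tallest member are assumptions.

```python
# Sketch of relative scaling: a measured height is reduced to a chosen
# figurine height; a group shares one factor so proportions are kept.

def figurine_scale(person_height_cm, figurine_height_cm=20.0):
    """Scale factor that reduces a person to figurine size."""
    return figurine_height_cm / person_height_cm

def scale_group(heights_cm, tallest_figurine_cm=20.0):
    """Figurine heights for a group, sized from the tallest member."""
    factor = tallest_figurine_cm / max(heights_cm)
    return [h * factor for h in heights_cm]

family = scale_group([180.0, 160.0, 90.0])  # parent, parent, child (cm)
```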
[39] It is also possible to reduce the data size of the 3D model. In a practical example, millions of data points are collected on a person and the initial 3D model can therefore be very large. A decimation of the file is possible by relative tolerance. The second module 120b may analyse the curvature of the model and reduce the number of triangles in low-curvature areas while keeping smaller triangles in high-curvature areas. Thus, the second module 120b may reduce data density in areas where there is not much detail while keeping the areas of high detail, meaning the file size can be reduced greatly without negatively affecting the overall appearance of the figurine.
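A one-dimensional analogue of the curvature-driven decimation above (an illustration only, not the patent's algorithm) drops points from a polyline where it is nearly straight and keeps them where it bends, so detail survives while the data size shrinks.

```python
import math

# Toy 1D analogue of decimation by tolerance: drop polyline points whose
# deviation from the local chord is small (low curvature), keep the rest.

def decimate(points, tolerance=0.01):
    kept = [points[0]]
    for i in range(1, len(points) - 1):
        (x0, y0), (x1, y1), (x2, y2) = kept[-1], points[i], points[i + 1]
        # Perpendicular distance of the middle point from the chord.
        area2 = abs((x2 - x0) * (y1 - y0) - (x1 - x0) * (y2 - y0))
        chord = math.hypot(x2 - x0, y2 - y0)
        if chord == 0 or area2 / chord > tolerance:
            kept.append(points[i])   # high curvature: keep the detail
    kept.append(points[-1])
    return kept

flatish = [(0, 0), (1, 0.001), (2, 0), (3, 1), (4, 0)]
reduced = decimate(flatish)  # the nearly-flat middle point is dropped
```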
[40] The revised 3D model from the second module 120b can be saved and imported into the third module 120c for further refinement. Probably the most difficult area to 3D scan on a person is the hair. The customer's hair may need to be re-sculpted slightly to achieve a high quality result. From the scanning process, some areas of the 3D model, such as the very top of people's heads, can have a flat spot with no real detail. The third module 120c enables reshaping of the head portion of the 3D model, and brushing-in of detailed strands of hair. The third module 120c may be operated on screen by an operator, with reference to a displayed photograph taken of the customer at the time of the scan.
[41] The third module may also be used for a visual inspection of the 3D model. Particularly, the 3D model may be checked over for any peculiar features, e.g. if a person moved their hands while scanning they may have unnaturally shaped fingers. These identified problem areas can be quickly re-sculpted to achieve a high quality result.
[42] At the end of the sculpting process by the third module 120c, the 3D model in terms of geometry/shape is ready for printing and, in one example, the printing control file could be sent to the 3D printer 130 as a single colour model.
[43] The fourth module 120d suitably imports the sculpted and refined version of the 3D model. The fourth module 120d applies surface texture to the 3D model. Colour information captured during scanning is projected onto the 3D model. This process effectively combines all of the hundreds of colour images into one single 'panorama' which is then projected onto the 3D model. This texturing stage is suitably performed last, on the re-sculpted, perfected model.
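The projection step above can be illustrated per vertex: each mesh vertex is mapped to a pixel of a captured colour image and takes that pixel's colour. The sketch below uses a simple orthographic mapping as a stand-in for the scanner's real camera model; everything here is a hypothetical simplification.

```python
# Hedged sketch of texture projection: map each vertex to a pixel of a
# colour image (orthographic stand-in for the real camera model) and
# assign that pixel's colour to the vertex.

def project_colours(vertices, image, scale=1.0):
    """image is a row-major grid of (r, g, b) tuples; vertices are
    (x, y, z). The x/y position selects a pixel; z is ignored here."""
    coloured = []
    for x, y, _z in vertices:
        col = min(int(x * scale), len(image[0]) - 1)
        row = min(int(y * scale), len(image) - 1)
        coloured.append(image[row][col])
    return coloured

image = [[(255, 0, 0), (0, 255, 0)],
         [(0, 0, 255), (255, 255, 255)]]
colours = project_colours([(0.2, 0.2, 1.0), (1.5, 0.5, 1.0)], image)
```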
[44] The fifth module 120e is arranged to perform texture refinement. The 3D model now has colour information associated therewith by the fourth module. However, depending on lighting conditions in the retail store and other factors, there might be some small areas of missing colour or incorrect colour. Using typical image manipulation functions such as clone stamping, healing, etc., any missing or distorted surface areas are corrected with the aim of producing a high quality model in terms of geometry and colour.
[45] In one example, any of the first to fifth modules 120a-120e may be implemented by the same software package, but they are described separately for convenience and clarity.
[46] The controller 120 may ultimately record and export the final geometry and texture in any suitable format for use as the printing control file. In one example, the printing control file is provided in a recognised format, such as a VRML file (Virtual Reality Modelling Language), which is then imported into the 3D printer 130 ready for a print run.
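The VRML export mentioned above can be sketched as writing a minimal VRML 2.0 file containing an IndexedFaceSet. This illustration emits bare geometry only; a real printing control file would also carry per-vertex colour or texture data, and the function name is an assumption.

```python
# Minimal sketch of exporting mesh geometry as VRML 2.0: a Shape node
# holding an IndexedFaceSet, with faces terminated by -1 per the format.

def write_vrml(vertices, triangles):
    points = ", ".join("%g %g %g" % v for v in vertices)
    indices = ", ".join("%d, %d, %d, -1" % t for t in triangles)
    return ("#VRML V2.0 utf8\n"
            "Shape {\n"
            "  geometry IndexedFaceSet {\n"
            "    coord Coordinate { point [ %s ] }\n"
            "    coordIndex [ %s ]\n"
            "  }\n"
            "}\n" % (points, indices))

vrml = write_vrml([(0, 0, 0), (1, 0, 0), (0, 1, 0)], [(0, 1, 2)])
```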
[47] Figure 3 is a flowchart of an example 3D printing method. In the example embodiments, the method may be implemented as described in detail above. For brevity, the details of each step will not be repeated again here.
[48] Step 301 comprises scanning a customer in a retail store to provide an initial 3D model, as described in detail above.
[49] Step 302 comprises performing integrity checking of the initial 3D model and making corrections so that the 3D model is able to be printed by a 3D printer machine as a figurine.
[50] Step 303 comprises isolating portions of the model which are desired to be printed as the figurine. Step 303 also comprises sculpting the 3D model to refine the geometry of the model.
[51] Step 304 comprises applying texture to the model. The texturing may include defining a colour for each surface region of the model. Step 304 may also comprise making colour changes or colour corrections.
[52] Step 305 comprises recording a printing control file from the 3D model. The step 305 suitably includes printing the figurine by a 3D printer using the printing control file.
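Steps 301 to 305 above form a linear pipeline, which can be sketched by threading the scan data through each stage in order. The stages here are hypothetical placeholders that merely tag the model with the work done, standing in for the modules described earlier.

```python
# Toy sketch of the method of Figure 3: each processing stage transforms
# the model in turn, from raw scan to printing control file.

def printing_pipeline(scan_data, stages):
    """Thread the scan data through each processing stage in order."""
    model = scan_data
    for stage in stages:
        model = stage(model)
    return model

stages = [
    lambda m: m + ["integrity-checked"],   # step 302
    lambda m: m + ["sculpted"],            # step 303
    lambda m: m + ["textured"],            # step 304
    lambda m: m + ["control-file"],        # step 305
]
result = printing_pipeline(["scan"], stages)   # step 301 provides the scan
```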
[53] At least some of the example embodiments described herein may be constructed, partially or wholly, using dedicated special-purpose hardware. Terms such as 'component', 'module' or 'unit' used herein may include, but are not limited to, a hardware device, such as circuitry in the form of discrete or integrated components, a Field Programmable Gate Array (FPGA) or Application Specific Integrated Circuit (ASIC), which performs certain tasks or provides the associated functionality. In some embodiments, the described elements may be configured to reside on a tangible, persistent, addressable storage medium and may be configured to execute on one or more processors. These functional elements may in some embodiments include, by way of example, components, such as software components, object-oriented software components, class components and task components, processes, functions, attributes, procedures, subroutines, segments of program code, drivers, firmware, microcode, circuitry, data, databases, data structures, tables, arrays, and variables.
[54] Although the example embodiments have been described with reference to the components, modules and units discussed herein, such functional elements may be combined into fewer elements or separated into additional elements. Various combinations of optional features have been described herein, and it will be appreciated that described features may be combined in any suitable combination. In particular, the features of any one example embodiment may be combined with features of any other embodiment, as appropriate, except where such combinations are mutually exclusive. Throughout this specification, the term "comprising" or "comprises" may mean including the component(s) specified but is not intended to exclude the presence of other components.
[55] Although a few example embodiments have been shown and described, it will be appreciated by those skilled in the art that various changes and modifications might be made without departing from the scope of the invention, as defined in the appended claims.

Claims (19)

  1. A 3D printing system adapted for use in a retail store, comprising: a scanner configured to scan a customer in the retail store; a controller unit configured to obtain a 3D model of the customer from the scan by the scanner and to provide a printing control file; and a 3D printer configured to produce a figurine of the customer using the printing control file from the controller unit.
  2. The system of claim 1, wherein the scanner is configured to obtain a plurality of 2D images of the customer and the controller is arranged to generate the 3D model from the images captured by the scanner.
  3. The system of claim 2, wherein the scanner is arranged to project structured light onto the customer and to capture the images of the customer illuminated by the structured light.
  4. The system of claim 1, wherein the scanner is a handheld scanner device.
  5. The system of claim 1, further comprising a scanning booth arranged to locate the customer during the scanning by the scanner.
  6. The system of claim 5, wherein the scanning booth provides a controlled visual environment during scanning of the customer by the scanner.
  7. The system of claim 1, wherein the controller is arranged to derive an initial 3D model in real time during scanning of the customer by the scanner, and to provide a visual representation of the 3D model as visual feedback during the scanning.
  8. The system of claim 1, wherein the controller is configured to create the 3D model as a triangulated surface mesh model of the customer.
  9. The system of claim 1, wherein the controller is configured to perform integrity checking of the 3D model.
  10. The system of claim 1, wherein the controller is configured to perform sculpting of the 3D model by adding, removing or manipulating a geometry of the 3D model.
  11. The system of claim 10, wherein the controller is further arranged to display one or more 2D observations of the customer on a display screen while performing the sculpting of the 3D model.
  12. The system of claim 1, wherein the controller is further arranged to apply a coloured surface texture to the 3D model.
  13. The system of claim 12, wherein the surface texture is created from images obtained by the scanner during scanning of the customer.
  14. The system of claim 13, wherein the controller is arranged to project a surface texture file onto the 3D model, wherein the surface texture file comprises colour regions derived from images of the customer obtained by the scanner.
  15. The system of claim 1, wherein the 3D printer is a ceramic printer.
  16. A method of 3D printing, comprising: scanning a customer in a retail store to provide an initial 3D model; performing integrity checking of the initial 3D model and making corrections so that the 3D model is able to be printed by a 3D printer machine as a figurine; sculpting the checked 3D model to refine a geometry of the 3D model, thereby providing a sculpted 3D model; applying texture to the sculpted 3D model to provide a textured 3D model; recording a printing control file from the textured 3D model; and printing the figurine of the customer using the printing control file.
  17. The method of claim 16, further comprising displaying the textured 3D model and making colour corrections to the texture while applied to the 3D model.
  18. A 3D printing system, substantially as hereinbefore described with reference to any of the accompanying drawings.
  19. A method of 3D printing, substantially as hereinbefore described with reference to any of the accompanying drawings.
GB1319944.3A 2013-11-12 2013-11-12 Method and apparatus for 3D Printing Withdrawn GB2520255A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
GB1319944.3A GB2520255A (en) 2013-11-12 2013-11-12 Method and apparatus for 3D Printing

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
GB1319944.3A GB2520255A (en) 2013-11-12 2013-11-12 Method and apparatus for 3D Printing

Publications (2)

Publication Number Publication Date
GB201319944D0 GB201319944D0 (en) 2013-12-25
GB2520255A true GB2520255A (en) 2015-05-20

Family

ID=49818496

Family Applications (1)

Application Number Title Priority Date Filing Date
GB1319944.3A Withdrawn GB2520255A (en) 2013-11-12 2013-11-12 Method and apparatus for 3D Printing

Country Status (1)

Country Link
GB (1) GB2520255A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105045209A (en) * 2015-07-10 2015-11-11 青岛亿辰电子科技有限公司 Mini 3D doll head and body plug-in structure standardization manufacturing method

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5926388A (en) * 1994-12-09 1999-07-20 Kimbrough; Thomas C. System and method for producing a three dimensional relief
JP2001140121A (en) * 1999-08-19 2001-05-22 Natl Inst Of Advanced Industrial Science & Technology Meti Production method of dressform and apparatus therefor
GB2375988A (en) * 2001-05-03 2002-12-04 Peter David Hurley 3-D Bust maker
GB2434541A (en) * 2006-01-30 2007-08-01 Mailling Wright Products Ltd Preparing a clinical restraint
CH703650A2 (en) * 2010-08-17 2012-02-29 Miro Mandelz Method for producing three-dimensional model, involves generating model of three-dimensional figure with fixed or scalable size by process machine and creating three-dimensional model of figure


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
"For £300 you can buy a stunning 3D-printed version of yourself" By Kyle VanHemert, 02/08/13 *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105045209A (en) * 2015-07-10 2015-11-11 青岛亿辰电子科技有限公司 Mini 3D doll head and body plug-in structure standardization manufacturing method
CN105045209B (en) * 2015-07-10 2019-09-24 泉州台商投资区长芳设计有限公司 Mini 3D doll head and body plug-in structure standardization manufacturing method

Also Published As

Publication number Publication date
GB201319944D0 (en) 2013-12-25

Similar Documents

Publication Publication Date Title
CN108510577B (en) Realistic motion migration and generation method and system based on existing motion data
CN108389257A (en) Threedimensional model is generated from sweep object
Ballarin et al. Replicas in cultural heritage: 3D printing and the museum experience
JP5829371B2 (en) Facial animation using motion capture data
CN104299211B (en) Free-moving type three-dimensional scanning method
US20030197700A1 (en) Information processing apparatus, program for product assembly process display, and method for product assembly process display
JP7370527B2 (en) Method and computer program for generating three-dimensional model data of clothing
KR101744079B1 (en) The face model generation method for the Dental procedure simulation
DE102015213832A1 (en) Method and device for generating an artificial image
CN108961144A (en) Image processing system
CN106652037B (en) Face mapping processing method and device
US9892485B2 (en) System and method for mesh distance based geometry deformation
GB2520255A (en) Method and apparatus for 3D Printing
JP2003216973A (en) Method, program, device and system for processing three-dimensional image
JPH04256185A (en) Method for collecting sample picture of picture recognition system
CN116681854A (en) Virtual city generation method and device based on target detection and building reconstruction
CN114419121B (en) BIM texture generation method based on image
CN115661367A (en) Dynamic hybrid deformation modeling method and system based on photo collection
US10176628B2 (en) Method for creating a 3D representation and corresponding image recording apparatus
KR101782269B1 (en) creating and applying method of nurbs and follicle using the plug-in program
JP7406654B2 (en) Methods for creating a virtual environment restore of a real location
US20210183152A1 (en) Enhanced techniques for volumetric stage mapping based on calibration object
Pomaska Monitoring the deterioration of stone at Mindener Museum's Lapidarium
EP3779878A1 (en) Method and device for combining a texture with an artificial object
Hülsken et al. Modeling and animating virtual humans for real-time applications

Legal Events

Date Code Title Description
WAP Application withdrawn, taken to be withdrawn or refused ** after publication under section 16(1)