WO2018163042A1 - UV print files unwrapped from a camera projection - Google Patents

UV print files unwrapped from a camera projection

Info

Publication number
WO2018163042A1
Authority
WO
WIPO (PCT)
Prior art keywords
projection, virtual, image, processor, texture
Application number
PCT/IB2018/051396
Other languages
English (en)
Inventor
Jake MCCRANN
Original Assignee
Mccrann Jake
Application filed by Mccrann Jake filed Critical Mccrann Jake
Publication of WO2018163042A1 publication Critical patent/WO2018163042A1/fr


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00 Manipulating 3D models or images for computer graphics
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F30/00 Computer-aided design [CAD]
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F2113/00 Details relating to the application field
    • G06F2113/12 Cloth
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F30/00 Computer-aided design [CAD]
    • G06F30/10 Geometric CAD
    • G06F30/13 Architectural design, e.g. computer-aided architectural design [CAAD] related to design of buildings, bridges, landscapes, production plants or roads
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2219/00 Indexing scheme for manipulating 3D models or images for computer graphics
    • G06T2219/021 Flattening

Definitions

  • The present disclosure relates to the field of print-on-demand customizable apparel. More specifically, the present disclosure provides a system and method for generating unwrapped UV print-files for printing on real-world objects, as well as for generating manufacturing patterns for objects which require stitching to be created in the real world.
  • Some of these websites offer a 3D visualization of what the results will look like.
  • The visualization is often disclaimed to be just a general impression of the expected result. All of these websites essentially depend on the User creating the artwork on a 2D map, a 2D image, that will be used to produce the surface-design on the 3D product.
  • Where an app for visualization has been provided, it is invariably limited to showing the customer essentially what results from placing the customer's image over the UV-map of the 3D model, and none of these websites has yet been seen to offer 3D model visualizations that can be zoomed in on to check the quality of the resolution up close.
  • A major problem of the prior art is that it is simply not possible to get continuity across the seams when using UV-map texture. The UV-islands cannot be rearranged or superimposed to achieve this. Only with certain items, where lines can be made collinear, would this occur.
  • The prior art US 9,345,280 B2 provides customers with a way to custom-design apparel simply by taking the UV-map of a 3D model of the real-world object, cutting in on the edges of the UV-map to account for seam-allowance, and then stitching it together.
  • The UV-map is used either by providing it directly to the customer to design on themselves, or by providing an online 2D image-editing tool with an attached gallery of images to select from and a text-editing tool to create some text on the apparel, or by showing the customer a simple 3D visualization of the model, invariably built using extremely limiting HTML5/Flash platforms, and then allowing them to upload images which are placed on the hidden UV-map, with the results displayed on the 3D model.
  • The User can then move the image using arrow keys on the user-interface, or change the size of the image, and they can click different parts of the 3D model to select that part as the next part they will upload or choose an image for. The parts they can click are invariably the UV-islands of the hidden UV-map.
  • The discontinuities caused by applying texture to the UV-map will, when rendered on the model, appear most noticeably at the places which in the real world are actually the seams. Therefore 3D artists prefer this approach: after applying texture to the UV-map, the discontinuities in the model will be apparent mainly at the seams and therefore less noticeable as 'imperfection' to the casual observer.
  • The present disclosure relates to a computer-implemented method for transforming a real-world 3D object into an unwrapped 2D print file.
  • The method comprises projecting a 2D image onto a virtual 3D object, dividing the surface of the virtual 3D object into a plurality of projection-texture mesh surfaces, unwrapping the virtual 3D object on each of the plurality of projection-texture mesh surfaces to obtain a UV-map for each projection-texture mesh, and generating an unwrapped 2D print file for printing on the real-world 3D object.
  • In another aspect, the present disclosure relates to a computer-implemented method for transforming a real-world 3D object into an unwrapped 2D print file.
  • The method comprises projecting a 2D image onto a virtual 3D object, dividing the surface of the virtual 3D object into a plurality of projection-texture mesh surfaces, unwrapping the virtual 3D object on each of the plurality of projection-texture mesh surfaces to obtain a UV-map for each projection-texture mesh, adding pixels at the edges of the UV map, and generating an unwrapped 2D print file for printing on the real-world 3D object.
  • Fig. 1A illustrates the prior art, with the model/avatar on the right and, on the left, how the mesh has been divided up;
  • Fig. 1B illustrates an environment of the working of the present disclosure in a communication network, according to various embodiments of the present disclosure;
  • Fig. 2 illustrates a method for transforming a real-world 3D object into an unwrapped 2D print file, according to various embodiments of the present disclosure;
  • Fig. 3A illustrates an exemplary embodiment of generation of a print file from projection of an image on a replica of a real-world object, according to various embodiments of the present disclosure;
  • Figs. 3B-3E illustrate various views of the virtual 3D objects, according to various embodiments of the present disclosure;
  • Fig. 4 illustrates a method for transforming a real-world 3D object into an unwrapped 2D print file, according to various embodiments of the present disclosure;
  • Figs. 5A to 5J2 illustrate an exemplary embodiment of printing an image on a real-world object using edge bleeding, according to various embodiments of the present disclosure;
  • Figs. 6A to 9B illustrate an exemplary embodiment of seam allowance and printing an image on a portion of a real-world object, according to various embodiments of the present disclosure.
  • Fig. 1A illustrates the prior art, with the model/avatar on the right and, on the left, how the mesh has been divided up.
  • This choice of division is a very complex subject in gaming and animation; it determines how the UV-map looks when you unwrap it, since it decides how the model will be segmented. Note that when the model is first made, it is not made in pieces as seen on the left; it has been cut like this to prepare the model for the unwrapping that will give the best result.
  • In gaming, the best result is considered to be the one easiest to relate to the 3D model, so that you can tell where to apply texture to it. This is the only objective when dividing the model up like this into separate meshes.
  • The present disclosure relates generally, as indicated, to mobile applications/computer software stored in a memory and executed by one or more processors, capable of generating UV print files for printing on real-world 3D objects.
  • The graphics software is digital imaging software; according to various embodiments described below, such graphics software may be construed as graphics software generally and, more particularly, as digital imaging software.
  • the disclosure relates to a method and system of converting 2D graphic images to 3D stereoscopic images using such plug-ins in combination with existing or commercially available graphics software.
  • Reference to existing or commercially available graphics software means software that can be used to develop, to make, to modify, to view, etc., images. Those images may be displayed, printed, transmitted, projected or otherwise used in various modes as is well known.
  • the Adobe Photoshop software can be used for many purposes, as is well known.
  • Reference to graphics software as being existing or commercially available is intended to mean not only at present but also that which has been existing or commercially available in the past and that which may exist or become commercially available in the future. For brevity, all of these are referred to below collectively and severally as "existing" graphics software.
  • FIG. 1B illustrates one example of a system architecture and data processing device that may be used to implement one or more illustrative aspects described herein in a standalone and/or networked environment.
  • Various network nodes 105, 136, and 132 may be interconnected via a wide area network (WAN), such as the Internet.
  • Other networks may also or alternatively be used, including private intranets, corporate networks, LANs, metropolitan area networks (MAN), wireless networks, personal area networks (PAN), and the like.
  • Network 101 is for illustration purposes and may be replaced with fewer or additional computer networks.
  • a local area network may have one or more of any known LAN topology and may use one or more of a variety of different protocols, such as Ethernet.
  • These and other devices may be connected to one or more of the networks via twisted-pair wires, coaxial cable, fiber optics, radio waves, or other communication media.
  • The term "network" refers not only to systems in which remote storage devices are coupled together via one or more communication paths, but also to stand-alone devices that may be coupled, from time to time, to such systems that have storage capability. Consequently, the term "network" includes not only a "physical network" but also a "content network," which is comprised of the data, attributable to a single entity, which resides across all physical networks.
  • The components may include a data server 136 or web server 136, and client computers 105.
  • Data server 136 provides overall access, control, and administration of databases and control software for performing one or more illustrative aspects described herein.
  • Data server 136 may be connected to a web server through which users interact with and obtain data as requested. Alternatively, data server 136 may act as a web server itself and be directly connected to the Internet. Data server 136 may be connected to web server 136 through the network 101 (e.g., the Internet), via direct or indirect connection, or via some other network. Users may interact with the data server 136 using remote computers 105, e.g., using a web browser to connect to the data server 136 via one or more externally exposed web sites hosted by web server. Client computers 105 may be used in concert with data server 136 to access data stored therein, or may be used for other purposes. For example, from client device 105 a user may access web server 136 using an Internet browser, as is known in the art, or by executing a software application that communicates with web server 136 and/or data server 136 over a computer network (such as the Internet).
  • FIG. 1B illustrates just one example of a network architecture that may be used, and those of skill in the art will appreciate that the specific network architecture and data processing devices used may vary and are secondary to the functionality that they provide, as further described herein. For example, services provided by web server 136 and data server 136 may be combined on a single server.
  • Each component 105, 136, 132 may be any type of known computer, server, or data processing or computing device, such as camera 132.
  • Each component 105, 136, 132 may, e.g., include one or more processors 111.
  • Each component 105, 136, 132 may further include random access memory (RAM), read-only memory (ROM), a network interface, input/output interfaces (e.g., keyboard, mouse, display, printer, etc.), and memory.
  • Input/output (I/O) may include a variety of interface units and drives for reading, writing, displaying, and/or printing data or files.
  • Memory 130 may further store operating system software for controlling overall operation of the data processing device, control logic for instructing data server to perform aspects described herein, and other application software providing secondary, support, and/or other functionality which may or might not be used in conjunction with aspects described herein.
  • the control logic may also be referred to herein as the data server software.
  • Functionality of the data server software may refer to operations or decisions made automatically based on rules coded into the control logic, made manually by a user providing input into the system, and/or a combination of automatic processing based on user input (e.g., queries, data updates, etc.).
  • Memory 130 may also store data used in performance of one or more aspects described herein, including a first database and a second database.
  • the first database may include the second database (e.g., as a separate table, report, etc.). That is, the information can be stored in a single database, or separated into different logical, virtual, or physical databases, depending on system design.
  • the memory can be a remote storage.
  • One or more aspects may be embodied in computer-usable or readable data and/or computer-executable instructions, such as in one or more program modules, executed by one or more computers or other devices as described herein.
  • Program modules include routines, programs, objects, components, data structures, etc. that perform particular tasks or implement particular abstract data types when executed by one or more processors in a computer or other device.
  • the modules may be written in a source code programming language that is subsequently compiled for execution or may be written in a scripting language such as (but not limited to) Hypertext Mark-up Language (HTML) or Extensible Mark-up Language (XML).
  • the computer executable instructions may be stored on a computer readable medium such as a non-volatile storage device.
  • Any suitable computer readable storage media may be utilized, including hard disks, CD-ROMs, optical storage devices, magnetic storage devices, and/or any combination thereof.
  • Various transmission (non-storage) media representing data or events as described herein may be transferred between a source and a destination in the form of electromagnetic waves traveling through signal-conducting media such as metal wires, optical fibers, and/or wireless transmission media (e.g., air and/or space).
  • Various aspects described herein may be embodied as a method, a data processing system, or a computer program product. Therefore, various functionalities may be embodied in whole or in part in software, firmware, and/or hardware or hardware equivalents such as integrated circuits and field-programmable gate arrays (FPGAs).
  • FIG. 1B illustrates a block diagram of a system 100 describing the environment of the functioning of the present disclosure for creating seam allowance of design patterns on UV-mapped objects.
  • The system 100 includes a user device 105, a graphical user interface 110, a processor 115, a display 120, sensors 124, an input unit 122, a communication interface 123, a mobile application 126, a database 128, a memory unit 130, a 2D/3D/360-degree image-capturing camera 132, a real-world object 134, and a remote server 136.
  • The said remote server 136 is communicably coupled to the user device 105, which is being operated by a user.
  • The said remote server includes a database for storing virtual replicas of a number of real-world objects, varying from hats, mugs, and T-shirts to many more.
  • The memory unit 130 stores instructions, user data, and other related items.
  • Examples of user data and items include, but are not limited to, first name, last name, phone number, email ID, object preference, design preference, etc.
  • The database 128 is coupled to the memory unit 130 for storing a plurality of UV-mapped 3D objects. Examples of objects include, but are not limited to, caps, T-shirts, trousers, bags, garments, soccer balls, basketballs, and other similar items that may be stitched together.
  • the graphical user interface 110 displays the processed instructions.
  • Examples of the graphical user interface 110 include, but are not limited to, LED, LCD, OLED, and touch-based display units.
  • the processor 115 is coupled with the graphical user interface 110 and the memory unit 130 to process the stored instructions.
  • the instructions initiate with receiving an object selected by a user.
  • the objects are explained in detail in conjunction with the database stored at a remote server 136.
  • The user selects a design pattern that the user desires to be printed on the object.
  • The said design pattern can be selected from the memory unit 130 in the user device 105, or could be fetched from the internet, from the remote server 136, or from the camera 132.
  • The user is able to communicate through the communication network to bring in various design patterns for printing on the objects. Examples of the design patterns include, but are not limited to, flowers, animals, plants, characters, cartoons, self-portraits, geometrical designs, etc.
  • The processor 115 is configured to execute instructions for creating a plurality of segments of the selected object. The system then divides the object into a plurality of segments as required by the user.
  • if ((rgbImg[3] != rgbImg_right[3]) || (rgbImg[3] != rgbImg_bottom[3])) // edge, where rgbImg is the current pixel, rgbImg_right is the pixel to its right, rgbImg_bottom is the pixel in the row below, and index [3] is the alpha channel used to find the edges of segments.
  • A number of pixels is added around the edge of each segment of the selected object, such that when the segments are stitched together there is seam allowance on the complete object.
  • The pixels are increased so that when the design is printed on the real-world object there is no obscured or erroneous pattern on the object, especially around the seams.
  • the instructions include a step of allowing the user to modify the design pattern.
  • The user is able to modify the design pattern by using mobile application features such as: an X-ray feature for projection; zooming in and out; rotating; changing colours; adding text to the design; saving the UV-map; sharing the UV-map with the manufacturer; ortho features for orthogonal projection; adjusting angle and size; freezing the image; resetting modifications; saving modifications for future projects; navigation buttons for moving the image left, right, up, and down and rotating 360 degrees; and searching for 3D objects or models, UV maps, UV-segments, and 2D/3D images.
  • the said method 200 is a computer implemented method capable of being installed as application software in a computer or smart phone.
  • the said method 200 starts at step 205 where a 2D image is selected by a user.
  • Computer graphics have gained popularity among people. Pictures with high resolution and a large number of pixels are much clearer and more appealing to the human eye. It is further important to have a clear print of an image on a real-world 3-dimensional object. The texture is exactly the same as that of a real-life projector projecting an image onto any 3D object, as seen on the surface where the projection hits, and this disclosure enables such projection.
  • The user selects and projects an image onto a virtual 3D object.
  • A virtual 3D object could represent any real-world object upon which the user wants the image to be printed.
  • Such a real-world object could be a hat, a coffee mug, apparel, and many more; the list is endless. The present disclosure enables such projection on virtual 3D forms of any real-world objects.
  • At step 210, the surface of the virtual 3D object is divided into a plurality of projection-texture mesh surfaces. Such division ensures a better distribution and alignment of the 2D image over the surfaces of the virtual 3D object.
  • Next, unwrapping of the mesh is carried out. Each of the plurality of projection-texture mesh surfaces is unwrapped to obtain a UV-map for each projection-texture mesh. It is the UV-map which will be able to generate a print-file that can be applied to the real-world replica of the 3D model. Since a UV-map is generated from each of the mesh-parts used to construct the 3D model, and the same 3D model can be constructed from many different combinations of meshes, it is always practical to make sure that the construction of the 3D model from its mesh components was done in such a way as to generate a UV-map that is suitable for printing and application to the real-world model.
  • The unwrapped 2D print file includes implementation instructions for the user, wherein the implementation instructions include a starting point, a starting angle, the exact location on the real-world 3D object, and combinations thereof.
  • For a car, a UV map is required having at least five separate pieces, such as the front/back/left/right/top of the car.
  • An efficient manner would be dividing the car-mesh up into every part of the car where a print-out has to be cut: dividing the mesh of the car into its doors, windows, and every place where the printed adhesive would have to be cut if it had been printed out continuously as one whole, for example, the left side of the car.
  • Finally, a 2D print file is generated which can be printed on real-world 3D objects.
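  • To make the projection step concrete, the following is a minimal sketch (an illustration under stated assumptions, not the patent's own code) of how a perspective projection-texture assigns image coordinates to mesh vertices: each vertex is transformed by an assumed projector view-projection matrix, the perspective divide is applied, and the result is remapped to the [0, 1] range to sample the 2D image. The Vec4/Mat4 helpers and the function name are illustrative assumptions.

```cpp
#include <array>

// Minimal vector/matrix helpers (illustrative only).
struct Vec4 { float x, y, z, w; };
struct Mat4 {
    std::array<float, 16> m; // row-major
    Vec4 operator*(const Vec4& v) const {
        return { m[0]*v.x  + m[1]*v.y  + m[2]*v.z  + m[3]*v.w,
                 m[4]*v.x  + m[5]*v.y  + m[6]*v.z  + m[7]*v.w,
                 m[8]*v.x  + m[9]*v.y  + m[10]*v.z + m[11]*v.w,
                 m[12]*v.x + m[13]*v.y + m[14]*v.z + m[15]*v.w };
    }
};

// Project one mesh vertex (px, py, pz) through the virtual projector-camera
// and return the (u, v) coordinate at which to sample the projected 2D image.
// viewProj is the projector's combined view * projection matrix.
bool projectVertexToImageUV(const Mat4& viewProj,
                            float px, float py, float pz,
                            float& u, float& v) {
    Vec4 clip = viewProj * Vec4{px, py, pz, 1.0f};
    if (clip.w <= 0.0f) return false;    // vertex is behind the projector
    u = (clip.x / clip.w) * 0.5f + 0.5f; // perspective divide, [-1,1] -> [0,1]
    v = (clip.y / clip.w) * 0.5f + 0.5f;
    return u >= 0.0f && u <= 1.0f && v >= 0.0f && v <= 1.0f;
}
```

  • Baking such per-vertex (u, v) samples into each mesh's unwrapped UV-map is what yields the colour information for the print file.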
  • FIG. 3A illustrates an exemplary embodiment of the generation of a print file from the projection of an image on a replica of a real-world object.
  • There is shown an image which is required to be printed on a real-world building.
  • There is shown a virtual replica of the building, which is displayed in the application and can be viewed by users to understand and feel the exact look of the building when the image is printed on it.
  • There is shown a UV print file which is generated from the said image and is ready for printing.
  • In Figs. 3B-3E there are shown different views of a set of virtual 3D mugs when an image is projected on the said set. The front view of the said mugs, with the handles behind them, is shown.
  • At 300a, an open scene for the mug-set, or any set scene, is shown in Fig. 3B.
  • The diagram shows each section of the mobile application user interface 618, wherein each section marked in user interface 300a is the same in all the images showing the user interface.
  • the PUBLISH button 325 is used when the User is finally happy with the result. It will pop-up a window asking the User to name the project. This name is used for the url address under the User's account 335.
  • the User has clicked IMAGE button 310 and loaded an image.
  • The User has also clicked all of the top row of the image-cells 355, which contain the UV map for the mug object 630, so the mobile application receives all the images of the mug 630 in user interface 300b.
  • The User has used the SIZE control to enlarge the image at 300c.
  • The User has used the ANGLE control 345. This rotates the object up to 360 degrees at 300d.
  • The User has used the arrow controls 330 to move the image around left/right/up/down at 300e, as shown in Fig. 3C.
  • The User has used the Zoom control 340 in user interface 300f. This is one of the most useful features, as it allows the user to check the resolution against real-world size by zooming in.
  • the arrow keys on the keyboard can be used to move the whole scene up/down/left/right, which is needed when zoomed in close.
  • The user has now enlarged the image to a full scene across the mugs in user interface 300g.
  • The user has frozen the image by using the freeze button 315 and is now checking the back of the mugs in user interface 300h. Notice some ugly areas.
  • The User has projected a last image onto the ugly part of one mug, found in user interface 300i. This shows all the images added on the back of the scene to cover the ugly areas.
  • Fig. 3E shows the UV maps used on the mug, as shown in the image cells of the user interface.
  • Figs. 3B to 3E show how a user can use the same interface to perform various functions. They also show, sequentially, how a user projects images onto a 3D object in a real-world example.
  • Among graphics utilities, printing on apparel has been the most popular.
  • the said method 400 starts at step 405 where a 2D image is selected for projection by a user.
  • The texture is exactly the same as that of a real-life projector projecting an image onto any 3D object, as seen on the surface where the projection hits, and this disclosure enables such projection.
  • The user selects and projects a 2D image onto a virtual 3D object.
  • A virtual 3D object could represent any real-world object upon which the user wants the image to be printed.
  • Such a real-world object could be a hat, a coffee mug, apparel, a building, a car, and the like.
  • The present disclosure enables such projection on virtual 3D forms of any real-world objects.
  • At step 410, the surface of the virtual 3D object is divided into a plurality of projection-texture mesh surfaces. Such division ensures a better distribution and alignment of the 2D image over the surfaces of the virtual 3D object.
  • At step 415, unwrapping of the mesh is carried out. Each of the plurality of projection-texture mesh surfaces is unwrapped to obtain a UV-map for each projection-texture mesh. It is the UV-map which will be able to generate a print-file that can be applied to the real-world replica of the 3D model. Since a UV-map is generated from each of the mesh-parts used to construct the 3D model, and the same 3D model can be constructed from many different combinations of meshes, it is always practical to make sure that the construction of the 3D model from its mesh components was done in such a way as to generate a UV-map that is suitable for printing and application to the real-world model.
  • At step 420, pixels are added at the edges of the UV map, thereby carrying out edge padding of the UV maps.
  • The colours of the said pixels are the same as those of their adjacent pixels.
  • The said pixels appear as printed dots, at a resolution usually between 150 and 300 dpi. However, this should not be construed as a limitation; other resolutions could be used.
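  • As a worked example of how a seam-allowance width in inches maps to a number of padded pixels at a given print resolution, consider the conversion below; the function name is an assumption, and the 26-pixel figure for the hat discussed later corresponds to roughly 104 dpi.

```cpp
#include <cmath>

// Convert a seam allowance in inches to a pixel count at a given print
// resolution, e.g. 0.25 in at 104 dpi -> 26 px, at 300 dpi -> 75 px.
int seamAllowancePixels(double inches, double dpi) {
    return static_cast<int>(std::lround(inches * dpi));
}
```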
  • Using the current method of edge-padding, UV maps can be generated for clothing and stitchable articles which require seam-allowance.
  • Such a process of edge-padding is also capable of being applied to UV maps for printing on cars.
  • If the mesh that creates the UV-map in sections is made so that the whole door is itself an individual mesh, then the door is printed alone, with edge-padding applied to it to account for the part which will be wrapped around to the inside of the door.
  • An ample amount of edge-padding for each section is provided to ensure efficient and accurate manual application on the physical article.
  • The final step includes generating unwrapped 2D print files for printing on the real-world 3D objects.
  • The present disclosure is capable of providing virtual replicas of 100 different sizes of apparel, from which the one that fits the customer can be selected. It is observed that in the market there is a limited number of sizes available in stores, whereas there are many different human body sizes and shapes. Therefore, it becomes difficult for people to select the best size of garment.
  • the edge padding of the UV maps enables such generation of garments of varying sizes.
  • the present systems and methods require input from the user and then select the best fit from among the 100 existing sizes for the said user.
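  • A minimal sketch of such a best-fit selection is given below; the measurement fields, function name, and least-squares criterion are all illustrative assumptions, since the disclosure does not specify how the fit is computed.

```cpp
#include <cstddef>
#include <vector>

// Pick the best-fitting size: the stored replica whose measurements are
// closest (least-squares) to the user's input. Fields are assumptions.
struct Measurements { double chestIn, waistIn, lengthIn; };

std::size_t bestFit(const Measurements& user,
                    const std::vector<Measurements>& sizes) {
    std::size_t best = 0;
    double bestErr = 1e300;
    for (std::size_t i = 0; i < sizes.size(); ++i) {
        double dc = sizes[i].chestIn - user.chestIn;
        double dw = sizes[i].waistIn - user.waistIn;
        double dl = sizes[i].lengthIn - user.lengthIn;
        double err = dc*dc + dw*dw + dl*dl;
        if (err < bestErr) { bestErr = err; best = i; }
    }
    return best; // index into the e.g. 100 stored replica sizes
}
```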
  • Virtual replica of the hat is shown in 605.
  • At 610 there is shown the surface of the virtual hat divided into a plurality of projection-texture mesh surfaces. Such division ensures a better distribution and alignment of the 2D image over the surfaces of the virtual 3D object.
  • At 615 there is shown the projection of an image on the virtual 3D hat.
  • At 620 there is shown a user interface of the application where the projected image is shown on the hat and the various divided textures are also shown.
  • Figs. 5A and 5H provide an example of a hat when an image is required to be printed on the hat.
  • Figs. 5A and 5H show the planar projection-texture of an image onto a hat. This gives the hat a continuity of image over the entire hat.
  • In the method implemented by the software, the software is adapted to receive the colour information of the 2D image projection 615 and export this information as a 2D UV-texture map according to the UV-space that includes the UV-segments 610/625, which are exactly the same as the patterns to be used in manufacturing except that they are 0.25 inches smaller around the edge than the manufacturing pattern, which includes the seam-allowance.
  • As shown in Figs. 5A-5H, edge-bleeding is disclosed.
  • The plurality of eight meshes or UV-segments 610/625 that make up the 3D hat are connected in space simply by their coordinates.
  • If the hat has seam-allowance added to it simply by printing any colour around each segment to extend its edges by 0.25 inches (which in the case of this hat was determined to be 26 pixels), then when the seams are sewn together it is more than likely, especially over time, that the white seam edges, or whatever colour they were, will become visible against the backdrop of the original imagery on the hat.
  • The present invention provides a way of automating the manner in which seam-allowance is dealt with: edge-bleeding.
  • The present invention provides continuity of the image across a seam when the apparel is made in the real world, by providing continuity across the seam of the apparel's 3D model when using projection-texture in the virtual world. With projection-texture, seam-continuity is not a problem because the UV-map is not used as the reference for the texture on the model; instead, projectors are locked to the model, permanently and continuously projecting the image onto it, as shown in Fig. 5A at 615. But when models are to be animated this cannot be done: UV-map texture is needed, and these UV-maps are so complex that they cannot easily be designed on. A method therefore allows a user to paint directly onto a 3D model, called polypainting, by using projection-texture.
  • Using an orthographic X-ray projection-texture as the starting point for the base design of any apparel gets the best results, as shown in Fig. 5A at 615. Consecutive projections are then used only for stamping logos or smaller images onto the apparel.
  • The same shaped hat (as far as the outer layer is concerned) can be made in two different ways, the only difference being where the seam-lines and stitching are; this difference is therefore only aesthetic. This shows how any UV-map of a 3D model of any real-world object which requires pieces/parts (UV islands or segments) to be stitched or seamed together in order to construct it can be converted into a plurality of manufacturing patterns, no matter how the 3D model was itself built in parts from its meshes. Meshes are themselves constructed of polygons, conveniently organised into sections so that each may be defined as an entire section which can then be computationally (automatically) unwrapped by a computing device and flattened out into what is referred to as a UV-map.
  • UV-maps 610, 635, 640 with edge-bleeding are created using the present invention; the technique is not so simple to discover, but like most things, once discovered it is simple, hence this disclosure.
  • The present invention provides a method for converting UV-maps into manufacturing patterns by using projection-texture for customer-designed apparel, and it is the use of projection-texture which benefits the most from solving the seam-allowance problem in this manner, so as to ensure that, whatever pattern the User created, the seams will preserve its continuity after the stitching process.
  • The stitches themselves are an unavoidable part of the manufacturing process and are considered an aesthetic quality.
  • In the Object Control Panel 618 of Fig. 5B there is a choice of colours for the User to decide on.
  • The stitches, like the white rivets that can be seen, are themselves mesh objects, but they are excluded by default from the projection-texture, i.e. they are invisible to it.
  • The mobile application using the present invention allows customers to choose the stitching or rivet colour and visualize it simultaneously whilst they are playing with the projection-texture tools to discover a design they fancy purchasing.
  • FIGs. 5A-5J2 illustrate edge-bleeding and its effect on a 2D image.
  • Figs. 5I-5J2 show a sunflower 630, which is suitable for this illustration, at step 1.
  • The software application or method is adapted to separate the sunflower 630 into two parts (635, 640), which is identical to separating two parts (635, 640) onto a UV map at step 2. The only difference is that this sunflower image 630 is separated, in Figs. 5I and 5J2, into two UV-segments (635, 640) in the UV-map, and then edge-bleeding 645, 646 is applied to the two UV-segments (635, 640) respectively at step 3.
  • At step 4, the two UV-segments (635, 640) with edge-bleeding 645, 646 are stitched or joined by identifying the edges of the two UV-segments (635, 640) and matching the RGB values of those edges.
  • At steps 3 and 4 a dotted box has been placed covering the same border-width at which the edge-bleeding exists. It is not even filled in with all white, and yet it is immediately visible. This is what would happen if seam allowance were added to the UV map simply by adding white.
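  • The cross-seam matching of step 4 can be sketched as follows (illustrative assumptions only: the two segments' seam edges are given as equal-length lists of pixel coordinates in corresponding order, with known outward directions into each bleed margin). Each segment's seam-allowance margin takes its colours from the matching edge of the other segment, so A's printed allowance continues B's imagery and vice versa.

```cpp
#include <opencv2/opencv.hpp>
#include <vector>

// For each corresponding pair of seam-edge pixels on segments A and B,
// copy B's edge colour into A's seam-allowance margin and vice versa,
// so the imagery continues across the sewn seam. The caller guarantees
// the margin pixels stay inside the image bounds.
void matchSeamColours(cv::Mat& img,
                      const std::vector<cv::Point>& edgeA,
                      const std::vector<cv::Point>& edgeB,
                      cv::Point outwardA, cv::Point outwardB,
                      int padding) {
    CV_Assert(img.type() == CV_8UC4 && edgeA.size() == edgeB.size());
    for (std::size_t i = 0; i < edgeA.size(); ++i) {
        for (int d = 1; d <= padding; ++d) {
            img.at<cv::Vec4b>(edgeA[i] + outwardA * d) = img.at<cv::Vec4b>(edgeB[i]);
            img.at<cv::Vec4b>(edgeB[i] + outwardB * d) = img.at<cv::Vec4b>(edgeA[i]);
        }
    }
}
```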
  • The UV map generated by the App may be processed by an algorithm/instructions which process the image data to provide the desirable result shown in Figs. 5I-5J2. The UV-islands/segments sit on an alpha canvas where all of the alpha channel values are zero; this is what the algorithm uses to know where to operate:
  • Input: a high-resolution PNG file with 4 channels (Blue, Green, Red & Alpha).
  • Edge detection: detect edges of the image with the help of the alpha channel information; alpha changes from zero to non-zero, and vice versa, at edges.
  • Here rgbImg is the current pixel, rgbImg_right is the pixel to its right, and rgbImg_bottom is the pixel in the row below.
  • Edge type 1: the pixel alpha value becomes zero.
  • Edge type 2: the pixel alpha value becomes non-zero.
  • These instructions are then written in C++ and compiled as an executable that inputs the UV-map and transforms it in the manner described above.
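  • A minimal sketch of such an executable is given below, assuming OpenCV for PNG input/output (the disclosure does not name a library, and the function and variable names are illustrative). It treats every zero-alpha pixel with an opaque 4-neighbour as an edge and bleeds the segment's border colours outward one pixel per pass, e.g. 26 passes for a 0.25-inch seam allowance.

```cpp
#include <opencv2/opencv.hpp>

// Bleed each UV-segment's edge colours outward into the transparent canvas.
// img is a 4-channel BGRA image; padding is the seam allowance in pixels.
void edgeBleed(cv::Mat& img, int padding) {
    CV_Assert(img.type() == CV_8UC4);
    const int dx[4] = {1, -1, 0, 0}, dy[4] = {0, 0, 1, -1};
    for (int pass = 0; pass < padding; ++pass) {
        cv::Mat prev = img.clone(); // read from the previous pass only
        for (int y = 0; y < img.rows; ++y) {
            for (int x = 0; x < img.cols; ++x) {
                cv::Vec4b& px = img.at<cv::Vec4b>(y, x);
                if (px[3] != 0) continue; // already opaque, skip
                // Edge type 2: alpha goes from zero to non-zero across this
                // boundary; copy the colour of any opaque 4-neighbour.
                for (int k = 0; k < 4; ++k) {
                    int nx = x + dx[k], ny = y + dy[k];
                    if (nx < 0 || ny < 0 || nx >= img.cols || ny >= img.rows)
                        continue;
                    const cv::Vec4b& n = prev.at<cv::Vec4b>(ny, nx);
                    if (n[3] != 0) { px = n; px[3] = 255; break; }
                }
            }
        }
    }
}

int main(int argc, char** argv) {
    if (argc < 3) return 1; // usage: edgebleed <in.png> <out.png>
    cv::Mat uvMap = cv::imread(argv[1], cv::IMREAD_UNCHANGED);
    if (uvMap.empty() || uvMap.channels() != 4) return 1;
    edgeBleed(uvMap, 26); // 0.25 in seam allowance at ~104 dpi
    return cv::imwrite(argv[2], uvMap) ? 0 : 1;
}
```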
  • A user might want an image to be printed on a portion of an object rather than printing the image on the whole object.
  • the embodiments of the present disclosure enable such selection by the user for printing on an object.
  • There is shown a virtual replica of a hat. As discussed earlier, the virtual hat is divided into a plurality of projection-texture mesh surfaces. Each of the projection-texture mesh surfaces is unwrapped along with at least one part of the 2D image located on the virtual 3D object. The user may opt to select only the top portion of the hat from among the various portions and project the image selectively on that top portion. Thereby, an unwrapped 2D print file is generated according to the present systems and methods.
  • Fig. 6A illustrates a section of the hat where the image is projected, which consists of two meshes; this section is then used to make 3D examples of each version of the solution.
  • Fig. 6A shows the two meshes removed from the hat, and then shows edge-bleeding applied to the two meshes.
  • When the edge-bleeding is applied in Fig. 6A, it extends the border or edges of the UV islands or segments and provides the edge-bleeding on the seam allowance. After completing the edge-bleeding on the seam allowance, the result shown in Fig. 6A is obtained after stitching: the edge-bleeding on the seam allowance provides continuity of the image and the hat will be the right size, whereas a plain colour added merely for the seam-allowance would remain visible at the seam edge.
  • FIGs. 7A-7C show an exemplary embodiment of the present invention.
  • The figures show an avatar with a full scene of clothing, and the ability to swap different clothes. If the User clicks on the Hat object, the Object Options Window opens, where the User can search for another hat to replace the current model they are using. It is possible to use kitchen sets, bathroom sets, apparel, or any real-world 3D object in this Application, because of what it makes possible by using projection-texture only.
  • FIGs. 8A-8D show an exemplary embodiment of the present invention.
  • This example exemplifies the beauty of using projection-texture.
  • One object can be partially (or wholly) blocking the view of the other and still be hit by the projection-image. Notice in the figure that the female partner has been removed from the scene. This is done by clicking on the Object in the Object Box; double-clicking it will open the Object Options Window.
  • a projection can be removed from an object by double-clicking on the image-cell for that projection on that object.
  • FIGs. 9A-9B demonstrate the technique and its effect when seen by the observer from the viewpoint of the projector-camera, and from a viewpoint away from the camera.
  • Fig. 9A-B shows an example of the result attainable using perspective projection-texture. There are countless types of projection-texture employed in 3D animation software to solve particular objectives, beginning with the simplest, perspective, then planar, then cylindrical, then spherical, after which the complexity becomes limitless.
  • Perspective projection, as seen in Fig. 9A-B, is the preferred type of projection when you want a targeted audience, in a known location of observation, to see the 2D photo illusion on the 3D object it is projected on as an advertisement. Indeed, sunlight or any light source would partially give it away; however, with a matte (non-reflective) surface finish the illusion can be upheld.
  • Planar projection, as mentioned above, is one of the two types of projection-texturing used in the present invention.
  • Planar projection works best when you want to use X-ray projection of a pattern through an apparel item, so as to keep the symmetry of the pattern consistent; otherwise, when using perspective projection, the projected image/artwork/pattern will come out larger on the other side. In most cases you want continuity of the image-pattern through the whole object to its other side, such as with clothes.
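  • The difference can be seen in a small sketch (illustrative assumptions only, not from the disclosure): under perspective projection the projected footprint grows linearly with depth, so the pattern comes out larger on the far side of a garment, whereas planar (orthographic) projection ignores depth entirely and keeps the pattern symmetric through the object.

```cpp
#include <cstdio>

// Where a ray through image-plane offset u lands on a surface at depth z.
// Perspective: footprint grows with depth (focal length f assumed given).
double perspectiveFootprint(double u, double z, double f) { return u * z / f; }

// Planar (orthographic): footprint is independent of depth.
double planarFootprint(double u) { return u; }

int main() {
    // Front of a garment at z = 2.0, back at z = 2.2: the perspective
    // footprint is 10% larger on the back; the planar one is unchanged.
    std::printf("perspective: front %.2f, back %.2f\n",
                perspectiveFootprint(0.5, 2.0, 1.0),
                perspectiveFootprint(0.5, 2.2, 1.0));
    std::printf("planar:      front %.2f, back %.2f\n",
                planarFootprint(0.5), planarFootprint(0.5));
    return 0;
}
```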
  • Fig. 9 A-B is an example of an application of perspective-projection that can also be used in this App to be disclosed below.
  • This photo was taken whilst standing up.
  • Aircraft passengers typically spend 99% of their time sitting down; when standing up, they are either looking for their seat or going to the bathroom. The other time they are standing is when they are opening the cabin doors. From the seated position it was not possible to capture the image well enough, thus it was taken standing up.
  • The point made in this diagram appears when comparing it to Fig. 9B of the Mona Lisa on the Kombi: it is possible to achieve a projection-texture 2D image-plane that stares down at the passenger from the seated observation position they occupy for the majority of the flight, as shown in Fig. 9B.

Abstract

The present invention relates to a method enabling customers to use a 3D visualization of real-world products to create customized surface designs using projection-texture alone. The method comprises projecting one or more 2D images consecutively onto a virtual 3D object, dividing the surface of said object into projection-texture mesh surfaces, and unwrapping the 3D object on each projection-texture mesh surface to obtain a UV map for producing a 2D print file to be used in creating a real-world replica of the 3D visualization. In the case of products which require stitching to be constructed in the real world, the present invention also provides a solution for adding seam allowance to the print files in a manner that resolves the apparent discontinuity caused by the stitching process, despite a normal seam allowance being added; this method ensures an illusion of continuity where, without it, there would be a perceptible discontinuity at the seam.
PCT/IB2018/051396 2017-03-04 2018-03-05 UV print files unwrapped from a camera projection WO2018163042A1 (fr)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US201762467089P 2017-03-04 2017-03-04
US62/467,089 2017-03-04
US201762578527P 2017-10-30 2017-10-30
US62/578,527 2017-10-30

Publications (1)

Publication Number Publication Date
WO2018163042A1 (fr) 2018-09-13

Family

ID=63448076

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2018/051396 WO2018163042A1 (fr) 2017-03-04 2018-03-05 UV print files unwrapped from a camera projection

Country Status (1)

Country Link
WO (1) WO2018163042A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2022191831A1 (fr) * 2021-03-10 2022-09-15 Nanyang Technological University Placement d'éléments de conception sur des surfaces 3d

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040049309A1 (en) * 2001-01-19 2004-03-11 Gardner James Holden Patrick Production and visualisation of garments
US20090144173A1 (en) * 2004-12-27 2009-06-04 Yeong-Il Mo Method for converting 2d image into pseudo 3d image and user-adapted total coordination method in use artificial intelligence, and service business method thereof
US20150351477A1 (en) * 2014-06-09 2015-12-10 GroupeSTAHL Apparatuses And Methods Of Interacting With 2D Design Documents And 3D Models And Generating Production Textures for Wrapping Artwork Around Portions of 3D Objects



Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18764617

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18764617

Country of ref document: EP

Kind code of ref document: A1