EP2159755A1 - Knitwear simulation method, apparatus for the method, and storage medium - Google Patents

Knitwear simulation method, apparatus for the method, and storage medium

Info

Publication number
EP2159755A1
EP2159755A1 (application EP08753121A)
Authority
EP
European Patent Office
Prior art keywords
knitwear
image
human body
body model
view point
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
EP08753121A
Other languages
German (de)
English (en)
Other versions
EP2159755A4 (fr)
Inventor
Shinji Yamamoto
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shima Seiki Mfg Ltd
Original Assignee
Shima Seiki Mfg Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shima Seiki Mfg Ltd filed Critical Shima Seiki Mfg Ltd
Publication of EP2159755A1 publication Critical patent/EP2159755A1/fr
Publication of EP2159755A4 publication Critical patent/EP2159755A4/fr
Withdrawn legal-status Critical Current

Classifications

    • A - HUMAN NECESSITIES
    • A41 - WEARING APPAREL
    • A41H - APPLIANCES OR METHODS FOR MAKING CLOTHES, e.g. FOR DRESS-MAKING OR FOR TAILORING, NOT OTHERWISE PROVIDED FOR
    • A41H3/00 - Patterns for cutting-out; Methods of drafting or marking-out such patterns, e.g. on the cloth
    • A41H3/007 - Methods of drafting or marking-out patterns using computers
    • D - TEXTILES; PAPER
    • D04 - BRAIDING; LACE-MAKING; KNITTING; TRIMMINGS; NON-WOVEN FABRICS
    • D04B - KNITTING
    • D04B37/00 - Auxiliary apparatus or devices for use with knitting machines
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F30/00 - Computer-aided design [CAD]
    • G06F30/20 - Design optimisation, verification or simulation
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/00 - 3D [Three Dimensional] image rendering
    • G06T15/50 - Lighting effects
    • G06T15/60 - Shadow generation
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F2113/00 - Details relating to the application field
    • G06F2113/12 - Cloth

Definitions

  • The present invention relates to the simulation of a condition where knitwear is applied onto a human body model or the like, and particularly to simulation that expresses the details of a thread, such as fluff, as well as the shadow generated on the human body model by the knitwear.
  • The inventor has been developing a knitwear simulation method. Simulation of a condition where knitwear is applied onto a human body model or a cloth is carried out three-dimensionally, and the details of a thread such as fluff, and the shadow generated on the human body model or the like by the knitwear, have to be expressed in order to carry out realistic simulation.
  • The thread can be expressed by a pipe, and the fluff can be expressed by fine pipes that project from the thread body. However, such a process lengthens the time required for the simulation.
  • The knitwear casts a shadow on the human body model, and it is difficult to obtain this shadow from the threads of the knitwear by means of ray tracing.
  • Patent Literature 1 ( WO 2003/032203 ) and Patent Literature 2 ( WO 2004/090213 ) disclose a method for expressing fluff of a thread by two-dimensionally simulating knitwear.
  • Patent Literature 3 ( WO 2005/082185 ) discloses three-dimensional simulation of knitwear by applying knitwear virtually on a human body model.
  • An object of the present invention is to realistically simulate knitwear by means of stitches with fluff, and to simulate a shadow generated on the human body model by the knitwear, at high speed.
  • The present invention is a method for simulating a condition where knitwear is applied onto a human body model by using design data and thread data of the knitwear, and a three-dimensional image of the human body model, the method being characterized in having the steps of:
  • It is preferred that the method be further provided with a step of preparing an image of a fabric worn inside the knitwear on the human body model, wherein the darkening step darkens the color image of the human body model and a color image of the fabric at the shadow part of the knitwear with a position resolution lower than that of the stitches, and the combining step uses the opacity image of the knitwear to combine the color image of the human body model, the color image of the fabric, and the two-dimensional color image of the knitwear.
  • The image of the fabric is a three-dimensional image when the simulation is performed on a simple piece of cloth, or a two-dimensional image similar to that of the knitwear together with a two-dimensional opacity image when the simulation is performed on a knit fabric.
  • It is preferred that the method be further provided with a step of creating a two-dimensional color image of the human body model and an image expressing its position in a depth direction viewed from the view point, and a two-dimensional color image of the fabric and an image expressing its position in the depth direction viewed from the view point, on the basis of the position of the view point and the position of the light source; that an image expressing the position of the knitwear in the depth direction viewed from the view point be created in addition to the two-dimensional color image and opacity image of the knitwear; that the range where the images of the knitwear overlap with the image of the human body model or the fabric be obtained as the shadow part of the knitwear; and that the image combination be performed by obtaining the front-rear relationship of the human body model, the fabric, and the knitwear relative to the view point, on the basis of the three images expressing the positions in the depth direction.
  • It is also preferred that the method be further provided with a step of creating a two-dimensional color image of the human body model and an image expressing its position in a depth direction viewed from the view point, and a two-dimensional color image of the fabric and an image expressing its position in the depth direction viewed from the view point, on the basis of the position of the view point and the position of the light source; that an image expressing the position of the knitwear in the depth direction viewed from the view point be created in addition to the two-dimensional color image and opacity image of the knitwear; that a part where the knitwear blocks the light emitted from the light source toward the human body model or the fabric be obtained as the shadow part, on the basis of the three images expressing the positions in the depth direction; and that the image combination be performed by obtaining the front-rear relationship of the human body model, the fabric, and the knitwear relative to the view point, on the basis of the three images expressing the positions in the depth direction.
  • The present invention is also an apparatus for simulating a condition where knitwear is applied onto a human body model by using design data and thread data of the knitwear, and a three-dimensional image of the human body model, the apparatus being characterized in having:
  • The present invention is also a computer-readable recording medium, which stores a program for simulating a condition where knitwear is applied onto a human body model by using design data and thread data of the knitwear, and a three-dimensional image of the human body model, the storage medium being characterized in storing the program for causing a computer to execute the steps of:
  • The description of the simulation method applies directly to the simulation apparatus and the simulation program, and the description of the simulation apparatus applies directly to the simulation method and the simulation program.
  • Examples of the fabric include clothes such as garments and scarves, as well as knitwear.
  • An embodiment simulates a condition where a piece of cloth is applied onto the human body model and the knitwear is applied thereon, but the middle cloth is not always required.
  • The stitches of the knitwear are expressed not by a three-dimensional model but by a two-dimensional model composed of a thread body and fluff.
  • The fluff is expressed not by a tube of a three-dimensional model, or by polygons on the surface of such a tube, but by a simple two-dimensional image.
  • The shadow cast on the human body model by the knitwear is expressed as an average shadow of the knitwear, instead of computing the shadow of each individual thread by means of ray tracing.
  • An image of the human body model and an image of the knitwear are combined using the opacity of the knitwear as a parameter.
  • The simulation can be carried out similarly even when an image of another fabric is disposed between the knitwear and the human body model.
  • In that case, the shadow of the knitwear may be cast on the color image of the human body model and the color image of the fabric, and the color image of the human body model, the color image of the fabric, and the two-dimensional color image of the knitwear may be combined using the opacity image of the knitwear.
  • The images can then be combined simply by processing the images of the knitwear, the human body model, and the fabric as two-dimensional color images.
  • The front-rear relationship among the knitwear, the human body model, and the fabric can be processed easily by providing each of them with an image expressing its position in the depth direction.
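The depth-image bookkeeping just described can be sketched in a few lines. The following is a hypothetical Python/NumPy illustration, not the patent's implementation: each layer carries a color image and a depth image, and, per pixel, the layer with the smallest depth relative to the view point wins.

```python
import numpy as np

def composite_by_depth(layers):
    """Pick, per pixel, the color of the layer nearest the view point.

    Each layer is a (color, z) pair: color is an (H, W, 3) array, z an
    (H, W) depth image with np.inf where the layer has no content.
    Hypothetical sketch of resolving the front-rear relationship from
    depth images; the patent does not spell out the algorithm.
    """
    colors = np.stack([c for c, _ in layers])   # (N, H, W, 3)
    depths = np.stack([z for _, z in layers])   # (N, H, W)
    nearest = np.argmin(depths, axis=0)         # winning layer index per pixel
    h, w = nearest.shape
    return colors[nearest, np.arange(h)[:, None], np.arange(w)[None, :]]
```

Where the knitwear layer is empty (depth = infinity), the body or fabric layer shows through automatically.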
  • The simplest approach is to obtain the overlap of a two-dimensional image of the knitwear, such as its color image, with a two-dimensional image of the human body model or the fabric, and to treat the overlapping range as the shadow. This corresponds to a model in which the section covered by the knitwear forms a shadow.
  • Alternatively, the shadow may be obtained from the three depth images of the knitwear, the human body model, and the fabric, as the range where the knitwear, or particularly the contour of the knitwear, blocks the light from the light source. This corresponds to a model in which the shadow cast by the semi-transparent knitwear is obtained by ray tracing with a position resolution coarser than that of individual threads.
  • The shadow of the knitwear may be obtained as soon as the filament data and the three-dimensional data of the human body model and fabric are available. Although the shadows of individual threads are not obtained at this stage, the range of the shadow of the knitwear can be obtained.
  • Figs. 1 to 5 show an embodiment.
  • In Fig. 1, reference numeral 2 represents a simulation apparatus, 4 a bus, 6 a stylus, 8 a mouse, and 10 a keyboard. These components are used as manual input devices; a trackball or joystick may be added as a further manual input device.
  • Reference numeral 12 represents a color monitor, 14 a color printer, 16 a communication unit for communicating with a LAN and the Internet, and 18 a disk drive for reading/writing data from/to an appropriate disk.
  • Reference numeral 20 represents a design data editing part with which the stylus 6, mouse 8 or keyboard 10 is used to generate design data of the knitwear.
  • A data converter converts the design data into knitting data for driving a knitting machine, such as a flat knitting machine.
  • A filament data generating part 24 converts the knitting data into filament data: three-dimensional data that expresses the positions of the stitches, the types of the stitches, and the connecting relations between the stitches. The position of each stitch may be specified by, for example, a single three-dimensional point representing the stitch; in this case, the type of the stitch is added as an attribute. When the position of a stitch is instead specified by the coordinates of a plurality of points, the type of the stitch can be found from the shape of the stitch, so the type need not be stored.
  • A scene setting part 26 sets the position of a light source and the position of a view point used in the simulation.
  • A human body model storage unit 30 stores a three-dimensional color image of a human body model, and a cloth data storage unit 32 stores a three-dimensional color image of a garment made of cloth worn under the knitwear on the human body model.
  • This garment is simply referred to as "cloth" hereinafter.
  • These three-dimensional color images are composed of polygons and textures in which the human body model and the cloth are opaque; however, the cloth may be semi-transparent, in which case the opacity of the cloth is also stored.
  • A thread data storage unit 34 stores color images and opacities of the threads used in the knitwear, wherein the color images of the threads include a thread body part and a fluff part.
  • Reference numeral 35 represents an application processor, which, as in WO 2005/082185, applies virtual knitwear to the human body model or cloth by using the filament data. Note that the cloth is not taken into consideration in WO 2005/082185, but the application of the knitwear can be simulated in the same manner by, for example, applying the cloth to the human body model first and treating the applied cloth as a rigid body or an elastic body that is deformed by collisions with the stitches of the knit.
  • Reference numeral 36 is a hidden surface remover, which uses the three-dimensional data to remove from the simulation targets the sections of the knitwear, the cloth, and the human body model that are hidden from the view point.
  • A knit rendering part 38 renders loop images in which the stitches remaining in the filament data of the knitwear after hidden surface removal are viewed from the position of the view point, and then connects the loop images to obtain a two-dimensional color image of the section of the knitwear that can be viewed from the view point.
  • This image contains an image of the thread body and an image of the fluff; the position in the depth direction (Z coordinate) relative to the view point and an image showing the opacity α of the thread body and fluff are added to it.
  • Image data items are, for example, RGB, Z and α.
  • The knit rendering part 38 connects the individual loop images in accordance with the filament data to create the two-dimensional color image (the layer of the knitwear), and further creates a depth image (Z image) and an image of the opacity α (mask image).
  • A shadow image creating part 40 creates an image of the average shadow cast on the human body model or cloth by the knitwear.
  • A ray tracing part 41 performs ray tracing on the images of the human body model, the cloth, and the knit. However, ray tracing is not performed to determine how the knitwear blocks the light emitted toward the human body model or the cloth. Furthermore, since the cloth is opaque, ray tracing to determine whether the cloth blocks the light emitted toward the human body model is not performed either. Note that the cloth may be semi-transparent, in which case the shadow cast by the cloth on the human body model may be obtained; the shadow image creating part 40 may then obtain the shadow of the cloth on the human body model in the same manner as the shadow of the knitwear.
  • The color image of the human body model and the color image of the cloth data are converted into two-dimensional color images viewed from the view point. These two-dimensional color images are called "layers."
  • To each layer, an image showing the position in the depth direction relative to the view point is added, and ray tracing and shadowing are carried out.
  • The resulting two-dimensional color image of the human body model and color image of the cloth data are stored in a layer storage unit 42.
  • The images showing the positions of the human body model and the cloth data in the depth direction are also stored in the layer storage unit 42.
  • The layers to be created are thus the two-dimensional color images of the knitwear, the cloth, and the human body model, the image of the opacity of the knitwear (mask image), the image of the shadow of the knitwear, and the mask image of the cloth.
  • An image combining part 44 combines the four images of the knitwear, the human body model, the cloth, and the background.
  • For this combination, the image of the opacity α of the knitwear is used.
  • The obtained image is stored in an image memory 50, is then displayed on the color monitor 12, and can be output from the color printer 14 or the communication unit 16.
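As an illustration of combining the layers with the knitwear's opacity image as the parameter, a minimal "over" blend might look like the following sketch. The function names are assumptions, and it assumes a body layer that fills the frame and an opaque cloth with a 0/1 mask, as described above; it is not the patent's implementation.

```python
import numpy as np

def over(front, alpha, back):
    """Standard 'over' blend: front layer weighted by its opacity image."""
    a = alpha[..., None]               # broadcast (H, W) mask over RGB
    return a * front + (1.0 - a) * back

def combine_scene(body, knit, knit_alpha, cloth=None, cloth_mask=None):
    """Stack body <- cloth <- knitwear, bottom to top.

    body, knit, cloth: (H, W, 3) color layers.
    knit_alpha: (H, W) opacity image of the knitwear (mask image).
    cloth_mask: (H, W) 0/1 mask, since the cloth is assumed opaque.
    """
    out = body
    if cloth is not None:
        out = over(cloth, cloth_mask, out)   # opaque cloth over the body
    return over(knit, knit_alpha, out)       # semi-transparent knit on top
```

With `knit_alpha` at 0.5, half of the underlying body or cloth color shows through the knitwear, which is how the brightness of the body seen through the knit changes with thread thickness.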
  • Fig. 2 shows a flow of the data in the embodiment.
  • The design data is converted into the knitting data, which is then converted into the filament data.
  • The filament data expresses the stitch positions by specifying one or more three-dimensional points for each stitch, and carries the types of the stitches and the connecting relations between them as attributes.
  • The human body model and the cloth data are both in the form of three-dimensional color images.
  • The stitches of the filament data are arranged around the human body model or the cloth.
  • The positions of the view point and the light source are set; any section of the knitwear that is hidden by the human body model or the cloth is removed, and any section of the cloth that is hidden by the human body model is also removed.
  • The three-dimensional color image of the human body model is converted into a two-dimensional color image viewed from the view point.
  • Likewise, the color image of the cloth is converted into a two-dimensional color image of the cloth viewed from the view point.
  • The filament data of the knitwear is converted into a two-dimensional color image viewed from the view point.
  • The target stitches are those that were not removed by hidden surface removal.
  • The color image of the thread body and fluff is attached along the filament of each stitch, and at the same time an image of the opacity of the thread body and fluff is created.
  • The color image and the opacity of the thread body and fluff are stored in the thread data storage unit 34.
  • Alternatively, a stitch image in which the stitches are viewed from the front is created once, and this image is rotated so that it is viewed from the view point.
  • The images are combined based on the opacity, and the opacity is increased where they overlap. Because the three-dimensional position of each stitch is described in the filament data, the coordinate Z of the stitch in the depth direction relative to the view point is generated from the three-dimensional position of the stitch.
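Generating Z from the stitch's three-dimensional position amounts to projecting that position onto the viewing direction. The sketch below is a hypothetical illustration (the function and parameter names are assumptions; the patent states only that Z is generated from the stitch position in the filament data).

```python
import numpy as np

def stitch_depth(stitch_pos, view_point, view_dir):
    """Depth Z of one stitch relative to the view point.

    Projects the stitch position (from the filament data) onto the
    normalized viewing direction.  Smaller Z means nearer the view point.
    """
    d = np.asarray(view_dir, dtype=float)
    d /= np.linalg.norm(d)                      # unit viewing direction
    p = np.asarray(stitch_pos, dtype=float) - np.asarray(view_point, dtype=float)
    return float(p @ d)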
  • Ray tracing is performed within the range of the knitwear so that, where threads overlap, the threads farther from the light source become darker.
  • Ray tracing of the light emitted from the light source is performed on the human body model without taking the cloth or the knitwear into consideration. Likewise, ray tracing of the light emitted from the light source is performed on the cloth by taking the human body model, but not the knitwear, into consideration.
  • The shadow cast by the knitwear is then added to the color image of the human body model and the color image of the cloth.
  • This shadow, which is an average shadow of the knitwear, does not express the shadows cast by individual threads of the knitwear.
  • The images of the layers of the knitwear, the human body model, and the cloth are combined.
  • The composite image is displayed on the color monitor.
  • Fig. 3 shows a process ranging from the formation of a color image of the knitwear to shadowing.
  • Reference numerals 60, 61 represent knitting courses where stitches 62, 63 are arrayed. The dashed lines in Fig. 3 show the connecting relations of the stitches between the knitting courses.
  • Reference numeral 64 represents the light source, and 65 the view point.
  • An image of the stitch 62 viewed from the view point 65 is composed of a hatched thread body section and its peripheral fluff section. These sections are obtained by rotating an image in which the stitch 62 is viewed from the front, or by attaching an image of the thread body or fluff to the filament viewed from the view point.
  • Opacity is applied to the thread body or fluff when the thread data is obtained, and when a thread overlaps with another thread due to the rotation, the opacity is increased at this overlapping section.
  • In this way, a two-dimensional color image of one stitch viewed from the view point 65, an image of the depth direction, and an image of the opacity are obtained.
  • The stitches of the filament data are then overlaid; where stitches overlap with each other, they are combined based on the value of Z and the opacity α.
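Increasing the opacity where semi-transparent threads overlap can be modeled by the standard alpha-accumulation rule shown below. The formula is an assumption (it treats the two coverages as independent); the patent says only that the opacity "is increased" at the overlapping section.

```python
def combine_opacity(a_front, a_back):
    """Combined opacity where two semi-transparent threads overlap.

    The result is always >= max(a_front, a_back) and <= 1, so overlap
    makes the section more opaque, never less.
    """
    return 1.0 - (1.0 - a_front) * (1.0 - a_back)
```

For example, two threads of opacity 0.5 give a combined opacity of 0.75 where they cross.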
  • Ray tracing is carried out within the knit structure.
  • Fig. 4 shows a model for performing shadowing.
  • Reference numeral 70 represents a layer of the color image of the knitwear, and the value of this image is expressed by P1.
  • Reference numeral 71 represents a mask image layer of the knitwear, and the value of this image is expressed by α.
  • Reference numeral 72 represents an image layer of the shadow of the knitwear, and the value of this image is expressed by σ.
  • Reference numeral 73 is a color image layer of the cloth, and the value of this image is expressed by P2.
  • Reference numeral 75 is an image layer of the human body model, and the value of this image is expressed by P3. The positional relationship of these layers to the view point is processed by the hidden surface remover 36.
  • the knitwear forms the uppermost layer
  • the cloth forms the middle layer
  • the human body model forms the lowermost layer.
  • The value σ of the shadow image is obtained by blurring, i.e., averaging, the two-dimensional mask image in both the x and y directions.
  • The degree of the blur is defined by specifying the width to be blurred in each of the x and y dimensions.
  • The shadow image is also slid relative to the mask image in the x and y directions, and the values Sx, Sy by which it is slid can be specified by the user.
  • For example, the shadow image is slid by 5 dots in the x direction and 3 dots in the y direction with respect to the mask image.
  • When the shadow image is black-and-white, σ is one-dimensional; when the shadow image is in color, σ becomes three-dimensional data having the RGB components.
  • A color shadow has a color value obtained by blurring the color value of the knitwear.
  • The darkness of the shadow can also be specified: the value α of the mask can be smoothed as it is and then slid, or it can first be weakened to, for example, approximately 10 to 50%, then smoothed and slid. Because the cloth is opaque, the value of the cloth's mask is 1 (with the cloth) or 0 (without the cloth).
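The weaken-smooth-slide procedure for deriving the shadow image from the mask image can be sketched as follows. The box blur and the zero fill at the slid-in edges are assumptions made for this illustration; the patent specifies only averaging over a chosen width and sliding by (Sx, Sy) dots.

```python
import numpy as np

def shadow_from_mask(alpha, strength=0.3, radius=2, sx=5, sy=3):
    """Build a shadow image sigma from the knit mask image alpha.

    Weakens the mask (e.g. to 10-50%), averages it over a
    (2*radius+1)-wide box window in both x and y, then slides it by
    (sx, sy) dots toward +x/+y with zero fill (sx, sy >= 0 assumed).
    """
    s = strength * np.asarray(alpha, dtype=float)
    h, w = s.shape
    padded = np.pad(s, radius)                  # zero-pad for the box blur
    blurred = np.zeros_like(s)
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            blurred += padded[radius + dy:radius + dy + h,
                              radius + dx:radius + dx + w]
    blurred /= (2 * radius + 1) ** 2
    slid = np.zeros_like(blurred)               # slide with zero fill
    slid[sy:, sx:] = blurred[:h - sy, :w - sx]
    return slid
```

The darkened underlying layer is then obtained per pixel as P * (1 - σ), which softens and offsets the shadow relative to the knit contour, suggesting the direction of the light source.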
  • The composite image can then be obtained easily by using the image combining part 44.
  • The darkness or blurriness of the shadow can be specified to render the shadow realistically, and the direction of the light source can be expressed by sliding the shadow.
  • Figs. 5a) to 5h) show a simulation image obtained in the embodiment.
  • The image of the thread used is shown in the upper left of a) to d); the same human body model, cloth, and knitwear design data are used throughout.
  • An enlargement of the simulation image is shown in e) to h), wherein the brightness of the human body model seen through the knitwear is changed by the thickness or the like of the thread. Further, a condition where the fluffy knitwear is applied can be simulated.
  • The simulation apparatus 2 of the present invention is realized by installing the simulation program on a computer; the computer reads the simulation program from a storage medium such as a CD-ROM or through a carrier wave.
  • Fig. 2 also corresponds to a block diagram of the simulation program, which executes each of the processes shown in Fig. 2.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Textile Engineering (AREA)
  • Computer Graphics (AREA)
  • Computer Hardware Design (AREA)
  • Evolutionary Computation (AREA)
  • Geometry (AREA)
  • General Engineering & Computer Science (AREA)
  • Processing Or Creating Images (AREA)
  • Image Generation (AREA)
EP08753121.6A 2007-06-13 2008-05-26 Procédé de simulation de tricots, appareil pour le procédé, et support de stockage Withdrawn EP2159755A4 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2007155912 2007-06-13
PCT/JP2008/060100 WO2008152931A1 (fr) 2007-06-13 2008-05-26 Procédé de simulation de tricots, appareil pour le procédé, et support de stockage

Publications (2)

Publication Number Publication Date
EP2159755A1 true EP2159755A1 (fr) 2010-03-03
EP2159755A4 EP2159755A4 (fr) 2014-05-07

Family

ID=40129541

Family Applications (1)

Application Number Title Priority Date Filing Date
EP08753121.6A Withdrawn EP2159755A4 (fr) 2007-06-13 2008-05-26 Procédé de simulation de tricots, appareil pour le procédé, et support de stockage

Country Status (4)

Country Link
EP (1) EP2159755A4 (fr)
JP (1) JP5161213B2 (fr)
CN (1) CN101689307B (fr)
WO (1) WO2008152931A1 (fr)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4792521B2 (ja) * 2009-12-15 2011-10-12 株式会社アイ.エス.テイ 布製品識別装置および布製品把持システム
DE102011106401A1 (de) * 2011-07-02 2013-01-03 H. Stoll Gmbh & Co. Kg Verfahren und Vorrichtung zur Maschendarstellung
CN106608201B (zh) 2015-10-26 2019-04-19 比亚迪股份有限公司 电动车辆及其主动安全控制系统和方法
CN109137245B (zh) * 2018-09-27 2019-11-08 北京大豪科技股份有限公司 手套机机头控制方法、装置、设备及存储介质

Citations (2)

Publication number Priority date Publication date Assignee Title
EP0640707A1 (fr) * 1993-08-31 1995-03-01 SHIMA SEIKI MFG., Ltd. Système de conception de tricot et procédé pour concevoir des étoffes tricotées
WO2005082185A1 (fr) * 2004-02-26 2005-09-09 Shima Seiki Manufacturing, Ltd. Procédé et dispositif de simulation du port d’un vêtement en tricot sur une silhouette humaine et programme prévu à cet effet

Family Cites Families (1)

Publication number Priority date Publication date Assignee Title
JP4189339B2 (ja) * 2004-03-09 2008-12-03 日本電信電話株式会社 3次元モデル生成方法と生成装置およびプログラムと記録媒体

Patent Citations (2)

Publication number Priority date Publication date Assignee Title
EP0640707A1 (fr) * 1993-08-31 1995-03-01 SHIMA SEIKI MFG., Ltd. Système de conception de tricot et procédé pour concevoir des étoffes tricotées
WO2005082185A1 (fr) * 2004-02-26 2005-09-09 Shima Seiki Manufacturing, Ltd. Procédé et dispositif de simulation du port d’un vêtement en tricot sur une silhouette humaine et programme prévu à cet effet

Non-Patent Citations (3)

Title
Anonymous: "Drop shadow - Wikipedia, the free encyclopedia", 28 February 2007 (2007-02-28), page 1, XP055110590, Retrieved from the Internet: URL:http://en.wikipedia.org/w/index.php?title=Drop_shadow&oldid=111628878 [retrieved on 2014-03-28] *
BAINING GUO ET AL: "Realistic rendering and animation of knitwear", IEEE TRANSACTIONS ON VISUALIZATION AND COMPUTER GRAPHICS, IEEE SERVICE CENTER, LOS ALAMITOS, CA, US, vol. 9, no. 1, 1 January 2003 (2003-01-01) , pages 43-55, XP011095508, ISSN: 1077-2626, DOI: 10.1109/TVCG.2003.1175096 *
See also references of WO2008152931A1 *

Also Published As

Publication number Publication date
CN101689307A (zh) 2010-03-31
WO2008152931A1 (fr) 2008-12-18
JP5161213B2 (ja) 2013-03-13
JPWO2008152931A1 (ja) 2010-08-26
CN101689307B (zh) 2012-02-29
EP2159755A4 (fr) 2014-05-07

Similar Documents

Publication Publication Date Title
US10510183B2 (en) Graphics processing enhancement by tracking object and/or primitive identifiers
JP3294149B2 (ja) 立体テクスチャマッピング処理装置及びそれを用いた3次元画像生成装置
KR20170132840A (ko) 가상 현실 환경 내부에서의 드로잉에 의한 3 차원 패션 객체들의 생성
JP4233547B2 (ja) 画像表示処理方法
JP2006055213A (ja) 画像処理装置、及びプログラム
JP3777149B2 (ja) プログラム、情報記憶媒体及び画像生成装置
JP2023553507A (ja) 特注仕様製品の合成データ表示の高品質レンダリング表示を得るためのシステムおよびその方法
EP2159755A1 (fr) Procédé de simulation de tricots, appareil pour le procédé, et support de stockage
JP4267646B2 (ja) 画像生成装置、画像生成方法、ならびに、プログラム
KR20060108271A (ko) 디지털 패션 디자인용 실사 기반 가상 드레이핑 시뮬레이션방법
KR20010047046A (ko) 제트버퍼를 이용한 입체영상 생성방법
EP2164049B1 (fr) Appareil de simulation de pliage de vêtements tricotés, procédé de simulation et support de stockage
JP2001028064A (ja) ゲーム機における画像処理方法及び当該方法を実行可能なプログラムを記憶した記憶部
KR100927326B1 (ko) 2 차원 아바타 이미지 처리 방법 및 이 방법이 실행가능한 프로그램으로 기록된 기록 매체
JP3491832B2 (ja) ゲーム装置および情報記憶媒体
JP4161613B2 (ja) 画像処理方法
JP3586253B2 (ja) テクスチャマッピングプログラム
JP4517447B2 (ja) 3次元モデルの画像処理方法及びその装置
JP2000339499A (ja) テクスチャマッピング・テクスチャモザイク処理装置
JP5065374B2 (ja) ニット製品のシミュレーション装置とシミュレーション方法
JP6035057B2 (ja) 三次元画像表示装置及び三次元画像表示方法
JP2002056405A (ja) テクスチャマッピング処理装置
US11321899B1 (en) 3D animation of 2D images
JP2020013390A (ja) 情報処理装置、情報処理プログラム及び情報処理方法
JP2007094680A (ja) 画像処理装置、画像処理方法等

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

17P Request for examination filed

Effective date: 20091211

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MT NL NO PL PT RO SE SI SK TR

AX Request for extension of the european patent

Extension state: AL BA MK RS

DAX Request for extension of the european patent (deleted)
A4 Supplementary search report drawn up and despatched

Effective date: 20140407

RIC1 Information provided on ipc code assigned before grant

Ipc: D04B 37/00 20060101ALN20140401BHEP

Ipc: G06T 15/60 20060101AFI20140401BHEP

Ipc: G06F 17/50 20060101ALI20140401BHEP

Ipc: A41H 43/00 20060101ALN20140401BHEP

Ipc: G06T 19/00 20110101ALI20140401BHEP

17Q First examination report despatched

Effective date: 20150814

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION HAS BEEN WITHDRAWN

18W Application withdrawn

Effective date: 20170519