WO2023218861A1 - Simulation Device - Google Patents
- Publication number: WO2023218861A1 (PCT/JP2023/015234)
- Authority: WIPO (PCT)
Classifications
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T13/00—Animation
- G06T13/20—3D [Three Dimensional] animation
- G06T13/40—3D [Three Dimensional] animation of characters, e.g. humans, animals or virtual beings
- G06T19/00—Manipulating 3D models or images for computer graphics
Definitions
- the present invention relates to a simulation device.
- simulation devices have been used that simulate the appearance of users trying on virtual costumes.
- a simulation device is sometimes used that generates a video in which the user moves while trying on virtual costumes in accordance with the user's own movements.
- Patent Document 1 discloses a virtual try-on system that generates try-on images by moving the user's 3D body shape data three-dimensionally to generate motion body shape data, and dressing the 3D body shape at each time point included in the motion body shape data with a costume indicated by costume data.
- Patent Document 1 does not consider the order in which multiple costumes are layered. Specifically, Patent Document 1 describes, as an example, a case where a T-shirt and jeans are worn. However, the technology according to Patent Document 1 does not consider which of the T-shirt and the jeans is closer to the user's body.
- an object of the present invention is to provide a simulation device that can specify and simulate the order in which two or more costumes are layered when the user wears them.
- a simulation device includes a processing device that simulates a case where a user wears a plurality of costumes by referring to a plurality of costume data in one-to-one correspondence with the plurality of costumes.
- each of the plurality of costume data includes shape data indicating a three-dimensional shape of the costume, and tag data associated with the shape data and indicating the order in which the costume is layered when worn by the user.
- the processing device includes a reception unit that receives input of two or more costumes selected by the user from among the plurality of costumes, and a specifying unit that specifies the stacking order of the two or more costumes by referring to the tag data of the two or more costume data respectively corresponding to the two or more costumes received by the reception unit.
- the processing device further includes an image generation unit that generates a first composite image in which three-dimensional images of the two or more costumes are superimposed on a three-dimensional image of the user according to the specified stacking order.
- FIG. 1 is a diagram showing the overall configuration of a simulation system 1 according to a first embodiment.
- FIG. 2 is a block diagram showing a configuration example of a scanning device 20.
- FIG. 3 shows a configuration example of a first data set DS1.
- FIG. 4 shows a configuration example of costume data CD.
- FIG. 5 is a table showing an example of the correspondence between costume IDs and stacking order indexes.
- FIG. 6 shows an installation example of imaging devices 24.
- FIG. 7 is a block diagram showing a configuration example of a server 30.
- FIG. 8 shows a configuration example of a first database DB1.
- FIG. 9 is a block diagram showing a configuration example of a terminal device 10.
- FIG. 10 shows a configuration example of a second database DB2.
- FIG. 11 shows a configuration example of corresponding data RD.
- FIG. 12 shows an example of an operation screen OM.
- FIG. 13 is a functional block diagram of an image generation unit 116.
- FIG. 14 is an explanatory diagram of the operation of the image generation unit 116.
- FIG. 15 is a flowchart showing the operation of the terminal device 10.
- FIG. 1 shows the overall configuration of a simulation system 1 according to the first embodiment.
- the simulation system 1 includes a terminal device 10, a scanning device 20, and a server 30.
- the terminal device 10, the scanning device 20, and the server 30 are communicably connected to each other via the communication network NET.
- the terminal device 10 is a device that allows an end user U to simulate his or her own appearance when trying on two or more virtual costumes.
- the terminal device 10 specifies the order in which multiple costumes are stacked on top of each other, that is, the stacking order, and then simulates a full-body image of the end user U trying on two or more virtual costumes.
- the terminal device 10 is an example of a "simulation device".
- the scanning device 20 three-dimensionally scans two or more actual costumes and generates image data indicating a three-dimensional image of each costume and costume data regarding the characteristics of each costume, both of which are used when the terminal device 10 performs a simulation.
- the costume data includes a stacking order index used by the terminal device 10 to specify the stacking order of two or more costumes.
- the scanning device 20 outputs the generated image data and costume data as one data set to the server 30.
- the server 30 acquires image data and costume data from the scanning device 20. Additionally, the server 30 outputs image data to the terminal device 10. Furthermore, the server 30 acquires a costume ID indicating the ID of the costume that the end user U tries on from the terminal device 10, and outputs costume data corresponding to the acquired costume ID to the terminal device 10.
- FIG. 2 is a block diagram showing an example of the configuration of the scanning device 20.
- the scanning device 20 includes a processing device 21 , a storage device 22 , a communication device 23 , an imaging device 24 , a display 25 , and an input device 26 .
- the elements of scanning device 20 are interconnected using one or more buses for communicating information.
- the processing device 21 is a processor that controls the entire scanning device 20. Further, the processing device 21 is configured using, for example, a single chip or a plurality of chips. The processing device 21 is configured using, for example, a central processing unit (CPU) that includes an interface with peripheral devices, an arithmetic unit, registers, and the like. Note that some or all of the functions of the processing device 21 may be implemented using hardware such as a DSP, ASIC, PLD, and FPGA. The processing device 21 executes various processes in parallel or sequentially.
- the storage device 22 is a recording medium that can be read and written by the processing device 21. Furthermore, the storage device 22 stores a plurality of programs including the control program PR2 executed by the processing device 21. The storage device 22 also stores a first data set DS1.
- the first data set DS1 is a data set corresponding to a three-dimensional model of each costume.
- FIG. 3 shows an example of the configuration of the first data set DS1.
- the first data set DS1 is a set of costume ID, image data PD, and costume data CD.
- the costume ID indicates the ID of the costume scanned by the scanning device 20.
- the first letter of the costume ID corresponds to the type of costume, for example, a T-shirt, a long-sleeved shirt, or pants. Further, the number following the letter indicates the item number of the costume.
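As an illustrative sketch of the ID convention described above (the exact format and the letter-to-type mapping below are assumptions for illustration, not defined by the patent), a costume ID such as "B12" could be split into a type and an item number:

```python
# Hypothetical sketch: the first letter of a costume ID encodes the
# costume type, and the digits that follow encode the item number.
COSTUME_TYPES = {
    "A": "T-shirt",
    "B": "long-sleeved shirt",
    "C": "sweater/hoodie",
    "D": "jacket",
    "E": "pants",
    "F": "coat",
}

def parse_costume_id(costume_id: str) -> tuple[str, int]:
    """Split a costume ID like 'B12' into (type name, item number)."""
    letter, number = costume_id[0], int(costume_id[1:])
    return COSTUME_TYPES[letter], number
```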
- the image data PD indicates a three-dimensional image obtained as a result of three-dimensional scanning of the costume.
- the costume data CD is data regarding the characteristics of the costume.
- the costume data CD includes data obtained by analyzing the three-dimensional image and data input by the user of the scanning device 20.
- FIG. 4 shows an example of the configuration of the costume data CD.
- the costume data CD includes shape data FD and tag data GD.
- Shape data FD is data indicating the three-dimensional shape of the costume.
- the shape data FD includes structure data SD, texture data TD, and joint data JD.
- the structure data SD is data regarding the structure of the costume. Specifically, the structure data SD includes data indicating what shape and thickness of cloth is placed at what position in one costume, and how the cloths are joined to each other.
- the texture data TD includes data indicating the material, stiffness, color, pattern, gloss, and texture of each cloth.
- the joint data JD includes data indicating positions corresponding to the joints of the general wearer in the costume.
- the joint data JD includes data indicating the relative position of the skeleton of the general wearer with respect to the costume.
- the joint data JD indicates a predetermined range as a position in a costume that corresponds to the position of the skeleton and joints of the wearer of the costume.
- the tag data GD is data used to classify costumes.
- the tag data GD includes, for example, a stacking order index used by the terminal device 10 to specify the stacking order, that is, the order in which a plurality of costumes are stacked on top of each other.
- FIG. 5 is a table showing an example of the correspondence between costume IDs and stacking order indexes. Note that in FIG. 5, for convenience of explanation, examples of three-dimensional images of each costume indicated by image data PD corresponding to each costume ID are also shown.
- the stacking order index corresponding to a costume whose first letter in the costume ID is "A", that is, a T-shirt, is "1".
- the stacking order index corresponding to a costume whose first letter in the costume ID is "B", that is, a long-sleeved shirt, is "3".
- the stacking order index corresponding to a costume whose first letter in the costume ID is "C", that is, a sweater or a hoodie, is "4".
- the stacking order index corresponding to a costume whose first letter in the costume ID is "D", that is, a jacket, is "5".
- the stacking order index corresponding to a costume whose first letter in the costume ID is "E", that is, pants, is "2".
- the stacking order index corresponding to a costume whose first letter in the costume ID is "F", that is, a coat, is "6".
- the terminal device 10 uses three-dimensional images of two or more costumes to simulate the appearance of the end user U trying on the two or more costumes.
- the stacking order of the two or more costumes is specified using the stacking order index. Specifically, the terminal device 10 overlays a three-dimensional image of a costume with a relatively large stacking order index over a three-dimensional image of a costume with a relatively small stacking order index.
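The stacking rule above can be sketched as a simple sort, using the index assignments from the FIG. 5 example (T-shirt = 1, pants = 2, long-sleeved shirt = 3, sweater/hoodie = 4, jacket = 5, coat = 6). The function name and dict-based lookup are our own illustration:

```python
# Stacking order indexes from the FIG. 5 example, keyed by the
# first letter of the costume ID (illustrative assumption).
STACKING_INDEX = {"A": 1, "E": 2, "B": 3, "C": 4, "D": 5, "F": 6}

def stacking_order(selected_ids: list[str]) -> list[str]:
    """Return the selected costume IDs sorted innermost first; a costume
    with a larger index is overlaid on one with a smaller index."""
    return sorted(selected_ids, key=lambda cid: STACKING_INDEX[cid[0]])
```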
- part or all of the costume data CD may be automatically generated by the generation unit 212 using the analysis results of the analysis unit 213, which will be described later.
- some or all of these costume data CDs may be generated based on input information input by the user of the scanning device 20 via the input device 26.
- the communication device 23 is hardware as a transmitting and receiving device for communicating with other devices.
- the communication device 23 is also called, for example, a network device, a network controller, a network card, a communication module, or the like.
- the communication device 23 may include a connector for wired connection and an interface circuit corresponding to the connector. Examples of connectors and interface circuits for wired connection include products compliant with wired LAN, IEEE 1394, and USB.
- the communication device 23 may also include a wireless communication interface. Examples of the wireless communication interface include products compliant with wireless LAN, Bluetooth (registered trademark), and the like.
- the imaging device 24 images the outside world where the object exists.
- the imaging device 24 images the costume.
- the imaging device 24 outputs imaging information indicating an image obtained by imaging the costume.
- the imaging device 24 includes, for example, a lens, an imaging element, an amplifier, and an AD converter.
- the light collected through the lens is converted by the imaging element into an imaging signal, which is an analog signal.
- the amplifier amplifies the imaging signal and outputs it to the AD converter.
- the AD converter converts the amplified imaging signal, which is an analog signal, into imaging information, which is a digital signal.
- the converted imaging information is output to the processing device 21.
- FIG. 6 shows an example of the installation of the imaging device 24 in this embodiment.
- the scanning device 20 includes eight imaging devices 24-1 to 24-8. Note that eight is only an example; the scanning device 20 can include any number of imaging devices 24.
- the imaging devices 24-1 to 24-8 are fixed to a frame F, and image the costume C placed in the hollow inside the frame F from 360° around three axes: top and bottom, left and right, and front and back.
- a generation unit 212 which will be described later, generates a three-dimensional image of the costume C based on imaging information indicating a plurality of images captured by the imaging devices 24-1 to 24-8.
- the display 25 is a device that displays images and text information.
- the display 25 displays various images under the control of the processing device 21.
- various display panels such as a liquid crystal display panel and an organic EL (Electro Luminescence) display panel are suitably used as the display 25.
- the processing device 21 reads the control program PR2 from the storage device 22 and executes it. As a result, the processing device 21 functions as an acquisition section 211, a generation section 212, an analysis section 213, and a communication control section 214.
- the acquisition unit 211 acquires imaging information indicating a captured image of the costume C from the imaging device 24.
- the acquisition unit 211 also acquires input information input by the user of the scanning device 20 via the input device 26 .
- the input information includes, for example, a stacking order index.
- the generation unit 212 generates image data PD representing a three-dimensional image of the costume C based on the imaging information acquired by the acquisition unit 211 from the imaging device 24. Further, the generation unit 212 generates costume data CD using the input information acquired by the acquisition unit 211 from the input device 26. Furthermore, the generation unit 212 generates the costume data CD using also analysis information indicating the analysis result obtained by analyzing the three-dimensional image of the costume C by the analysis unit 213, which will be described later. The generation unit 212 also generates a first data set DS1, which is a data set including a costume ID and a set of image data PD and costume data CD corresponding to the costume ID. The generation unit 212 stores the generated first data set DS1 in the storage device 22.
- the analysis unit 213 analyzes the three-dimensional image of the costume C generated by the generation unit 212.
- the analysis unit 213 extracts, for example, features related to the shape of the costume C as a result of analyzing the three-dimensional image. Analysis information indicating the extracted features is output to the generation unit 212.
- the generation unit 212 generates costume data CD using the analysis information acquired from the analysis unit 213.
- the communication control unit 214 causes the communication device 23 to transmit the first data set DS1 stored in the storage device 22 to the server 30.
- the user of the scanning device 20 can easily create the image data PD and the costume data CD without having to manually input, one by one, all of the data elements that make up the image data PD and the costume data CD. Further, the simulation system 1 can simulate virtual try-on using the image data PD and costume data CD produced in this simple manner.
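The first data set DS1 described above bundles a costume ID with its image data PD and costume data CD. A minimal sketch of that structure, with field names and types assumed for illustration only:

```python
from dataclasses import dataclass

# Illustrative sketch of the first data set DS1: a costume ID bundled
# with image data PD and costume data CD, where the costume data carries
# shape data FD and tag data GD (here reduced to the stacking order index).
@dataclass
class CostumeData:          # costume data CD
    shape: bytes            # shape data FD (structure, texture, joint data)
    stacking_index: int     # tag data GD: stacking order index

@dataclass
class FirstDataSet:         # first data set DS1
    costume_id: str
    image: bytes            # image data PD (3D image of the costume)
    costume: CostumeData
```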
- FIG. 7 is a block diagram showing an example of the configuration of the server 30.
- the server 30 includes a processing device 31, a storage device 32, a communication device 33, a display 34, and an input device 35.
- Each element included in the server 30 is interconnected using one or more buses for communicating information.
- the processing device 31 is a processor that controls the entire server 30. Further, the processing device 31 is configured using, for example, a single chip or a plurality of chips. The processing device 31 is configured using, for example, a central processing unit (CPU) that includes an interface with peripheral devices, an arithmetic unit, registers, and the like. Note that some or all of the functions of the processing device 31 may be implemented using hardware such as a DSP, ASIC, PLD, and FPGA. The processing device 31 executes various processes in parallel or sequentially.
- the storage device 32 is a recording medium that can be read and written by the processing device 31. Furthermore, the storage device 32 stores a plurality of programs including the control program PR3 executed by the processing device 31. The storage device 32 also stores a first database DB1.
- FIG. 8 shows an example of the configuration of the first database DB1.
- the first database DB1 is a database in which a first data set DS1 acquired from the scanning device 20 via the communication device 33 by the acquisition unit 311, which will be described later, is accumulated.
- the communication device 33 is hardware as a transmitting/receiving device for communicating with other devices.
- the communication device 33 is also called, for example, a network device, a network controller, a network card, a communication module, or the like.
- the communication device 33 may include a connector for wired connection and an interface circuit corresponding to the connector. Examples of connectors and interface circuits for wired connection include products compliant with wired LAN, IEEE 1394, and USB.
- the communication device 33 may also include a wireless communication interface. Examples of the wireless communication interface include products compliant with wireless LAN, Bluetooth (registered trademark), and the like.
- the display 34 is a device that displays images and text information.
- the display 34 displays various images under the control of the processing device 31.
- various display panels such as a liquid crystal display panel and an organic EL (Electro Luminescence) display panel are suitably used as the display 34.
- the input device 35 accepts operations from the user of the server 30.
- the input device 35 includes a keyboard, a touch pad, a touch panel, or a pointing device such as a mouse.
- the input device 35 may also serve as the display 34.
- the acquisition unit 311 acquires the first data set DS1 from the scanning device 20 via the communication device 33.
- the acquisition unit 311 stores the acquired first data set DS1 in the first database DB1.
- the acquisition unit 311 also acquires selection information indicating the selection result of the costume selected by the end user U from the terminal device 10 via the communication device 33, as described later.
- the extraction unit 312 extracts costume data CD from the first database DB1 based on the selection information acquired by the acquisition unit 311. More specifically, the extraction unit 312 uses the costume ID included in the selection information to extract costume data CD linked to the costume ID from the first database DB1.
- the communication control unit 313 causes the communication device 33 to transmit the sets of costume ID and image data PD stored in the first database DB1 to the terminal device 10. As an example, the communication control unit 313 causes the communication device 33 to transmit all pairs of costume ID and image data PD stored in the first database DB1 to the terminal device 10. Furthermore, the communication control unit 313 outputs the costume data CD extracted by the extraction unit 312 to the terminal device 10 via the communication device 33, as corresponding data RD paired with the costume ID to which the costume data CD corresponds.
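The server-side extraction step above can be sketched as a lookup: DB1 maps each costume ID to its data set, and the IDs in the selection information select the costume data returned as corresponding data RD. The dict-based DB1 and the function name are illustrative assumptions:

```python
# Sketch of the extraction unit 312 / communication control unit 313 flow:
# given the selection information (a list of costume IDs), look each ID up
# in the first database DB1 and pair it with its costume data CD.
def extract_corresponding_data(db1: dict, selected_ids: list[str]) -> list[tuple]:
    """Return (costume ID, costume data CD) pairs as corresponding data RD."""
    return [(cid, db1[cid]["costume_data"]) for cid in selected_ids]
```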
- FIG. 9 is a block diagram showing an example of the configuration of the terminal device 10.
- the terminal device 10 includes a processing device 11 , a storage device 12 , a communication device 13 , an imaging device 14 , a display 15 , and an input device 16 .
- Each element included in the terminal device 10 is interconnected using a single bus or multiple buses for communicating information.
- the processing device 11 is a processor that controls the entire terminal device 10. Further, the processing device 11 is configured using, for example, a single chip or a plurality of chips. The processing device 11 is configured using, for example, a central processing unit (CPU) that includes an interface with peripheral devices, an arithmetic unit, registers, and the like. Note that some or all of the functions of the processing device 11 may be implemented using hardware such as a DSP, an ASIC, a PLD, and an FPGA. The processing device 11 executes various processes in parallel or sequentially.
- the storage device 12 is a recording medium that can be read and written by the processing device 11. Furthermore, the storage device 12 stores a plurality of programs including the control program PR1 executed by the processing device 11. The storage device 12 also stores a second database DB2, corresponding data RD, body shape data BD, and learning model LM.
- FIG. 10 shows an example of the configuration of the second database DB2.
- the second database DB2 is a database that stores a set of costume ID and image data PD acquired from the server 30 via the communication device 13 by the acquisition unit 111, which will be described later.
- the second database DB2 stores all pairs of costume IDs and image data PD stored in the server 30.
- FIG. 11 shows an example of the configuration of the corresponding data RD.
- the corresponding data RD is data that an acquisition unit 111 (described later) acquires from the server 30 via the communication device 13.
- the corresponding data RD is a set of the costume ID included in the selection information output from the terminal device 10 to the server 30 and the costume data CD extracted by the server 30 using the costume ID.
- the body shape data BD is data representing the body shape of the end user U in three dimensions. Specifically, the body shape data BD is data indicating a three-dimensional image representing the body shape of the end user U.
- the body shape data BD includes skeleton data KD representing the skeleton of the end user U. More specifically, the skeleton data KD represents a change in the posture of the end user's U skeleton in accordance with the end user's U motion.
- the skeleton data KD includes, for example, temporal and discrete data indicating the posture of the skeleton of the end user U at a plurality of points in time during the period in which the end user U uses the terminal device 10. Furthermore, the skeleton data KD includes data regarding the joints of the end user U.
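The skeleton data KD described above is time-discrete: a posture sample per time point. A minimal sketch of such a representation, with the frame structure and lookup function assumed for illustration:

```python
from dataclasses import dataclass

# Illustrative sketch of skeleton data KD: time-discrete samples of the
# end user U's skeleton posture, each sample holding 3D joint positions.
@dataclass
class SkeletonFrame:
    timestamp: float                                    # sampling time
    joints: dict[str, tuple[float, float, float]]       # joint name -> position

def posture_at(frames: list[SkeletonFrame], t: float) -> SkeletonFrame:
    """Return the most recent sampled posture at or before time t."""
    candidates = [f for f in frames if f.timestamp <= t]
    return max(candidates, key=lambda f: f.timestamp)
```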
- the learned model LM is generated outside the terminal device 10.
- as an example, the learned model LM may be generated in a server (not shown).
- the terminal device 10 acquires the learned model LM from a server (not shown) via the communication network NET.
- the imaging device 14 images the outside world where the object exists.
- the imaging device 14 captures a full-body image of the end user U.
- the imaging device 14 captures a full-body image of the end user U who has visited the store in the state of clothing at the time of the visit.
- the imaging device 14 outputs imaging information indicating a captured image obtained by imaging the end user U.
- the imaging device 14 includes, for example, a lens, an imaging element, an amplifier, and an AD converter.
- the light collected through the lens is converted by the imaging element into an imaging signal, which is an analog signal.
- the amplifier amplifies the imaging signal and outputs it to the AD converter.
- the AD converter converts the amplified imaging signal, which is an analog signal, into imaging information, which is a digital signal.
- the converted imaging information is output to the processing device 11.
- FIG. 12 is an example of the operation screen OM displayed on the display 15.
- a list of three-dimensional images of costumes that are candidates for costumes to be tried on by the end user U is displayed in the leftmost column.
- three-dimensional images of candidates for a T-shirt, long-sleeved shirt, sweater or hoodie, jacket, pants, and coat are displayed in order from the top.
- these three-dimensional images are represented by the image data PD that is stored in the second database DB2 and linked to the costume ID of each costume.
- the end user U selects an outfit to try on from among these outfit candidates using the input device 16, which will be described later.
- the acquisition unit 111 acquires the set of costume ID and image data PD and the corresponding data RD from the server 30 via the communication device 13.
- the acquisition unit 111 stores the set of costume ID and image data PD acquired from the server 30 in the second database DB2.
- the acquisition unit 111 also stores the corresponding data RD acquired from the server 30 in the storage device 12 .
- the acquisition unit 111 acquires imaging information indicating a captured image of the end user U's whole body from the imaging device 14 .
- the stacking order specifying unit 115 specifies the stacking order of the two or more costumes selected by the end user U, based on the tag data GD included in the costume data CD acquired by the acquisition unit 111. As described above, the stacking order specifying unit 115 specifies that a costume with a larger stacking order index is stacked further outward as viewed from the three-dimensional image showing the body shape of the end user U.
- the stacking order specifying unit 115 is an example of a "specifying unit".
- the image generation unit 116 generates the first composite image SP1 based on the body shape data BD and the two or more costume data CDs included in the corresponding data RD. Specifically, the image generation unit 116 generates the first composite image SP1 by superimposing the three-dimensional images of the two or more costumes on the three-dimensional image of the body shape of the end user U according to the stacking order specified by the stacking order specifying unit 115. More specifically, the image generation unit 116 reads the costume ID and costume data CD from the corresponding data RD. Next, the image generation unit 116 refers to the second database DB2 using the read costume ID, thereby reading out the image data PD linked to the costume ID.
- the image generation unit 116 superimposes the three-dimensional images of the costumes indicated by the image data PD on the three-dimensional image of the body shape of the end user U in the stacking order specified by the stacking order specifying unit 115, thereby generating the first composite image SP1.
- when the image generation unit 116 superimposes the 3D image of a costume on the 3D image of the body shape of the end user U, a more natural first composite image SP1 is generated by using the shape data FD included in the costume data CD.
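The compositing loop described above can be sketched as follows: starting from the body image, overlay each costume innermost first, so costumes with larger stacking indexes end up outermost. The list-of-layers representation and `render_over` are hypothetical stand-ins for the actual 3D mesh compositing:

```python
def render_over(base, layer):
    """Hypothetical stand-in for mesh compositing: record the layer order."""
    return base + [layer]

def compose_first_image(body_image, costumes):
    """costumes: list of (stacking_index, costume_image) pairs.
    Returns the first composite image SP1 as an ordered layer stack."""
    composite = list(body_image)
    for _, costume_image in sorted(costumes, key=lambda p: p[0]):
        composite = render_over(composite, costume_image)
    return composite
```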
- a three-dimensional image of the body shape of the end user U is represented by body shape data BD.
- a part of the three-dimensional image CM of the first costume is included inside the three-dimensional image BM of the body shape of the end user U.
- the area specifying unit 116-1 specifies the first area AR1.
- costume data CD is linked to the costume ID to which each of the first costume and the second costume corresponds.
- costume data CD includes joint data JD.
- the joint data JD includes data indicating, for the case where a typical wearer wears the costume, the range of positions in the costume that correspond to the joints of that typical wearer.
- the terminal device 10 can simulate, as a moving image, a full-body image of the end user U while trying on a plurality of costumes.
- the end user U can view a video in which a virtual full-body image of the end user U who has tried on a plurality of costumes moves in accordance with the user's own movements.
- the terminal device 10 moves the joint data JD included in the shape data FD indicating the three-dimensional shape of the costume in accordance with the movement of the skeleton data KD, which deforms in accordance with the movement of the end user U. This makes it possible to move the virtual full-body image of the end user U more naturally.
- the display control unit 117 causes the display 15 to display the operation screen OM shown in FIG. 12. In particular, the display control unit 117 causes the display 15 to display the first composite image SP1 generated by the image generation unit 116.
- FIG. 15 is a flowchart showing the operation of the terminal device 10 according to the first embodiment.
- step S1 the processing device 11 functions as the acquisition unit 111.
- the processing device 11 acquires imaging information indicating a captured image of the end user U's whole body from the imaging device 14 .
- step S4 the processing device 11 functions as the communication control unit 114.
- the processing device 11 causes the communication device 13 to transmit selection information indicating the selection result received in step S3 to the server 30.
- step S6 the processing device 11 functions as the stacking order specifying unit 115.
- the processing device 11 specifies the stacking order of two or more costumes based on the costume data CD included in the corresponding data RD acquired in step S5.
- the processing device 11 functions as the image generation unit 116.
- the processing device 11 generates a first composite image SP1 based on body shape data BD representing the body shape of the end user U in three dimensions and two or more pieces of costume data CD. Specifically, the processing device 11 generates the first composite image SP1 by superimposing the three-dimensional images of the two or more costumes on the three-dimensional image of the end user U, referring to the shape data FD included in the costume data CD, according to the stacking order of the two or more costumes specified in step S6.
- as described above, the terminal device 10 as a simulation device simulates a case where the end user U wears a plurality of costumes in layers, by referring to a plurality of costume data CDs in one-to-one correspondence with the plurality of costumes.
- each of the plurality of costume data CDs includes shape data FD indicating the three-dimensional shape of the costume, and tag data GD associated with the shape data FD and indicating the order in which the costume is layered when worn by the end user U.
- the terminal device 10 includes a receiving section 113, a stacking order specifying section 115, and an image generating section 116.
- the reception unit 113 receives two or more costumes selected by the end user U from among the plurality of costumes.
- the two or more costumes include the first costume worn by the end user U without any other costume interposed between the first costume and the body of the end user U.
- the image generation section 116 includes an area identification section 116-1 and a modification section 116-2.
- the area specifying unit 116-1 specifies a first area AR1.
- the first area AR1 is the area in which, when the three-dimensional image of the first costume is superimposed on the three-dimensional image of the end user U, a part of the three-dimensional image of the first costume sinks into the three-dimensional image of the end user U.
- the modification unit 116-2 performs a modification that pushes the part of the three-dimensional image of the first costume included in the first area AR1 out of the three-dimensional image of the end user U.
- since the terminal device 10 has the above configuration, when the end user U virtually tries on underwear, for example, the terminal device 10 can eliminate portions where the three-dimensional image of the underwear sinks into the three-dimensional image representing the body shape of the end user U. As a result, the end user U can view a full-body image of himself or herself wearing the underwear in a more natural state.
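The push-out modification can be illustrated with a deliberately simplified model in which the body surface is a sphere: any costume vertex whose distance from the body center is less than the body radius lies in the first area AR1 and is displaced radially outward. Real body shape data BD is of course not spherical, and the function below is a hypothetical sketch under that stated assumption, not the patented method.

```python
import math

def push_out_vertices(costume_vertices, body_center, body_radius, margin=0.005):
    """Push costume vertices that sank inside the (spherical, for simplicity)
    body surface back just outside it, along the outward radial direction."""
    fixed = []
    cx, cy, cz = body_center
    for x, y, z in costume_vertices:
        dx, dy, dz = x - cx, y - cy, z - cz
        dist = math.sqrt(dx * dx + dy * dy + dz * dz) or 1e-9
        if dist < body_radius:  # vertex lies inside the body: part of area AR1
            scale = (body_radius + margin) / dist
            fixed.append((cx + dx * scale, cy + dy * scale, cz + dz * scale))
        else:                   # vertex is already outside: leave it unchanged
            fixed.append((x, y, z))
    return fixed

# hypothetical data: one vertex sunk inside the unit-sphere "body", one outside
sunk = [(0.5, 0.0, 0.0), (2.0, 0.0, 0.0)]
fixed = push_out_vertices(sunk, (0.0, 0.0, 0.0), 1.0)
```

The small `margin` keeps the repositioned vertex slightly off the body surface, which is what makes the resulting try-on image look like cloth resting on skin rather than coinciding with it.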
- the two or more costumes include a second costume that the end user U wears over the first costume.
- the area specifying unit 116-1 also specifies a second area.
- the second area is the area in which, when the three-dimensional image of the second costume is superimposed on a second composite image SP2, which corresponds to the set of the three-dimensional image of the end user U and the three-dimensional image of the first costume, a part of the three-dimensional image of the second costume sinks into the second composite image SP2.
- the modification unit 116-2 performs a modification that pushes the part of the three-dimensional image of the second costume included in the second area out of the second composite image SP2.
- the body shape data BD includes skeleton data KD representing the skeleton of the end user U.
- the shape data FD includes joint data JD indicating the relative positions of the joints of the end user U with respect to the costume, on the assumption that the end user U wears the costume.
- the image generation unit 116 collates the skeleton data KD, which deforms in accordance with the motion of the end user U, with the joint data JD, and generates a moving image as the first composite image SP1.
- since the terminal device 10 has the above configuration, the end user U can view a moving image in which a virtual full-body image of the end user U trying on the two or more costumes moves in accordance with the end user's own movements.
- moreover, the terminal device 10 moves the joint data JD included in the shape data FD, which indicates the three-dimensional shape of the costume, in accordance with the movement of the skeleton data KD, which deforms in accordance with the movement of the end user U.
- as a result, the terminal device 10 can move the virtual full-body image of the end user U more naturally.
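As a sketch, if the joint data JD is stored as an offset of each costume joint relative to the corresponding skeleton joint, animating the costume reduces to re-evaluating those offsets against the skeleton data KD at every frame of the captured motion. The function name, the 2-D offset representation, and the sample data below are illustrative assumptions only.

```python
def animate_costume(joint_offsets, skeleton_frames):
    """Place each costume joint at the skeleton joint's position plus the
    costume's relative offset (joint data JD collated with skeleton data KD),
    once per frame of the end user's captured motion."""
    frames = []
    for skeleton in skeleton_frames:  # one dict of joint positions per frame
        frames.append({
            joint: (skeleton[joint][0] + dx, skeleton[joint][1] + dy)
            for joint, (dx, dy) in joint_offsets.items()
        })
    return frames

# hypothetical data: the costume's elbow sits 0.1 units outside the skeleton's elbow
offsets = {"elbow": (0.1, 0.0)}
motion = [{"elbow": (1.0, 2.0)}, {"elbow": (1.5, 2.0)}]
frames = animate_costume(offsets, motion)
```

Because the costume joints are defined relative to the skeleton, any motion applied to the skeleton data KD carries the costume along with it automatically.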
- the simulation system 1 includes the scanning device 20 and the terminal device 10 described above.
- the scanning device 20 three-dimensionally scans the costume described above to generate costume data CD.
- the tag data GD includes a stacking order index used by the terminal device 10 to specify the stacking order of two or more costumes.
- the tag data GD may include type data indicating the type of costume instead of the stacking order index.
- the storage device 12 provided in the terminal device 10 stores a correspondence table that describes the correspondence between types of clothing such as T-shirts, long-sleeved shirts, and sweaters, and stacking order indices.
- the stacking order specifying unit 115 may identify the stacking order of the costumes by comparing the types of costumes specified in the costume data CD obtained from the server 30 with the correspondence table.
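A minimal sketch of such a correspondence table and lookup follows; the concrete costume types and index values are hypothetical, since the patent does not specify them.

```python
# Hypothetical correspondence table between costume types and stacking-order
# indices, in the spirit of the table stored in the storage device 12.
STACKING_INDEX_BY_TYPE = {
    "t-shirt": 1,             # worn closest to the body
    "long-sleeved shirt": 2,
    "sweater": 3,
    "coat": 4,                # worn outermost
}

def specify_stacking_order(costume_types):
    """Return the selected costume types sorted innermost-first,
    as the stacking order specifying unit 115 might do."""
    return sorted(costume_types, key=lambda t: STACKING_INDEX_BY_TYPE[t])

order = specify_stacking_order(["sweater", "t-shirt"])
# the t-shirt (index 1) is worn closer to the body than the sweater (index 3)
```

Storing only the type in tag data GD and resolving the index through this table keeps the scanned costume data independent of how a particular terminal chooses to order layers.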
- the user of the scanning device 20 inputs a stacking order index indicating the stacking order of the costumes from the input device 26.
- the scanning device 20 may generate the stacking order index of a costume using a trained model. More specifically, the generation unit 212 included in the scanning device 20 may input image data PD, which represents a three-dimensional image of the costume C generated by the scanning device 20 itself, into the trained model, and cause the trained model to output the stacking order index of the costume C.
- the trained model is generated by machine learning using teacher data that includes a plurality of sets, each pairing image data PD representing a three-dimensional image of a costume with the stacking order index of that costume.
- similarly, the scanning device 20 may generate the type of a costume using a trained model. More specifically, the generation unit 212 included in the scanning device 20 may input image data PD, which represents a three-dimensional image of the costume C generated by the scanning device 20 itself, into the trained model, and cause the trained model to output the type of the costume C.
- in this case, the trained model is generated by machine learning using teacher data that includes a plurality of sets, each pairing image data PD representing a three-dimensional image of a costume with the type of that costume.
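The trained model is described only abstractly, so as a stand-in the sketch below uses a 1-nearest-neighbour "model" over hand-picked feature vectors. A production system would more plausibly train a neural network directly on renderings of the image data PD; the features, labels, and function names here are all illustrative assumptions.

```python
def train_model(samples):
    """Toy stand-in for generating the trained model: memorize the teacher
    data as (features, label) pairs instead of fitting real parameters."""
    return list(samples)

def predict(model, features):
    """1-nearest-neighbour inference: return the label of the closest sample."""
    def sq_dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(model, key=lambda sample: sq_dist(sample[0], features))[1]

# hypothetical features (e.g. sleeve length, fabric thickness) -> stacking index
model = train_model([((0.2, 0.1), 1),   # t-shirt-like sample, innermost
                     ((0.9, 0.8), 3)])  # sweater-like sample, outer layer
index = predict(model, (0.85, 0.7))     # nearest to the sweater-like sample
```

The same scheme works unchanged when the labels are costume types rather than stacking order indices, which mirrors the two variants described above.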
- in the embodiment described above, the terminal device 10 outputs selection information indicating the selection result of the two or more costumes received by the reception unit 113 to the server 30, and acquires the corresponding data RD corresponding to the selection information from the server 30. However, the terminal device 10 may instead acquire the corresponding data RD for all costumes from the server 30 unconditionally, without outputting the selection information to the server 30.
- in the embodiment described above, the terminal device 10, the scanning device 20, and the server 30 are separate devices. However, two or more of the terminal device 10, the scanning device 20, and the server 30 may be housed in the same housing. That is, two or more of the devices shown in the overall configuration of FIG. 1 may be realized as a single device.
- the information, signals, etc. described may be represented using any of a variety of different technologies.
- data, instructions, commands, information, signals, bits, symbols, chips, and the like, which may be referred to throughout the above description, may be represented by voltages, currents, electromagnetic waves, magnetic fields or magnetic particles, light fields or photons, or any combination thereof.
- the determination may be made using a value expressed using 1 bit (0 or 1) or a truth value (Boolean: true or false).
- the comparison may be performed by comparing numerical values (for example, comparing with a predetermined value).
- each of the functions illustrated in FIGS. 1 to 15 is realized by an arbitrary combination of at least one of hardware and software.
- the method for realizing each functional block is not particularly limited. That is, each functional block may be realized using one physically or logically coupled device, or may be realized using a plurality of devices formed by connecting two or more physically or logically separated devices directly or indirectly (for example, by wire or wirelessly).
- the functional block may be realized by combining software with the one device or the plurality of devices.
- the programs exemplified in the above-described embodiments, whether referred to as software, firmware, middleware, microcode, hardware description language, or by any other name, should be construed broadly to mean instructions, instruction sets, code, code segments, program code, programs, subprograms, software modules, applications, software applications, software packages, routines, subroutines, objects, executables, threads of execution, procedures, functions, and the like.
- software, instructions, information, etc. may be sent and received via a transmission medium.
- for example, when software is transmitted from a website, server, or other remote source using wired technology (coaxial cable, fiber optic cable, twisted pair, digital subscriber line (DSL), etc.) and/or wireless technology (infrared, microwave, etc.), these wired and/or wireless technologies are included within the definition of a transmission medium.
- the information, parameters, and the like described in this disclosure may be expressed using absolute values, using relative values from a predetermined value, or using other corresponding information.
- the terminal device 10, the scanning device 20, and the server 30 may be mobile stations (MS).
- a mobile station may also be referred to by a person skilled in the art as a subscriber station, mobile unit, subscriber unit, wireless unit, remote unit, mobile device, wireless device, wireless communication device, remote device, mobile subscriber station, access terminal, mobile terminal, wireless terminal, remote terminal, handset, user agent, mobile client, client, or some other suitable term. Further, in the present disclosure, terms such as "mobile station," "user terminal," "user equipment (UE)," and "terminal" may be used interchangeably.
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Computer Graphics (AREA)
- Computer Hardware Design (AREA)
- General Engineering & Computer Science (AREA)
- Software Systems (AREA)
- Image Processing (AREA)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2024520319A JPWO2023218861A1 | 2022-05-12 | 2023-04-14 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2022-078989 | 2022-05-12 | ||
JP2022078989 | 2022-05-12 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2023218861A1 true WO2023218861A1 (ja) | 2023-11-16 |
Family
ID=88730182
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2023/015234 WO2023218861A1 (ja) | Simulation device | 2022-05-12 | 2023-04-14 |
Country Status (2)
Country | Link |
---|---|
JP (1) | JPWO2023218861A1 |
WO (1) | WO2023218861A1 |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2002117414A (ja) * | 2000-10-11 | 2002-04-19 | Toyobo Co Ltd | Clothing collision processing method and computer-readable recording medium storing a clothing collision processing program |
JP2013190974A (ja) * | 2012-03-13 | 2013-09-26 | Satoru Ichimura | Information processing device, information processing method, and program |
JP2016110652A (ja) * | 2014-12-05 | 2016-06-20 | Dassault Systemes | Computer-implemented method for designing an avatar wearing at least one garment |
JP2016532197A (ja) * | 2013-08-04 | 2016-10-13 | EyesMatch Ltd. | Devices, systems, and methods for virtualization of a mirror |
JP2020119156A (ja) * | 2019-01-22 | 2020-08-06 | NEC Corporation | Avatar generation system, avatar generation device, server device, avatar generation method, and program |
JP2020170394A (ja) * | 2019-04-04 | 2020-10-15 | Sapeet Co., Ltd. | Clothing wearing visualization system and clothing wearing visualization method |
2023
- 2023-04-14 JP JP2024520319A patent/JPWO2023218861A1/ja active Pending
- 2023-04-14 WO PCT/JP2023/015234 patent/WO2023218861A1/ja active Application Filing
Also Published As
Publication number | Publication date |
---|---|
JPWO2023218861A1 | 2023-11-16 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 23803343; Country of ref document: EP; Kind code of ref document: A1 |
| WWE | Wipo information: entry into national phase | Ref document number: 2024520319; Country of ref document: JP |
| NENP | Non-entry into the national phase | Ref country code: DE |
| 122 | Ep: pct application non-entry in european phase | Ref document number: 23803343; Country of ref document: EP; Kind code of ref document: A1 |