US20170103579A1 - Interactive textile article and augmented reality system - Google Patents
- Publication number
- US20170103579A1 (application Ser. No. 15/385,402)
- Authority
- United States (US)
- Prior art keywords
- design object
- computing device
- design
- augmented
- depiction
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
- G06T19/006—Mixed reality
- G06T1/00—General purpose image data processing
- G06T1/0007—Image acquisition
- G06T2210/00—Indexing scheme for image generation or computer graphics
- G06T2210/16—Cloth
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/20—Input arrangements for video game devices
- A63F13/21—Input arrangements for video game devices characterised by their sensors, purposes or types
- A63F13/213—Input arrangements for video game devices characterised by their sensors, purposes or types comprising photodetecting means, e.g. cameras, photodiodes or infrared cells
- A63F13/30—Interconnection arrangements between game servers and game devices; Interconnection arrangements between game devices; Interconnection arrangements between game servers
- A63F13/40—Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
- A63F13/42—Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
- A63F13/60—Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor
- A63F13/65—Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor automatically by game devices or servers from real world data, e.g. measurement in live racing competition
Definitions
- the present disclosure relates to interactive textile articles and an augmented reality system, and related methods.
- Augmented reality is a direct or indirect view of a physical, real-world object that is augmented (or supplemented) by computer-generated elements or sensory inputs such as sound, video, graphics or GPS data.
- AR technology functions by enhancing one's current perception of reality.
- AR technology can be implemented in the physical world whereby computer generated elements are projected onto physical objects.
- AR technology can be implemented through computing devices whereby computer generated elements are overlaid onto images of physical objects captured by the computing device.
- Advances in computing, such as user interface technologies, data processing, and object recognition, have created new opportunities for implementing augmented reality technologies on mobile computing devices, such as smartphones and tablets.
- Textile articles such as garments, bedding, curtains, etc. define much of our sensory experience (e.g. sound, sight, touch, smell).
- wearable technologies include sensors embedded into the fabric structure of a garment. The sensors measure physiological parameters, and the signal data can be transmitted to a linked computing device.
- Textile articles, however, as ubiquitous as they are, have not yet converged in a meaningful way with the nearly ubiquitous digital technologies brought about by modern mobile devices and advanced communication networks.
- an embodiment of the present disclosure is a system that includes an interactive textile article including a textile material having a face and a back that is opposed to the face. At least one of the face and the back includes a design object.
- the design object includes one or more design object identifiers.
- the design object is associated with an augmented reality software program configured to include content related to the design object.
- the system also includes a computing device that includes a memory containing the augmented reality program.
- the augmented reality program includes a plurality of augmented content levels.
- the computing device further includes a processor configured to execute the augmented reality program that compiles a design object depiction that is based on one or more design object identifiers in the design object.
- the augmented reality program can also execute a first augmented content level of the plurality of augmented content levels.
- the first augmented content level is configured to display 1) the design object depiction, 2) one or more augmentation elements related to the design object, and 3) at least one input element configured to control the design object depiction and the one or more augmentation elements.
- the textile article is a blanket, bed linen, a comforter, a rug, a carpet, a tapestry, a set of curtains, any other home textile product, or a garment.
- Another embodiment of the disclosure is a method for displaying augmented reality on a computing device based on an interactive textile article.
- the method includes scanning a portion of an interactive textile article that includes a design object so as to identify one or more design object identifiers of the design object.
- the method further includes the step of compiling a design object depiction based on the one or more design object identifiers contained in the design object.
- the method also includes executing a first augmented content level of the plurality of augmented content levels so as to display 1) the design object depiction, 2) one or more augmentation elements that are related to the design object, and 3) at least one input element that controls at least one of the design object depiction and the one or more augmentation elements.
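The three method steps above (scan, compile, execute a content level) can be sketched in code. This is an illustrative data model only; every name and data format here is a hypothetical assumption, not taken from the disclosure.

```python
from dataclasses import dataclass

# Illustrative sketch of the claimed method: scan the textile to find
# design object identifiers, compile a depiction from them, then execute
# a first content level bundling the depiction with augmentation and
# input elements. All names and formats are assumptions.

@dataclass
class DesignObjectDepiction:
    name: str
    identifiers: list

@dataclass
class AugmentedContentLevel:
    depiction: DesignObjectDepiction
    augmentation_elements: list
    input_elements: list

def scan_textile(image_regions):
    """Step 1: pretend-scan; keep only regions tagged as identifiers."""
    return [r for r in image_regions if r.startswith("edge:")]

def compile_depiction(name, identifiers):
    """Step 2: build a virtual representation from the identifiers."""
    return DesignObjectDepiction(name=name, identifiers=identifiers)

def execute_first_level(depiction):
    """Step 3: assemble the first augmented content level for display."""
    return AugmentedContentLevel(
        depiction=depiction,
        augmentation_elements=["street", "yard", "tree"],
        input_elements=["tap", "drag"],
    )

ids = scan_textile(["edge:roofline", "edge:door-arch", "texture:weave"])
level = execute_first_level(compile_depiction("house", ids))
print(level.depiction.name, len(level.augmentation_elements))  # house 3
```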
- Another embodiment of the disclosure is a method of making an augmented reality system.
- the method includes manufacturing a textile material having a face and a back that is opposed to the face such that at least one of the face and the back includes a design object including one or more design object identifiers.
- the method also includes forming the textile material into a textile article.
- the method can include packaging the textile article with one or more access codes.
- input of the access codes into a user interface of a portal running on a computing device causes the augmented reality software application to be stored on a memory of the computing device.
- the augmented reality software application is configured to compile a design object depiction based on the one or more design object identifiers included in the design object.
- the augmented reality software application is further configured to execute a first augmented content level of a plurality of augmented content levels, so as to display 1) the design object depiction, 2) one or more augmentation elements that are related to the design object, and 3) at least one input element that controls at least one of the design object depiction and the one or more augmentation elements.
- FIG. 1 is a schematic illustrating a computing device and an interactive textile article that can implement an augmented reality application, according to an embodiment of the present disclosure.
- FIG. 2 is a diagram illustrating multiple, networked computing devices, according to an embodiment of the present disclosure.
- FIG. 3 illustrates a client computing device as shown in FIGS. 1 and 2 .
- FIG. 4 illustrates a server computing device shown in FIG. 2 .
- FIG. 5 is a schematic diagram illustrating multiple augmented reality programs contained in a memory of a computing device.
- FIG. 6 is a schematic diagram illustrating an exemplary augmented reality program.
- FIG. 7 is an enlarged schematic view of an interactive textile.
- FIGS. 8A-8C are detailed views of design objects embodied in the textile article shown in FIG. 7 .
- FIG. 9 is a sectional view of the textile article taken along line 9 - 9 in FIG. 7 .
- FIG. 10 is a sectional view of a portion of a woven textile including printed design objects, according to an embodiment of the present disclosure.
- FIG. 11 is a sectional view of a portion of a woven textile with integrally formed design objects, according to an embodiment of the present disclosure.
- FIGS. 12A-12C illustrate alternative embodiments of pile fabrics with integrally formed design objects.
- FIG. 13 is a schematic plan view of a portion of a knitted textile, with integrally formed design objects, according to an embodiment of the present disclosure.
- FIGS. 14A-14F are schematic illustrations of fabric design elements.
- FIGS. 15A-15C illustrate alternative design objects applied to a textile article.
- FIGS. 16A-16C are process flow diagrams illustrating a method for implementing an augmented reality application based on design elements with an interactive textile, according to an embodiment of the present disclosure.
- An embodiment of the present disclosure is a system 1 configured to implement an augmented reality environment on the computing device 20 based on content that is related to one or more designs on the textile article 100 .
- the system 1 can include an interactive textile article 100 and computing device 20 .
- the textile article 100 includes a textile material 110 that includes one or more design objects 120 .
- the computing device 20 can include a scanning device that can scan the textile material and can convert the scanned data into a digital data set.
- An augmented reality software program can cause the display of a) one or more design object depictions 60 and 62 that correspond to design objects 120 contained in the scanned data set, b) augmentation elements 80 that can manipulate the displayed environment, and c) input elements 70 that control design object depictions 60 and 62 and augmentation elements 80 .
- a user can interact with the displayed augmented reality via the input elements 70 through a series of content levels, or levels, that include content related to the particular design object.
- the augmented reality and interactive textile system 1 is configured to allow a user to interact with different components of the textile article 100 through a digital medium.
- an embodiment of the present disclosure is a system 1 including at least one server computing device 10 , a plurality of computing devices 20 a, 20 b , 20 c , . . . 20 n, in electronic communication with the server computing device 10 , and one or more software applications 30 s and 30 c (see FIGS. 3 and 4 ) implemented across computing devices 10 and 20 a, 20 b, 20 c . . . 20 n.
- Each computing device 20 a, 20 b, 20 c , . . . 20 n may be associated with a different person or user.
- one or more up to all of the computing devices 20 a - 20 n can be associated via a social network.
- reference number 20 is used interchangeably with reference numbers 20 a, 20 b, 20 c . . . , 20 n, unless noted otherwise.
- the present disclosure describes software applications implemented over system components and configured to execute various steps in the methods described below. It should be appreciated that a software application can implement steps in the described methods utilizing all of the system components or just portions of the system components. Furthermore, the software applications are described below in singular form. It should be appreciated that multiple software applications may interface to perform the described functions and multiple applications can run on more than one computing device to implement the methodologies described herein.
- the system 1 can be implemented via exemplary architecture that includes computing devices 10 , 20 a, 20 b, 20 c . . . , 20 n in electronic communication with each other via a common communications network, such as, for example, the Internet.
- the computing devices 20 a, 20 b, 20 c . . . 20 n and server computing device 10 are arranged in a client-server architecture.
- the server computing device 10 can receive and transmit data to other computing devices 20 via the communications network.
- one up to all the computing devices 20 can receive information from the other computing devices 20 .
- one up to all of the computing devices 20 can transmit information to the other computing devices 20 .
- one or all of the computing devices 10 , 20 can access information on the other computing devices 10 , 20 .
- “Access” or “accessing” as used herein can include retrieving information stored in memory on a computing device.
- “access” or “accessing” includes sending instructions via the network from server computing device 10 to computing device 20 a so as to cause information to be transmitted to the memory of the computing device 20 a for access locally by the computing device 20 a.
- “access” or “accessing” can include the server computing device 10 sending an instruction to computing device 20 a to access information stored in the memory of the computing device 20 a.
- Reference to server computing device 10 and computing device 20 a in this paragraph is exemplary and is used only to clarify the use of the words "access" or "accessing."
- FIG. 2 illustrates a client-server network.
- the software application can be implemented over any number of network configurations.
- the computing devices 20 a, 20 b, 20 c . . . 20 n can be configured in a peer-to-peer network architecture.
- the computing devices 20 a, 20 b, 20 c . . . 20 n can be arranged in a ring-type network architecture.
- the software application can be implemented across computing devices arranged on a network that includes aspects of a client-server network, peer-to-peer network, ring-type network, and/or other network architectures known to a person of ordinary skill in the art. Accordingly, it should be appreciated that numerous suitable alternative communication architectures are envisioned for implementing the software application 30 on a user's computing device.
- the computing device 20 is configured to receive, process, and store various information used to implement one or more software applications, such as client software application 30 c.
- the hardware components of computing device 20 can include any appropriate device, examples of which include a portable computing device, such as a laptop, tablet or smart phone, or other computing devices, such as, a desktop computing device or a server-computing device.
- the computing device 20 includes one or more processors 22 , a memory 24 , an input/output 26 , and a user interface (UI) 28 .
- the processor 22 , memory 24 , input/output portion 26 and user interface 28 can be coupled together to allow communications therebetween, and can interface with the client software application 30 c.
- the client software application 30 c may include an application programmatic interface (API).
- any of the above components may be distributed across one or more separate devices.
- the computing device 20 can include a scanning device, such as a camera, that captures an image of the design object 120 .
- the camera may include a charge-coupled device (CCD) or a contact image sensor (CIS) as the image sensor.
- the memory 24 can be volatile (such as some types of RAM), non-volatile (such as ROM, flash memory, etc.), or a combination thereof, depending upon the exact configuration and type of processor 22 .
- the computing device 20 can include additional storage (e.g., removable storage and/or non-removable storage) including, but not limited to, tape, flash memory, smart cards, CD-ROM, digital versatile disks (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic storage or other magnetic storage devices, universal serial bus (USB) compatible memory, or any other medium which can be used to store information and which can be accessed by the computing device 20 .
- the input/output portion 26 includes an antenna or an electronic connector for wired connection, or a combination thereof.
- input/output portion 26 can include a receiver and transmitter, transceiver or transmitter-receiver.
- the input/output portion 26 is capable of receiving and/or providing information pertaining to communication with a network such as, for example, the Internet.
- transmit and receive functionality may also be provided by one or more devices external to the computing device 20 .
- the input/output portion 26 can be in electronic communication with a receiver.
- the user interface 28 can include an input device and/or display (input device and display not shown) that allows a user to communicate with the computing device 20 .
- the user interface 28 can include inputs that provide the ability to control the computing device 20 , via, for example, buttons, soft keys, a mouse, voice actuated controls, a touch screen, movement of the computing device 20 , visual cues (e.g., moving a hand in front of a camera on the computing device 20 ), or the like.
- the user interface 28 can provide outputs, including visual displays. Other outputs can include audio information (e.g. via a speaker), mechanical feedback (e.g. via a vibrating mechanism), or a combination thereof.
- the user interface 28 can include a display, a touch screen, a keyboard, a mouse, an accelerometer, a motion detector, a speaker, a microphone, a camera, or any combination thereof.
- the user interface 28 can further include any suitable device for inputting biometric information, such as, for example, fingerprint information, retinal information, voice information, and/or facial characteristic information, for instance, so as to require specific biometric information for access to the computing device 20 .
- the computing devices can operate via any suitable operating system, such as Android, BSD, iOS, Linux, OS X, QNX, Microsoft Windows, Windows Phone, and IBM z/OS.
- the software application can operate with any of the aforementioned operating systems.
- FIG. 4 is an operation diagram of the server computing device 10 .
- the server computing device 10 includes one or more processors 12 , a memory 14 , an input/output 16 , a user interface (UI) 18 , and one or more software applications, such as server software application 30 s.
- the server software application 30 s may also include an application programmatic interface (API).
- the processor 12 , memory 14 , input/output portion 16 and interface 18 can be coupled together to allow communications therebetween. As should be appreciated, any of the above components may be distributed across one or more separate server computing devices.
- the server computing device processor 12 , memory 14 , input/output 16 , and interface 18 are similar to the processor 22 , memory 24 , input/output 26 , and interface 28 described above with respect to computing device 20 .
- the server computing device can operate via any suitable operating system, such as Android, BSD, iOS, Linux, OS X, QNX, Microsoft Windows, Windows Phone, and IBM z/OS. It is emphasized that the operation diagram depiction of the server computing device 10 is exemplary and not intended to imply a specific implementation and/or configuration.
- the software application 30 can comprise the client application 30 c and the server application 30 s. Accordingly, certain functions can be implemented on the server computing device 10 and other functions can be implemented on the client computing devices 20 .
- Software application 30 , client application 30 c, server application 30 s may be used interchangeably herein.
- the software application 30 can include one or more augmented reality programs 40 contained in the memory 14 of the server computing device 10 (or client computing device 20 as needed).
- memory 14 includes three augmented reality programs 40 a, 40 b, and 40 c. It should be appreciated that more than the three augmented reality programs 40 can be stored on the server computing device 10 and/or optionally on the client computing device 20 .
- Reference sign 40 will be used interchangeably with reference signs 40 a, 40 b , and 40 c unless noted otherwise.
- portions of an augmented reality program 40 may also be stored in memory 24 of a client computing device 20 .
- all of the augmented reality programs 40 can be stored on the server computing device 10 and transmitted to the client computing device 20 .
- the memory 14 of computing device 10 may include a database (not shown) that contains the association between one or more augmented reality programs 40 and one or more different design objects embodied on textile article 100 .
- each augmented reality program 40 includes an image recognition module 42 , a design object compiler 44 , an interface module 46 , and a plurality of augmented content levels 50 , 52 , and 54 .
- the image recognition module 42 is configured to process data captured via the scanning device, e.g. the camera, on the client computing device 20 and identify the design object.
- the image recognition module 42 can identify one or more design object identifiers in the captured scanned data of the design object. In some instances, spatial relationships are determined among the located design object identifiers. Based on the design object identifiers, the image recognition module 42 can compile a data set that is indicative of the design object contained in the image.
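The recognition step described above (locate identifiers, determine their spatial relationships, emit a compiled data set) can be illustrated with a toy sketch. The input format (identifier name to pixel position) and the use of pairwise distances as the spatial relationship are assumptions; a production module would use real feature detection on the camera image.

```python
import math

# Toy sketch of the image recognition module: given identifier positions
# located in a scanned image, record pairwise spatial relationships and
# emit a compiled data set describing the design object.

def compile_data_set(identifiers):
    """identifiers: dict mapping identifier name -> (x, y) position."""
    names = sorted(identifiers)
    relationships = {}
    for i, a in enumerate(names):
        for b in names[i + 1:]:
            ax, ay = identifiers[a]
            bx, by = identifiers[b]
            # Distance between identifiers as a simple spatial relation.
            relationships[(a, b)] = round(math.hypot(bx - ax, by - ay), 1)
    return {"identifiers": names, "relationships": relationships}

data = compile_data_set({
    "roof_apex": (50, 10),
    "door_corner": (50, 90),
    "window_edge": (20, 50),
})
print(data["relationships"][("door_corner", "roof_apex")])  # 80.0
```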
- the design object compiler 44 can compile one or more design object depictions.
- a design object depiction is a virtual representation of the design object 120 embodied in the textile article 100 .
- the design object compiler 44 can create the design object depiction based on the compiled data set and the geometric relationships between various design object identifiers. In other embodiments, the design object compiler 44 can also access design object depictions that may be stored on the computer memory 14 of the server device 10 . In still other embodiments, the design object compiler 44 can compile design object depictions based on the compiled data set and a database of design models related to the design object. In such an embodiment, the design object compiler 44 , when executed by the processor, determines which design models are associated with the compiled data parameters. Upon identification of the associated design models, the design object compiler 44 can build a design object depiction from the design model and other parameters obtained in the compiled data set, such as geometric data.
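The design-model variant of the compiler can be sketched as a lookup: match the compiled data set against a database of design models, then build the depiction from the matched model plus geometric data. The model names and the matching rule (identifier-set overlap) are illustrative assumptions, not the disclosed implementation.

```python
# Hypothetical design model database: model name -> expected identifiers.
DESIGN_MODELS = {
    "house": {"roof_apex", "door_corner", "window_edge"},
    "tree": {"trunk_base", "canopy_edge"},
}

def match_design_model(compiled):
    """Pick the model whose identifier set best overlaps the data set."""
    found = set(compiled["identifiers"])
    best = max(DESIGN_MODELS, key=lambda m: len(DESIGN_MODELS[m] & found))
    return best if DESIGN_MODELS[best] & found else None

def build_depiction(compiled):
    """Build a depiction from the matched model and geometric data."""
    model = match_design_model(compiled)
    return {"model": model, "geometry": compiled.get("relationships", {})}

depiction = build_depiction({"identifiers": ["roof_apex", "door_corner"]})
print(depiction["model"])  # house
```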
- the interface module 46 is configured to interface the design object depiction with the augmentation levels as well as other data related to the visual scene in the captured scanned data. In some instances, the interface module 46 can also integrate other portions of the image with the design object depiction and the augmentation levels.
- the augmented content levels 50 , 52 , and 54 implement the design object depiction into an interactive augmented reality display on screen 90 ( FIG. 1 ) of the computing device 20 .
- each augmented content level 50 , 52 , and 54 is configured to display on the screen 90 1) a respective design object depiction 60 (or 62 ) ( FIG. 1 ), 2) one or more augmentation elements 80 that are related to the design object 120 , and 3) at least one input element 70 configured to control the design object depiction 60 and the one or more augmentation elements 80 .
- the augmented reality program includes content related to the design object 120 .
- the content can be thematically associated with the design object 120 on the textile article 100 obtained in the image.
- one exemplary design object is a house 120 a.
- the augmentation elements 80 can partially define the content related to the design object depiction 60 , such as the house 120 a design embodied in the textile material 110 .
- augmentation elements 80 can include an arrangement of the house 120 a in a virtual neighborhood that includes a street, a yard, neighboring houses, a tree, etc.
- the augmentation elements 80 can also include internal structures of the house, such as, a living room, kitchen, dining room, and a bedroom.
- the augmentation elements 80 are configured to dynamically respond to inputs received via the input elements, or to alter the environment based on predefined rules. For example, in the example shown in FIG. 1 , the augmentation elements 80 may be configured to modify the displayed environment based on a perceived time of day, e.g. the sun is out during the day and sets at night, such that lighting and shadows are shown in the display based on the perceived time of day.
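A predefined rule of the kind described, where the scene's lighting follows a perceived time of day, might look like the following sketch. The daytime window (6:00 to 18:00) and the shadow formula are illustrative assumptions.

```python
# Sketch of a predefined augmentation rule: derive scene lighting from a
# perceived time of day, so the sun is out during the day and the scene
# darkens at night. Thresholds and formula are assumptions.

def lighting_for_hour(hour):
    """Return lighting parameters for a given hour (0-23)."""
    if 6 <= hour < 18:
        # Sun elevation peaks at noon; shadows shorten toward midday.
        elevation = 1.0 - abs(hour - 12) / 6.0
        return {"sun": True, "shadow_length": round(1.0 - elevation, 2)}
    return {"sun": False, "shadow_length": 0.0}  # night: no sun shadows

print(lighting_for_hour(12))  # {'sun': True, 'shadow_length': 0.0}
print(lighting_for_hour(22))  # {'sun': False, 'shadow_length': 0.0}
```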
- the input element 70 can cause a person 80 to enter the house, turn on the lights, and enter different rooms. Accordingly, the input elements 70 permit the user to control and/or interact with the design object depiction 60 and the augmentation elements 80 .
- the augmented content levels 50 , 52 , and 54 are configured to allow user progression through the augmented reality environment.
- the augmented content levels can also be referred to as a content level or stage. While a house 120 a is shown in FIG. 1 and described above as the design object, the design object 120 can be any particular design. The related content in the augmented reality environment can therefore vary along with the particular design object embodied on the textile article.
- the augmented reality program 40 includes a first augmented content level 50 , a second augmented content level 52 , and a third augmented content level 54 .
- Each augmented content level 50 , 52 , and 54 can include one or more design object depictions, such as design object depictions 60 and 62 as shown in FIG. 1 .
- Each augmented content level may include similar content among the plurality of content levels. For example, all content is related to a house.
- each augmented content level 50 , 52 , and 54 may include content that varies among the plurality of augmented content levels. For example, content can relate to the house 120 a in the first augmented content level 50 , a tree in the second augmented content level 52 , and the moon 120 d in the third augmented content level.
- the second augmented content level includes one or more design object depictions.
- the second augmentation level can include either a first design object depiction based on the design object contained in the initial image, or a second design object depiction based on a second design object in the textile article.
- the second augmentation level also includes one or more second augmentation elements that are related to the respective design object and at least one second input element that controls the design object depiction and the one or more second augmentation elements.
- the second augmentation elements can include content specific to the kitchen in the house, for instance, to allow the user to engage in virtual cooking.
- the second design object is the tree next to the house, and the first augmentation element relates to the structure of the house.
- the second augmentation elements include content specific to the tree, such as, for example, a climbable tree structure, the tree species, etc.
- the third augmented content level includes either the design object depiction contained in the initial image, the second design object depiction, or a third design object depiction based on a third design object of the textile article.
- the third augmentation level also includes one or more third augmentation elements that are related to the design object.
- the third augmentation level also includes at least one third input element that controls at least one of the design object depictions and the third augmentation elements.
- the third augmentation elements can include content specific to the neighborhood, such as streets. For instance, the third augmentation level can allow a user to engage in a bike race through the neighborhood.
- the user interface displays a content change element 92 ( FIG. 1 ). Selection of the content change element 92 can initiate the second augmented content level 52 . As the user progresses through and completes the second augmented content level 52 , the user interface displays another content change element. Selection of the content change element 92 can initiate the third augmented content level 54 . At each step where a subsequent augmented content level is initiated, the application can cause the display of a payment portal to access the next augmented content level. Although three augmented reality content levels are illustrated, the augmented reality program 40 can have two or more than three augmented content levels.
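The progression logic above (complete a level, surface a content change element, optionally pass a payment portal, then initiate the next level) can be modeled as a small state machine. The class and the boolean payment placeholder are hypothetical; the disclosure does not specify an implementation.

```python
# Minimal sketch of content-level progression: completing a level
# exposes a content change element, and selecting it (gated by an
# assumed payment check) initiates the next augmented content level.

class LevelProgression:
    def __init__(self, levels):
        self.levels = levels
        self.index = 0
        self.completed = False

    @property
    def current(self):
        return self.levels[self.index]

    def complete_level(self):
        # User finished this level; the content change element appears.
        self.completed = True

    def select_content_change(self, paid=True):
        """Advance only after completion and (assumed) payment."""
        if self.completed and paid and self.index + 1 < len(self.levels):
            self.index += 1
            self.completed = False
        return self.current

prog = LevelProgression(["first", "second", "third"])
prog.complete_level()
print(prog.select_content_change())  # second
```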
- the interactive textile article 100 is illustrated.
- the textile article 100 is a blanket 102 including a border 104 that defines a perimeter of the textile article 100 .
- the textile article 100 can be any other article that includes a generally planar textile material that forms a substantial component of the article.
- a textile article 100 as used herein includes, but is not limited to, a blanket, bed linen, a comforter, a rug, a carpet, a tapestry, a set of curtains, or a garment.
- the textile article 100 can be packaged with one or more access codes.
- the access codes can be tags, alphanumeric codes, QR codes, or hyperlinks included on the textile material itself or on tags packaged with the textile article 100 .
- Input of the access codes into an interface on the computing device 20 causes augmented reality software application 30 to be stored on a memory of the computing device 20 .
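The access-code flow can be sketched as a validation and lookup step: a code packaged with the article is entered into the portal, normalized, and mapped to the augmented reality program associated with that article's design objects. The code format and the lookup table are assumptions for illustration.

```python
# Hypothetical mapping from packaged access codes to AR programs.
ACCESS_CODES = {
    "BLKT-HOUSE-0001": "ar_program_house",
    "BLKT-SPACE-0002": "ar_program_space",
}

def redeem_access_code(code):
    """Return the AR program to install for a valid code, else None."""
    normalized = code.strip().upper()  # tolerate user typing variations
    return ACCESS_CODES.get(normalized)

print(redeem_access_code(" blkt-house-0001 "))  # ar_program_house
print(redeem_access_code("BAD-CODE"))  # None
```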
- the textile article 100 includes a textile material 110 and at least one (e.g. a plurality of) design objects 120 .
- the design objects include one or more design object identifiers 130 .
- the textile material 110 has a face 112 and a back 114 that is opposed to the face 112 .
- the design objects 120 are disposed on the face 112 of the textile material 110 . In some instances, the design object may be disposed on the face 112 and back 114 of the textile material.
- the textile material is further described below. As illustrated, the textile material 110 includes multiple design objects, such as a house 120 a, a tree 120 b, clouds 120 c, moon 120 d, and stars 120 e.
- the design objects 120 can be any design, shape or arrangement of elements.
- the design object may include, but is not limited to, a person, a building, an automobile ( FIG. 15A ), an airplane ( FIG. 15B ), a boat, a castle ( FIG. 15C ), sports gear (e.g. a football, soccer ball, basketball, golf club and golf ball, baseball and bat), an animal, a geographic location, a landscape, a forest, a tree, a mountain, a river, a lake, an ocean, a sun, a star, a moon, a planet, or a galaxy.
- the design object may also be a character from an audio-visual work, a literary work, or a theatrical work, or any type of multi-media animation.
- the design object can include one or more characters within the portfolio of Disney branded characters, such as characters from the Frozen franchise, or other Disney characters.
- the design object can include one or more characters, such as Spider-man, Superman, Batman, X-Men, or other contemporary characters.
- each design object 120 includes one or more design object identifiers 130 .
- the design object identifier can be one or more portions of the design object 120 .
- the design object identifier 130 is one or more edge portions 131 of the design object 120 .
- the edge portion can be a linear edge, an angled linear edge, a curved edge, a circular edge, or an intersection of two edges.
- the edge portion can be defined by contrast between two colors in the design.
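To illustrate how an image recognition module might distinguish the edge-portion types listed above (linear, angled, curved), the following sketch classifies a traced edge by the deviation of its sampled points from a straight chord. This is a hypothetical heuristic offered for illustration only, not the disclosed implementation:

```python
import math

def classify_edge(points, tol=1e-6):
    """Classify a traced edge portion as 'linear', 'angled', or 'curved'.

    points: list of (x, y) tuples sampled in order along the detected edge.
    Strategy: measure the perpendicular distance of interior points from
    the chord between the endpoints. If all points lie on the chord the
    edge is linear; if the farthest point splits the edge into two linear
    halves it is an angled (corner) element; otherwise it is curved.
    """
    (x0, y0), (x1, y1) = points[0], points[-1]

    def dist(p):
        # Perpendicular distance of point p from the chord.
        px, py = p
        num = abs((x1 - x0) * (y0 - py) - (x0 - px) * (y1 - y0))
        return num / math.hypot(x1 - x0, y1 - y0)

    deviations = [dist(p) for p in points[1:-1]]
    if not deviations or max(deviations) < tol:
        return "linear"
    # Index of the point farthest from the chord (a candidate corner).
    k = max(range(len(deviations)), key=deviations.__getitem__) + 1
    left = classify_edge(points[:k + 1], tol)
    right = classify_edge(points[k:], tol)
    if left == "linear" and right == "linear":
        return "angled"
    return "curved"
```

In a real module the input points would come from an edge detector run on the captured image; here they are supplied directly.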
- FIGS. 8A-8C illustrate exemplary object identifiers for design objects illustrated as the house 120 a, a tree 120 b, clouds 120 c, and the moon 120 d.
- In FIG. 8A , the design object 120 a includes object identifiers configured as linear design elements 132 and angled elements 134 .
- the moon 120 d includes object identifiers configured as curved elements 136 .
- the tree 120 b includes object identifiers configured as monolithic shape elements 138 .
- the object identifiers are illustrated as components of the design object.
- the object identifier can be a code, such as an alphanumeric code or a QR code, embedded into or printed onto the textile material.
- Such an electronic code can be associated with a predefined design object stored in the server memory 14 .
- the image recognition module 42 can be used to identify the design object on the textile material.
- the object identifier can include an electronic object identifier in the form of a transmitter formed into the textile structure via a yarn or conductive printed materials. The electronic object identifier can transmit a signal concerning the design object to the computing device.
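The association between a scanned electronic code and a predefined design object stored in the server memory 14 could be sketched as a simple lookup table. The registry contents, keys, and function name below are illustrative assumptions only:

```python
# Hypothetical registry standing in for the server memory 14: each scanned
# identifier code maps to a predefined design-object record.
DESIGN_OBJECT_REGISTRY = {
    "QR-HOUSE-001": {"object": "house", "program": "night-sky"},
    "QR-MOON-004": {"object": "moon", "program": "night-sky"},
}

def resolve_identifier(code):
    """Return the design-object record for a scanned code, or None.

    Normalizes the scanned payload so printed and embedded codes
    resolve identically regardless of case or stray whitespace.
    """
    return DESIGN_OBJECT_REGISTRY.get(code.strip().upper())
```

A successful lookup would then select which augmented reality program content to load for the identified design object.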
- the textile material 110 can be a woven fabric, a knit fabric (weft or warp), a nonwoven fabric, and laminates thereof.
- the textile material can be a laminate that includes: a) one or more films and a woven fabric; b) one or more films and a knit fabric; or c) one or more films and a nonwoven fabric.
- the design object 120 can be integrally formed into the textile material, as illustrated in FIGS. 11-13 .
- the design object may be printed onto the textile material, as illustrated in FIGS. 9, 10, and 15A-15C .
- the textile material 110 can be a woven fabric 140 that defines a woven structure such that the design object 120 is integrally formed into the woven structure.
- the woven fabric 140 b includes a plurality of warp yarns 142 b interwoven with a plurality of weft yarns 144 b to define the design object 120 .
- the areas where the warp yarns 142 b are exposed to the face of the fabric can be arranged in such a way so as to define the design object 120 and object identifier 130 .
- woven fabrics can be based on dobby, jacquard, complex (multiple warp), or 3D fabric weaving systems, which can be used to create simple to complex patterns on the face of the woven fabric.
- FIG. 11 illustrates an embodiment of woven fabric 140 a utilizing a multiple warp system to create complex design objects.
- FIGS. 14A-14F are schematic diagrams illustrating how the warp yarns and weft yarns in the woven fabric design can define the object identifier 130 .
- Each diagram includes multiple unit cells 190 with a machine or warp direction 2 that is perpendicular to a cross or weft direction 4 .
- the shaded cells 192 represent a warp yarn exposed to the fabric face over a weft yarn.
- the woven fabric 140 can define a series of warp yarns exposed on the face in such a way as to define a linear edge design element 132 .
- An angled linear design element 134 is shown in FIG. 14A and a linear design element 132 aligned with the warp direction is shown in FIG. 14D .
- the fabric design can define two linear elements 132 a that intersect, as illustrated in FIG. 14B .
- the woven fabric design can define a circular design element 139 , as shown in FIG. 14C .
- the woven structure can define a curved element 136 .
- the woven structure can define a complete shape element 138 depending on the type of fabric formation system used.
- woven structures can appear pixelated at the location where the warp yarn travels back beneath the weft yarn (see e.g. intersection of unit cells 196 and 198 in FIGS. 14A-14F ).
- the image recognition module 42 described above is configured to account for such pixelation by compiling a design element line along the identified design element, similar to the line 191 shown in FIGS. 14A-14F .
- the design element line 191 can be used to define the edge of the design object, and therefore an object identifier as described above.
- the woven fabric can include distinct sets of weft yarns or in-laid yarns that are exposed to the face adjacent the warp facing yarn.
- the distinct in-laid yarns may be a different color, construction or fiber type that helps further define the specific design element and line path 191 .
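The compiling of a design element line 191 along a pixelated (stair-stepped) woven edge can be illustrated with a least-squares line fit through the centers of the shaded unit cells 192. This is a simplified sketch of the idea, not the actual algorithm of the image recognition module 42:

```python
def compile_design_element_line(shaded_cells):
    """Fit a straight 'design element line' through shaded unit-cell centers.

    shaded_cells: list of (row, col) grid positions of warp-exposed cells.
    Returns (slope, intercept) of a least-squares line fit, which smooths
    the stair-step (pixelated) appearance of the woven structure into a
    single edge usable as an object identifier.
    """
    n = len(shaded_cells)
    xs = [c + 0.5 for _, c in shaded_cells]  # cell-center x (weft direction)
    ys = [r + 0.5 for r, _ in shaded_cells]  # cell-center y (warp direction)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope = sxy / sxx
    intercept = my - slope * mx
    return slope, intercept
```

For the diagonal pattern suggested by FIG. 14A, the fitted line passes through the middle of the stair-steps rather than tracing each pixelated corner.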
- FIGS. 12A-12C illustrate a textile material configured as pile fabric that includes a pile structure and a design object integrally formed in the pile structure.
- specific piles can be formed from different colored yarns arranged so as to define a portion of the design object 120 .
- one or more pile floats can be formed into the pile fabric to define a design element, such as an edge of a design object 120 .
- a pile fabric 150 a includes looped piles 152 a and a ground or base fabric 154 a.
- the looped piles 152 a are selectively placed along a warp and weft direction 2 and 4 (weft direction 4 not shown) so as to define a portion of the design object 120 and an edge that defines a design object identifier.
- a pile fabric 150 b includes cut piles 152 b and a ground or base fabric 154 b.
- the cut piles 152 b are selectively placed along a warp and weft direction 2 and 4 (weft direction 4 not shown) so as to define a portion of the design object 120 .
- Floats can define an edge or design object identifier 130 .
- As shown in FIG. 12C , a pile fabric 150 c is configured as a tufted rug that includes cut piles 152 c and a ground or base fabric 154 c.
- One or more of the cut piles 152 c include colored yarns 156 c that are selectively placed along a warp and weft direction 2 and 4 (not shown) so as to define a portion of the design object 120 .
- the edge of colored yarns 156 c can define an edge or design object identifier 130 .
- FIGS. 14A-14F can also illustrate how various pile yarns in pile fabrics 150 a , 150 b, and 150 c can define the design object 120 and the one or more object identifiers 130 .
- each unit cell in design illustrations in FIGS. 14A-14F could represent a single pile.
- the shaded cells 192 can represent specific piles formed from different colored yarns arranged so as to define a portion of the design object. Unshaded cells can represent different yarns or pile floats.
- the pile fabric can include pile yarns of a first color or type arranged to define an angled linear design element 134 , as shown in FIG. 14A .
- the pile fabric design can arrange the first color pile yarns to define: two linear elements 132 a that intersect as illustrated in FIG. 14B ; a circular design element 139 shown in FIG. 14C ; a curved element 136 as shown in FIG. 14E ; or a complete shape element 138 as shown in FIG. 14F .
- the textile material can be a knit fabric 160 that includes a knit structure 161 and design objects 120 integrally formed into the knit structure 161 .
- FIG. 13 illustrates an exemplary schematic of a knit fabric 160 with a knit structure 161 having a plurality of knit stitches 162 that define the design object 120 .
- the arrangement of knit stitches 162 can further define the object identifier 130 .
- the knit fabric 160 may be a weft knit, such as a double knit, rib knit, or jacquard knit formed to define specific arrangements of knit stitches 162 .
- the knit stitches 162 may include tuck, held, float stitches, and/or inlaid yarns.
- FIGS. 14A-14F also illustrate how various yarn components in knit structure define the design object 120 and object identifier 130 .
- the shaded cells 192 can represent a specific stitch type, such as a tuck, held, or float stitch, or inlay.
- the shaded cells 192 represent a knit structural component that is used to define a portion of the design object 120 .
- specific yarn types can be inserted or inlaid adjacent to knit stitches represented by the shaded cells 192 in order to further define the edge or design element, as described above.
- FIGS. 14A-14F can also illustrate how various knit structures in a knit fabric 160 can define the design object 120 and the one or more object identifiers 130 .
- each unit cell in design illustrations in FIGS. 14A-14F could represent a knit stitch.
- the shaded cells 192 can represent specific types of knit stitches.
- the knit fabric 160 can define an arrangement of stitches to define: an angled linear design element 134 as shown in FIG. 14A ; a linear design element 132 aligned with the warp direction 2 as shown in FIG. 14D ; two linear elements 132 a that intersect as illustrated in FIG. 14B ; a circular design element 139 shown in FIG. 14C ; a curved element 136 as shown in FIG. 14E ; or a complete shape element 138 as shown in FIG. 14F .
- the knit fabric may alternatively be a warp knit, such as a tricot or raschel warp knitted fabric.
- the woven, knit and pile fabrics can be formed from a wide range of fiber types and yarn formation systems.
- the woven, knit and pile fabrics can be formed from any number of yarn types, such as spun yarns or continuous filament yarns.
- Spun yarns may include natural fibers, synthetic fibers, or blends of natural and synthetic staple fibers.
- Natural fibers include cotton, wool, or others.
- Synthetic fibers may include polyethylene terephthalate (PET), viscose rayon, acrylic, or other fiber types, such as flame resistant fibers, as needed.
- Suitable thermoplastic synthetic staple fibers may be mono-component, bi-component, or tri-component type fibers. Such fibers can be splittable to define microfibers.
- Spun yarns can therefore include: spun cotton yarns and/or spun cotton and polyethylene terephthalate (PET) blended yarns.
- Continuous filament yarns may include mono-component and/or bi-component filament types.
- Continuous filament yarns can be polyethylene terephthalate, polyolefin, polyamide 6, polyamide 6,6, and/or polylactic acid filaments.
- the textile material 110 may also be a nonwoven fabric having a design object 120 printed thereon. Any suitable printing technique can be used to define the design object.
- Suitable nonwoven fabrics include melt-spun nonwovens, such as spunbond and meltblown materials.
- a meltspun nonwoven can include a single spunbond layer, multiple spunbond layers, a single meltblown layer, multiple meltblown layers, or multiple layers of spunbond and meltblown materials.
- Meltspun nonwovens can be formed with polyethylene terephthalate, polyolefin, and/or polyamide 6, polyamide 6,6, or polylactic acid polymers.
- Nonwoven fabrics can be formed with mono-component, bi-component, or tri-component fibers. Such fibers can be splittable and further processed to define microfibers.
- An exemplary nonwoven fabric is Evolon®, manufactured by Freudenberg & Co.
- the nonwoven fabrics can be thermally bonded, chemically bonded, and/or mechanically bonded, e.g. via needling, stitch bonding, or hydraulic bonding.
- the nonwoven fabrics can be embossed to define one or more design objects.
- the design object 120 can be printed on any one of the aforementioned textile materials, including nonwoven materials. Printing can include digital printing, screen printing, or sublimation printing.
- the textile material can be a woven fabric 140 with the design object 120 printed on its face 112 (and/or back).
- the textile material can be a weft knitted fabric 160 with the design object 120 printed on its face 112 (and/or back).
- the textile material can be a warp knitted fabric with the design object 120 printed on its face 112 (and/or back).
- the textile material can be a nonwoven fabric with the design object 120 printed on its face 112 (and/or back).
- FIGS. 15A-15C illustrate alternative embodiments of a textile article 300 a, 300 b , and 300 c.
- Textile articles 300 a, 300 b, and 300 c include textile materials 310 a, 310 b, and 310 c , respectively.
- Textile articles 300 a, 300 b, and 300 c include design objects 320 a, 320 b, and 320 c printed onto the textile material.
- the design object 320 a can be an automobile, such as a race car. Content in the related augmented reality program is related to race cars.
- the design object 320 b can be a castle.
- Content in the related augmented reality program is related to a castle.
- the design object 320 c can be an airplane. Content in the related augmented reality program is related to an airplane.
- FIGS. 16A-16C illustrate a method 200 for implementing an augmented reality environment utilizing the interactive textile article 100 and computing device 20 .
- the method initiates in block 202 when the user accesses and stores the application 30 on the computing device 20 .
- the user can input an electronic access code packaged with the textile article 100 into an interface on the computing device 20 .
- the user can access a portal via the computing device 20 to download the application 30 .
- a scanning device on the computing device 20 scans a portion of an interactive textile article that includes a design object.
- scanning the textile includes capturing an image of the textile via a camera.
- the application 30 identifies one or more design object identifiers of the design object in the image.
- the application compiles a design object depiction based on the one or more design object identifiers contained in the design object. Process control is transferred to block 214 .
- the first augmented content level is initiated.
- Process control is transferred to block 220 .
- the first augmented content level displays on the screen 90 ( FIG. 1 ) 1) the design object depiction, 2) one or more augmentation elements that are related to the design object, and 3) at least one input element that controls one of the design object depiction and the one or more augmentation elements.
- the application determines if the input elements are engaged. If the input elements have not been engaged, process control is transferred to block 228 . In block 228 , the first augmentation level progresses passively. If the input elements have been engaged, process control is transferred to block 232 , where the first augmentation level progresses interactively. For instance, the user can control interaction of the design object depiction with the one or more augmentation elements.
- the application causes the display of portals or elements, the selection of which cause the display of additional augmentation levels, such as the second augmentation level.
- the application determines if the second augmentation level element has been selected by the user. If the second augmentation level element is not selected by the user, process control is transferred to block 232 . The user can progress through the first augmentation level interactively. If the second augmentation level element has been selected by the user, process control is transferred to block 242 .
- the application initiates the second augmented content level.
- Process control is transferred to block 244 .
- the second augmented content level displays 1) the first design object depiction and/or the second design object depiction), 2) one or more second augmentation elements that are related to the respective design object, and 3) at least one second input element that controls at least one of the design object depiction and the one or more second augmentation elements.
- Process control is transferred to block 248 .
- the application determines if the second input elements are engaged. If the second input elements have not been engaged, process control is transferred to block 252 . In block 252 , the second augmentation level progresses passively. If the second input elements have been engaged, process control is transferred to block 256 , where the second augmentation level progresses interactively. For instance, the user can control interaction of the first and/or second design object depiction with the second augmentation elements.
- the application causes the display of content change elements 92 ( FIG. 1 ). Selection of content change element 92 causes the display of additional augmentation levels, for example, such as the third augmentation level.
- the application determines if the third augmentation level element has been selected by the user. If the third augmentation level element is not selected by the user, process control is transferred to block 256 and the user can progress through the second augmentation level interactively. If the third augmentation level element has been selected by the user, process control is transferred to block 274 . In block 274 , the method progresses through steps similar to the steps illustrated in blocks 242 through 270 . For instance, additional augmentation levels, such as the third augmentation level, can be initiated. At each step (blocks 242 and 272 ) where a subsequent augmented content level is initiated, the application can cause the display of a payment portal to access the next augmented content level.
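The level flow of method 200 (passive versus interactive progression, content change elements, and an optional payment gate between levels) can be summarized as a small state machine. The class and method names below are assumptions for illustration, not part of the disclosure:

```python
class AugmentedRealitySession:
    """Minimal sketch of the level flow in FIGS. 16A-16C (names assumed)."""

    def __init__(self, levels=3):
        self.levels = levels
        self.level = 1          # the first augmented content level is active
        self.mode = "passive"   # a level progresses passively until engaged

    def engage_input(self):
        # Engaging an input element switches progression to interactive,
        # letting the user control the design object depiction.
        self.mode = "interactive"

    def select_next_level(self, payment_ok=True):
        # Selecting a content change element advances to the next augmented
        # content level, optionally gated by a payment portal.
        if not payment_ok or self.level >= self.levels:
            return False
        self.level += 1
        self.mode = "passive"
        return True
```

The same transitions repeat for each subsequent level, which is why blocks 242 through 270 mirror blocks 214 through 240.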
- each computing device 20 a, 20 b, 20 c , . . . , 20 n may be associated with a person or user.
- Each user can participate in a networked augmented reality environment by associating their computing device with a networking platform, such as a gaming or application platform (not shown). Once associated with the gaming platform, the user can enter and participate in one or more of the augmented reality content levels described above.
- a first user can be associated with a first computing device, which can include an augmented reality software program that executes the first augmented content level.
- a second user can be associated with a second computing device, which can include an augmented reality software program that is executing a first augmented content level that is interfaced to the first augmented content level of the first computing device.
- the first and second users can each control their respective design object depictions and augmentation elements within a common augmented reality environment.
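The networked embodiment, in which multiple users control their respective design object depictions within a common augmented reality environment, could be sketched as shared state keyed by user. This is an illustrative assumption, not the disclosed gaming-platform interface:

```python
class SharedAugmentedEnvironment:
    """Sketch of a common AR environment joined by networked devices."""

    def __init__(self):
        # Maps each user id to that user's design object depiction state.
        self.depictions = {}

    def join(self, user_id, design_object):
        # A user enters the environment with the depiction compiled from
        # the design object scanned on their own textile article.
        self.depictions[user_id] = {"object": design_object, "pos": (0, 0)}

    def move(self, user_id, dx, dy):
        # Input elements on a user's device control only that user's
        # depiction, while all depictions share one environment.
        x, y = self.depictions[user_id]["pos"]
        self.depictions[user_id]["pos"] = (x + dx, y + dy)
```

In practice the shared state would live on the server computing device 10 and be synchronized to each client over the communications network.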
- Another embodiment of the present disclosure is a method for making an augmented reality system including an interactive textile article.
- the method includes manufacturing a textile material to include a design object having one or more design object identifiers. Then, the textile material is formed into a textile article, such as by cutting, hemming, and adding binding as needed.
- the textile article is packaged with one or more electronic access codes as described above.
- the codes can be tags, alphanumeric codes, QR codes, or hyperlinks. Input of the codes into an interface on the computing device 20 causes the augmented reality software application 30 to be stored on a memory of the computing device 20 .
- the textile article can be manufactured according to the formation systems described above, such that the design object is printed on the textile material or integrally formed into the textile structure.
Abstract
A system and method for an interactive textile article including a textile material having a design object that includes one or more design object identifiers. The design object is associated with an augmented reality software program configured to include content related to the design object. A processor is configured to execute the augmented reality program so as to compile a design object depiction based on the design object identifiers contained in the design object. The processor executes a first augmented content level of a plurality of augmented content levels so as to display 1) the design object depiction, 2) augmentation elements that are related to the design object, and 3) an input element configured to control the design object depiction and the augmentation elements.
Description
- The present application is a continuation of U.S. application Ser. No. 14/856,250, filed Sep. 16, 2015, issued as U.S. Pat. No. 9,524,589, which claims the benefit under 35 U.S.C. §119 of Indian Patent Application No. 3334/MUM/2015, filed Aug. 31, 2015. The contents of each application listed in this paragraph are incorporated by reference into the present application.
- The present disclosure relates to interactive textile articles, an augmented reality system, and related methods.
- Augmented reality (AR) is a direct or indirect view of a physical, real-world object that is augmented (or supplemented) by computer-generated elements or sensory inputs such as sound, video, graphics or GPS data. AR technology functions by enhancing one's current perception of reality. AR technology can be implemented in the physical world whereby computer generated elements are projected onto physical objects. Alternatively, AR technology can be implemented through computing devices whereby computer generated elements are overlaid onto images of physical objects captured by the computing device. Advances in computing, such as user interface technologies, data processing, object recognition have created new opportunities for implementing augmented reality technologies on mobile computing devices, such as smartphones and tablets.
- Textile articles, such as garments, bedding, curtains, etc., are ubiquitous products. In a given day, textile articles define much of our sensory experience (e.g. sound, sight, touch, smell). Advances in textile technologies have led to new interactions between the user and textile materials. For example, wearable technologies include sensors embedded into the fabric structure of a garment. The sensors measure physiological parameters, and the signal data can be transmitted to a linked computing device. Textile articles, however, as ubiquitous as they are, have not yet converged in a meaningful way with the nearly ubiquitous digital technologies brought about by modern mobile devices and advanced communication networks.
- There is a need to implement an augmented reality application that interacts with design elements of interactive textile articles. Accordingly, an embodiment of the present disclosure is a system that includes an interactive textile article including a textile material having a face and a back that is opposed to the face. At least one of the face and the back includes a design object. The design object includes one or more design object identifiers. The design object is associated with an augmented reality software program configured to include content related to the design object. The system also includes a computing device that includes a memory containing the augmented reality program. The augmented reality program includes a plurality of augmented content levels. The computing device further includes a processor configured to execute the augmented reality program that compiles a design object depiction that is based on one or more design object identifiers in the design object. The augmented reality program can also execute a first augmented content level of the plurality of augmented content levels. The first augmented content level is configured to display 1) the design object depiction, 2) one or more augmentation elements related to the design object, and 3) at least one input element configured to control the design object depiction and the one or more augmentation elements. In such an embodiment, the textile article is a blanket, bed linen, a comforter, a rug, a carpet, a tapestry, a set of curtains, any other home textile product, or a garment.
- Another embodiment of the disclosure is a method for displaying augmented reality on a computing device based on an interactive textile article. The method includes scanning a portion of an interactive textile article that includes a design object so as to identify one or more design object identifiers of the design object. The method further includes the step of compiling a design object depiction based on the one or more design object identifiers contained in the design object. The method also includes executing a first augmented content level of the plurality of augmented content levels so as to display 1) the design object depiction, 2) one or more augmentation elements that are related to the design object, and 3) at least one input element that controls at least one of the design object depiction and the one or more augmentation elements.
- Another embodiment of the disclosure is a method of making an augmented reality system. The method includes manufacturing a textile material having a face and a back that is opposed to the face such that at least one of the face and the back includes a design object including one or more design object identifiers. The method also includes forming the textile material into a textile article. The method can include packaging the textile article with one or more access codes. The input of the access codes into a user interface running a portal on a computing device causes the augmented reality software application to be stored on a memory of the computing device. In such an embodiment, the augmented reality software application is configured to a) compile a design object depiction based on the one or more design object identifiers included in the design object. The augmented reality software application is further configured to execute a first augmented content level of a plurality of augmented content levels, so as to display 1) the design object depiction, 2) one or more augmentation elements that are related to the design object, and 3) at least one input element that controls at least one of the design object depiction and the one or more augmentation elements.
- The foregoing summary, as well as the following detailed description of illustrative embodiments of the present application, will be better understood when read in conjunction with the appended drawings. For the purposes of illustrating the present application, illustrative embodiments of the disclosure are shown in the drawings. It should be understood, however, that the application is not limited to the precise arrangements and instrumentalities shown.
- FIG. 1 is a schematic illustrating a computing device and an interactive textile article that can implement an augmented reality application, according to an embodiment of the present disclosure.
- FIG. 2 is a diagram illustrating multiple, networked computing devices, according to an embodiment of the present disclosure.
- FIG. 3 is a client computing device as shown in FIGS. 1 and 2 .
- FIG. 4 is a server computing device shown in FIG. 2 .
- FIG. 5 is a schematic diagram illustrating multiple augmented reality programs contained in a memory of a computing device.
- FIG. 6 is a schematic diagram illustrating an exemplary augmented reality program.
- FIG. 7 is an enlarged schematic view of an interactive textile.
- FIGS. 8A-8C are detailed views of design objects embodied in the textile article shown in FIG. 7 .
- FIG. 9 is a sectional view of the textile article taken along line 9-9 in FIG. 7 .
- FIG. 10 is a sectional view of a portion of a woven textile including printed design objects, according to an embodiment of the present disclosure.
- FIG. 11 is a sectional view of a portion of a woven textile with integrally formed design objects, according to an embodiment of the present disclosure.
- FIGS. 12A-12C illustrate alternative embodiments of pile fabrics with integrally formed design objects.
- FIG. 13 is a schematic plan view of a portion of a knitted textile, with integrally formed design objects, according to an embodiment of the present disclosure.
- FIGS. 14A-14F are schematic illustrations of fabric design elements.
- FIGS. 15A-15C illustrate alternative design objects applied to a textile article.
- FIGS. 16A-16C are process flow diagrams illustrating a method for implementing an augmented reality application based on design elements with an interactive textile, according to an embodiment of the present disclosure. - An embodiment of the present disclosure is a
system 1 configured to implement an augmented reality environment on the computing device 20 based on content that is related to one or more designs on the textile article 100 . The system 1 can include an interactive textile article 100 and a computing device 20 . As illustrated, the textile article 100 includes a textile material 110 that includes one or more design objects 120 . The computing device 20 can include a scanning device that can scan the textile material and can convert the scanned data into a digital data set. An augmented reality software program can cause the display of a) one or more design object depictions based on the design objects 120 contained in the scanned data set, b) augmentation elements 80 that can manipulate the displayed environment, and c) input elements 70 that control the design object depictions and augmentation elements 80 . A user can interact with the displayed augmented reality via the input elements 70 through a series of content levels that include content that is related to the particular design object. In this way, the augmented reality and interactive textile system 1 is configured to allow a user to interact with different components of the textile article 100 through a digital medium. - Referring to
FIG. 2 , an embodiment of the present disclosure is a system 1 including at least one server computing device 10 , a plurality of computing devices 20 a, 20 b, 20 c, . . . , 20 n in communication with the server computing device 10 , and one or more software applications ( FIGS. 3 and 4 ) implemented across the computing devices. Each computing device can be associated with a user, and the computing devices 20 a- 20 n can be associated via a social network. For purposes of clarifying how the software application is implemented across the various computing devices, reference number 20 is used interchangeably with reference numbers 20 a, 20 b, 20 c, . . . , 20 n. - Continuing with reference to
FIG. 2 , thesystem 1 can be implemented via exemplary architecture that includescomputing devices computing devices server computing device 10 are arranged in a client-server architecture. Theserver computing device 10 can receive and transmit data toother computing devices 20 via the communications network. In addition, one up to all thecomputing devices 20 can receive information from theother computing devices 20. And one up to all of thecomputing devices 20 can transmit information to theother computing devices 20. Furthermore, one or all of thecomputing devices other computing devices server computing device 10 tocomputing device 20 a so as to cause information to be transmitted to the memory of thecomputing device 20 a for access locally by thecomputing device 20 a. In addition or alternatively, “access” or “accessing” can include theserver computing device 10 sending an instruction tocomputing device 20 a to access information stored in the memory of thecomputing device 20 a. Reference toserver computing device 10 andcomputing device 20 a in this paragraph is exemplary and are used to only clarify use of words “access” or accessing.” -
FIG. 2 illustrates a client-server network, but the software application can be implemented over any number of network configurations. For example, in alternate embodiments, the computing devices 20 a-20 n can communicate over any suitable network configuration that supports implementation of the software application 30 on a user's computing device. - Turning to
FIG. 3, the computing device 20 is configured to receive, process, and store various information used to implement one or more software applications, such as the client software application 30 c. It will be understood that the hardware components of the computing device 20 can include any appropriate device, examples of which include a portable computing device, such as a laptop, tablet, or smart phone, or other computing devices, such as a desktop computing device or a server computing device. - As illustrated in
FIG. 3, the computing device 20 includes one or more processors 22, a memory 24, an input/output 26, and a user interface (UI) 28. It is emphasized that the operational diagram depiction of the computing device 20 is exemplary and not intended to imply a specific implementation and/or configuration. The processor 22, memory 24, input/output portion 26, and user interface 28 can be coupled together to allow communications therebetween, and can interface with the client software application 30 c. The client software application 30 c may include an application programmatic interface (API). As should be appreciated, any of the above components may be distributed across one or more separate devices. The computing device 20 can include a scanning device, such as a camera that captures an image of the design object 120. For instance, the camera may include a charge-coupled device (CCD) or a contact image sensor (CIS) as the image sensor. - Continuing with
FIG. 3, the memory 24 can be volatile (such as some types of RAM), non-volatile (such as ROM, flash memory, etc.), or a combination thereof, depending upon the exact configuration and type of processor 22. The computing device 20 can include additional storage (e.g., removable storage and/or non-removable storage) including, but not limited to, tape, flash memory, smart cards, CD-ROM, digital versatile disks (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic storage or other magnetic storage devices, universal serial bus (USB) compatible memory, or any other medium which can be used to store information and which can be accessed by the computing device 20. - Continuing with
FIG. 3, in various embodiments, the input/output portion 26 includes an antenna or an electronic connector for wired connection, or a combination thereof. In some implementations, the input/output portion 26 can include a receiver and transmitter, a transceiver, or a transmitter-receiver. The input/output portion 26 is capable of receiving and/or providing information pertaining to communication with a network such as, for example, the Internet. As should be appreciated, transmit and receive functionality may also be provided by one or more devices external to the computing device 20. For instance, the input/output portion 26 can be in electronic communication with a receiver. - Referring to
FIG. 3, the user interface 28 can include an input device and/or a display (input device and display not shown) that allows a user to communicate with the computing device 20. The user interface 28 can include inputs that provide the ability to control the computing device 20, via, for example, buttons, soft keys, a mouse, voice actuated controls, a touch screen, movement of the computing device 20, visual cues (e.g., moving a hand in front of a camera on the computing device 20), or the like. The user interface 28 can provide outputs, including visual displays. Other outputs can include audible outputs (e.g., via a speaker), mechanical outputs (e.g., via a vibrating mechanism), or a combination thereof. In various configurations, the user interface 28 can include a display, a touch screen, a keyboard, a mouse, an accelerometer, a motion detector, a speaker, a microphone, a camera, or any combination thereof. The user interface 28 can further include any suitable device for inputting biometric information, such as, for example, fingerprint information, retinal information, voice information, and/or facial characteristic information, for instance, so as to require specific biometric information for access to the computing device 20. It should be appreciated that the computing devices can operate via any suitable operating system, such as Android, BSD, iOS, Linux, OS X, QNX, Microsoft Windows, Windows Phone, and IBM z/OS. Furthermore, the software application can operate with any of the aforementioned operating systems. -
FIG. 4 is an operational diagram of the server computing device 10. The server computing device 10 includes one or more processors 12, a memory 14, an input/output 16, a user interface (UI) 18, and one or more software applications, such as the server software application 30 s. The server software application 30 s may also include an application programmatic interface (API). The processor 12, memory 14, input/output portion 16, and interface 18 can be coupled together to allow communications therebetween. As should be appreciated, any of the above components may be distributed across one or more separate server computing devices. The server computing device's processor 12, memory 14, input/output 16, and interface 18 are similar to the processor 22, memory 24, input/output 26, and interface 28 described above with respect to the computing device 20. It should be appreciated that the server computing device can operate via any suitable operating system, such as Android, BSD, iOS, Linux, OS X, QNX, Microsoft Windows, Windows Phone, and IBM z/OS. It is emphasized that the operational diagram depiction of the server computing device 10 is exemplary and not intended to imply a specific implementation and/or configuration. - The
software application 30 can comprise the client application 30 c and the server application 30 s. Accordingly, certain functions can be implemented on the server computing device 10 and other functions can be implemented on the client computing devices 20. Software application 30, client application 30 c, and server application 30 s may be used interchangeably herein. - Referring to
FIG. 5, the software application 30 can include one or more augmented reality programs 40 contained in the memory 14 of the server computing device 10 (or the client computing device 20 as needed). As illustrated, the memory 14 includes three augmented reality programs 40 a, 40 b, and 40 c. The augmented reality programs 40 can be stored on the server computing device 10 and/or optionally on the client computing device 20. Reference sign 40 will be used interchangeably with reference signs 40 a, 40 b, and 40 c. The augmented reality program 40 may also be stored in the memory 24 of a client computing device 20. In addition, in some instances, all of the augmented reality programs 40 can be stored on the server computing device 10 and transmitted to the client computing device 20. The memory 14 of the computing device 10 may include a database (not shown) that contains the association between one or more augmented reality programs 40 and one or more different design objects embodied on the textile article 100. - As shown in
FIG. 6, each augmented reality program 40 includes an image recognition module 42, a design object compiler 44, an interface module 46, and a plurality of augmented content levels 50, 52, 54. The image recognition module 42 is configured to process data captured via the scanning device, e.g. the camera, on the client computing device 20 and identify the design object. The image recognition module 42 can identify one or more design object identifiers in the captured scanned data of the design object. In some instances, spatial relationships are determined among the located design object identifiers. Based on the design object identifiers, the image recognition module 42 can compile a data set that is indicative of the design object contained in the image.
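As a rough illustration of the image recognition module's role, the following sketch models the scanned data as a binary grid and locates simple linear-edge identifiers before compiling a data set with coarse spatial information. This is a simplified sketch, not the disclosed implementation; all function and key names (`find_identifiers`, `compile_data_set`, `bbox`) are hypothetical.

```python
# Minimal sketch of an image-recognition pass: the scanned image is
# modeled as a binary grid, and "identifiers" are horizontal/vertical
# runs of set cells standing in for linear edge portions of a design.

def find_identifiers(grid):
    """Locate horizontal and vertical edge runs (length >= 2) in a binary grid."""
    ids = []
    for r, row in enumerate(grid):          # horizontal runs
        c = 0
        while c < len(row):
            if row[c]:
                start = c
                while c < len(row) and row[c]:
                    c += 1
                if c - start >= 2:
                    ids.append(("h-edge", r, start, c - 1))
            else:
                c += 1
    for c in range(len(grid[0])):           # vertical runs
        r = 0
        while r < len(grid):
            if grid[r][c]:
                start = r
                while r < len(grid) and grid[r][c]:
                    r += 1
                if r - start >= 2:
                    ids.append(("v-edge", c, start, r - 1))
            else:
                r += 1
    return ids

def compile_data_set(grid):
    """Compile identifiers plus a bounding box as coarse spatial data.

    Assumes the grid contains at least one set cell.
    """
    ids = find_identifiers(grid)
    cells = [(r, c) for r, row in enumerate(grid) for c, v in enumerate(row) if v]
    bbox = (min(r for r, _ in cells), min(c for _, c in cells),
            max(r for r, _ in cells), max(c for _, c in cells))
    return {"identifiers": ids, "count": len(ids), "bbox": bbox}

# A 4x4 scan containing an "L" shape: one vertical and one horizontal edge.
scan = [
    [1, 0, 0, 0],
    [1, 0, 0, 0],
    [1, 1, 1, 1],
    [0, 0, 0, 0],
]
data_set = compile_data_set(scan)
```

A real module would of course work on camera pixels rather than a hand-built grid, but the output shape (identifiers plus their spatial relationships) mirrors the compiled data set described above.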
- The design object compiler 44 can compile one or more design object depictions. A design object depiction is a virtual representation of the design object 120 embodied in the textile article 100. The design object compiler 44 can create the design object depiction based on the compiled data set and the geometric relationships between various design object identifiers. In other embodiments, the design object compiler 44 can also access design object depictions that may be stored in the computer memory 14 of the server device 10. In still other embodiments, the design object compiler 44 can compile design object depictions based on the compiled data set and a database of design models related to the design object. In such an embodiment, the design object compiler 44, when executed by the processor, determines which design models are associated with the compiled data parameters. Upon identification of the associated design models, the design object compiler 44 can build a design object depiction from the design model and other parameters obtained in the compiled data set, such as geometric data.
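The model-matching variant of the design object compiler can be sketched as follows. The sketch assumes a small in-memory database of design models keyed by the identifier kinds each model expects; the matching rule and all names are illustrative assumptions, not the disclosed algorithm.

```python
# Sketch of a design-object compiler: the compiled data set is matched
# against a database of design models, and the best match is built into
# a depiction carrying the geometric data from the scan.

DESIGN_MODELS = {
    "house": {"expects": {"h-edge", "v-edge", "angled-edge"}},
    "moon":  {"expects": {"curved-edge"}},
    "tree":  {"expects": {"shape"}},
}

def match_model(compiled_set):
    """Pick the model whose expected identifier kinds best cover the scan."""
    kinds = {kind for kind, *_ in compiled_set["identifiers"]}
    best, best_score = None, 0
    for name, model in DESIGN_MODELS.items():
        score = len(kinds & model["expects"])
        if score > best_score:
            best, best_score = name, score
    return best

def compile_depiction(compiled_set):
    """Build a design object depiction from the matched model plus geometry."""
    model = match_model(compiled_set)
    if model is None:
        return None
    return {"model": model, "geometry": compiled_set["identifiers"]}

# Identifier tuples of the form (kind, position, start, end).
compiled_set = {"identifiers": [("h-edge", 2, 0, 3), ("v-edge", 0, 0, 2)]}
depiction = compile_depiction(compiled_set)
```

The design choice mirrored here is the separation the paragraph describes: recognition produces parameters, a model database supplies the shape, and the compiler joins the two into a depiction.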
- The interface module 46 is configured to interface the design object depiction with the augmentation levels as well as other data related to the visual scene in the captured scanned data. In some instances, the interface module 46 can also integrate other portions of the image with the design object depiction and the augmentation levels. - Continuing with
FIG. 6, the augmented content levels 50, 52, 54 are configured to be displayed on the screen 90 (FIG. 1) of the computing device 20. Specifically, each augmented content level 50, 52, 54 causes the display on the screen 90 of 1) the respective design object depiction 60 (or 62) (FIG. 1), 2) one or more augmentation elements 80 that are related to the design object 120, and 3) at least one input element 70 configured to control the design object depiction 60 and the one or more augmentation elements 80. - The augmented reality program 40 includes content related to the
design object 120. Furthermore, the content can be thematically associated with the design object 120 on the textile article 100 obtained in the image. For instance, as illustrated in FIG. 1, one exemplary design object is a house 120 a. The augmentation elements 80 can partially define the content related to the design object depiction 60, such as the house 120 a design in the textile material 110. For instance, the augmentation elements 80 can include an arrangement of the house 120 a in a virtual neighborhood that includes a street, a yard, neighboring houses, a tree, etc. The augmentation elements 80 can also include internal structures of the house, such as a living room, kitchen, dining room, and a bedroom. The augmentation elements 80 are configured to dynamically respond to inputs received via the input elements, or to alter the environment based on predefined rules. For example, in the example shown in FIG. 1, the augmentation elements 80 may be configured to modify the displayed environment based on a perceived time of day, e.g. the sun is out during the day and sets at night, such that lighting and shadows are shown in the display based on the perceived time of day. For example, the input element 70 can cause a person 80 to enter the house, turn on the lights, and enter different rooms. Accordingly, the input elements 70 permit the user to control and/or interact with the design object depiction 60 and the augmentation elements 80. While the house 120 a is shown in FIG. 1 and described above as the design object, the design object 120 can be any particular design. The related content in the augmented reality environment can therefore vary along with the particular design object embodied on the textile article.
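The predefined-rule behavior described above, such as lighting that follows a perceived time of day, can be sketched as follows. The hour thresholds, state names, and functions are illustrative assumptions rather than the disclosed implementation.

```python
# Sketch of a rule-driven augmentation element: scene lighting changes
# with a perceived time of day, as in the sun/shadow example above.

def lighting_state(hour):
    """Map a perceived hour (0-23) to a scene lighting state."""
    if not 0 <= hour <= 23:
        raise ValueError("hour must be 0-23")
    if 6 <= hour < 18:
        # Daytime: the sun is out; shadow length varies with the hour.
        return {"sun": True, "shadows": "short" if 10 <= hour < 14 else "long"}
    # Night: no sun or shadows; house lights could be lit instead.
    return {"sun": False, "shadows": None}

def apply_rules(scene, hour):
    """Merge the rule-driven lighting into the displayed scene state."""
    updated = dict(scene)
    updated["lighting"] = lighting_state(hour)
    return updated

noon_scene = apply_rules({"design_object": "house"}, hour=12)
night_scene = apply_rules({"design_object": "house"}, hour=22)
```

The same pattern extends to any predefined rule: the rule function computes environment state, and the display layer merges it with the user-driven state coming from the input elements 70.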
- As discussed above, the augmented reality program 40 includes a first augmented content level 50, a second augmented content level 52, and a third augmented content level 54. Each augmented content level 50, 52, 54 can include one or more design object depictions 60, 62, as shown in FIG. 1. Each augmented content level may include similar content among the plurality of content levels. For example, all of the content may be related to a house. Alternatively, each augmented content level 50, 52, 54 can include content related to a different design object, such as the house 120 a in the first augmented content level 50, a tree in the second augmented content level 52, and the moon 120 d in the third augmented content level 54. - In accordance with an embodiment, the second augmented content level includes one or more design object depictions. For instance, the second augmentation level can include either a first design object depiction based on the design object contained in the initial image, or a second design object depiction based on a second design object in the textile article. The second augmentation level also includes one or more second augmentation elements that are related to the respective design object and at least one second input element that controls the design object depiction and the one or more second augmentation elements. In the example shown in
FIG. 1, if the design object 120 is the house 120 a and the first augmentation elements 80 relate to the structure of the house 120 a, the second augmentation elements can include content specific to the kitchen in the house, for instance, to allow the user to engage in virtual cooking. In another example, if the second design object is the tree next to the house, and the first augmentation element relates to the structure of the house, the second augmentation elements include content specific to the tree, such as a tree structure for climbing, tree species, etc. - Furthermore, the third augmented content level includes either the design object depiction contained in the initial image, the second design object depiction, or a third design object depiction based on a third design object of the textile article. The third augmentation level also includes one or more third augmentation elements that are related to the design object. The third augmentation level also includes at least one third input element that controls at least one of the design object depictions and the third augmentation elements. In one example, if the first design object is the house and the first augmentation elements relate to the structure of the house, the third augmentation elements can include content specific to the neighborhood, such as streets. For instance, the third augmentation level can allow a user to engage in a bike race through the neighborhood.
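The three-part bundle that each augmented content level carries can be sketched as a simple data structure. This is an illustrative sketch with hypothetical level contents drawn from the house example, not the disclosed implementation.

```python
# Sketch of the content-level structure: each augmented content level
# bundles 1) a design object depiction, 2) augmentation elements related
# to that design object, and 3) input elements that control them.

class AugmentedContentLevel:
    def __init__(self, depiction, augmentation_elements, input_elements):
        self.depiction = depiction
        self.augmentation_elements = list(augmentation_elements)
        self.input_elements = list(input_elements)

first_level = AugmentedContentLevel(
    depiction="house",
    augmentation_elements=["street", "yard", "neighboring houses", "tree"],
    input_elements=["move person", "toggle lights"],
)
second_level = AugmentedContentLevel(
    depiction="house",                    # reuses the first depiction
    augmentation_elements=["kitchen"],    # content specific to one room
    input_elements=["cook"],
)
third_level = AugmentedContentLevel(
    depiction="house",
    augmentation_elements=["neighborhood streets"],
    input_elements=["ride bike"],
)
program_levels = [first_level, second_level, third_level]
```

Note how the second and third levels may reuse the first depiction or swap in a new one (the tree, the moon), while the augmentation and input elements narrow to level-specific content.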
- As the user progresses through and completes the first
augmented content level 50, the user interface displays a content change element 92 (FIG. 1). Selection of the content change element 92 can initiate the second augmented content level 52. As the user progresses through and completes the second augmented content level 52, the user interface displays another content change element. Selection of that content change element can initiate the third augmented content level 54. At each step where a subsequent augmented content level is initiated, the application can cause the display of a payment portal to access the next augmented content level. Although three augmented reality content levels are illustrated, the augmented reality program 40 can have two or more than three augmented content levels. - Turning now to
FIGS. 7-8C, the interactive textile article 100 is illustrated. As illustrated, the textile article 100 is a blanket 102 including a border 104 that defines a perimeter of the textile article 100. The textile article 100 can, however, be any other article that includes a generally planar textile material that forms a substantial component of the article. Furthermore, a textile article 100, as used herein, includes, but is not limited to, a blanket, bed linen, a comforter, a rug, a carpet, a tapestry, a set of curtains, or a garment. The textile article 100 can be packaged with one or more access codes. The access codes can be tags, alphanumeric codes, QR codes, or hyperlinks included on the textile material itself or on tags packaged with the textile article 100. Input of the access codes into an interface on the computing device 20 causes the augmented reality software application 30 to be stored on a memory of the computing device 20. - Turning to
FIGS. 7 and 9, as described above, the textile article 100 includes a textile material 110 and at least one (e.g., a plurality of) design objects 120. The design objects include one or more design object identifiers 130. The textile material 110 has a face 112 and a back 114 that is opposed to the face 112. The design objects 120 are disposed on the face 112 of the textile material 110. In some instances, the design object may be disposed on both the face 112 and the back 114 of the textile material. The textile material is further described below. As illustrated, the textile material 110 includes multiple design objects, such as a house 120 a, a tree 120 b, clouds 120 c, a moon 120 d, and stars 120 e. FIG. 7 is an exemplary illustration of design objects embodied in the textile article 100. Thus, the design objects 120 can be any design, shape, or arrangement of elements. For instance, the design object may include, but is not limited to, a person, a building, an automobile (FIG. 15A), an airplane (FIG. 15B), a boat, a castle (FIG. 15C), sports gear (e.g. a football, soccer ball, basketball, golf club and golf ball, baseball and bat), an animal, a geographic location, a landscape, a forest, a tree, a mountain, a river, a lake, an ocean, a sun, a star, a moon, a planet, or a galaxy. The design object may also be a character from an audio-visual work, a literary work, or a theatrical work, or any type of multi-media animation. For instance, the design object can include one or more characters within the portfolio of Disney branded characters, such as characters from the Frozen franchise, or other Disney characters. In another example, the design object can include one or more characters, such as Spider-man, Superman, Batman, X-Men, or other contemporary characters. - Referring to
FIGS. 7-9C, each design object 120 includes one or more design object identifiers 130. The design object identifier can be one or more portions of the design object 120. In one example, the design object identifier 130 is one or more of the edge portions 131 of the design object 120. For example, the edge portion can be a linear edge, an angled linear edge, a curved edge, a circular edge, or an intersection of two edges. The edge portion can be defined by contrast between two colors in the design. FIGS. 8A-8C illustrate exemplary object identifiers for design objects illustrated as the house 120 a, a tree 120 b, clouds 120 c, and the moon 120 d. In FIG. 8A, the design object 120 a includes object identifiers configured as linear design elements 132 and angled elements 134. In FIG. 8B, the moon 120 d includes object identifiers configured as curved elements 136. In FIG. 8C, the tree 120 b includes object identifiers configured as monolithic shape elements 138. - The object identifiers are illustrated as components of the design object. In alternative embodiments, the object identifier can be a code, such as an alphanumeric code or QR code, embedded into or printed onto the textile material. Such an electronic code can be associated with a predefined design object stored in the
server memory 14. The image recognition module 42 can be used to identify the design object on the textile material. In still other embodiments, the object identifier can include an electronic object identifier in the form of a transmitter formed into the textile structure via a yarn or conductive printed materials. The electronic object identifier can transmit a signal concerning the design object to the computing device. - The
textile material 110 can be a woven fabric, a knit fabric (weft or warp), a nonwoven fabric, or laminates thereof. In other embodiments, the textile material can be a laminate that includes: a) one or more films and a woven fabric; b) one or more films and a knit fabric; or c) one or more films and a nonwoven fabric. Regardless of the construction, the design object 120 can be integrally formed into the textile material, as illustrated in FIGS. 11-13. Alternatively, the design object may be printed onto the textile material, as illustrated in FIGS. 9, 10, and 15A-15C. - In accordance with one embodiment, the
textile material 110 can be a woven fabric 140 that defines a woven structure with the design object 120 integrally formed into the woven structure. As shown in FIG. 11, the woven fabric 140 b includes a plurality of warp yarns 142 b interwoven with a plurality of weft yarns 144 b to define the design object 120. For instance, the areas where the warp yarns 142 b are exposed to the face of the fabric can be arranged in such a way so as to define the design object 120 and the object identifier 130. In such an example, woven fabrics can be based on dobby, jacquard, complex (multiple warp), or 3D fabric weaving systems, which can be used to create simple to complex patterns on the face of the woven fabric. FIG. 11 illustrates an embodiment of a woven fabric 140 a utilizing a multiple warp system to create complex design objects. -
FIGS. 14A-14F are schematic diagrams illustrating how the warp yarns and weft yarns in the woven fabric design can define the object identifier 130. Each diagram includes multiple unit cells 190 with a machine or warp direction 2 that is perpendicular to a cross or weft direction 4. For woven structures, the shaded cells 192 represent a warp yarn exposed to the fabric face over a weft yarn. Thus, the woven fabric 140 can define a series of warp yarns exposed on the face in such a way as to define a linear edge design element 132. An angled linear design element 134 is shown in FIG. 14A, and a linear design element 132 aligned with the warp direction is shown in FIG. 14D. Furthermore, the fabric design can define two linear elements 132 a that intersect, as illustrated in FIG. 14B. In another example, the woven fabric design can define a circular design element 139, as shown in FIG. 14C. In yet another example, as shown in FIG. 14E, the woven structure can define a curved element 136. In still another example, shown in FIG. 14F, the woven structure can define a complete shape element 138, depending on the type of fabric formation system used. Furthermore, woven structures can appear pixilated at the location where the warp yarn travels back beneath the weft yarn (see, e.g., the intersection of unit cells in FIGS. 14A-14F). The image recognition module 42 described above is configured to account for such pixilation by compiling a design element line along the identified design element, similar to the line 191 shown in FIGS. 14A-14F. The design element line 191 can be used to define the edge of the design object, and therefore an object identifier as described above. Furthermore, in alternative embodiments, the woven fabric can include distinct sets of weft yarns or in-laid yarns that are exposed to the face adjacent the warp facing yarns. The distinct in-laid yarns may be a different color, construction, or fiber type that helps further define the specific design element and line path 191.
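The compilation of a design element line, like the line 191 described above, from pixilated woven cells can be illustrated with a short sketch: a least-squares fit through the centers of the shaded unit cells recovers the underlying edge despite the stair-stepped weave. The cell coordinates are an assumed example, not values taken from the figures.

```python
# Sketch: recover a smooth design element line from pixilated unit cells
# by fitting y = m*x + b through the shaded cell centers.

def fit_design_element_line(cells):
    """Least-squares line through (x, y) cell centers; returns (slope, intercept)."""
    n = len(cells)
    sx = sum(x for x, _ in cells)
    sy = sum(y for _, y in cells)
    sxx = sum(x * x for x, _ in cells)
    sxy = sum(x * y for x, y in cells)
    m = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    b = (sy - m * sx) / n
    return m, b

# Shaded cells of a 45-degree angled edge: one cell right, one cell up.
stair = [(0, 0), (1, 1), (2, 2), (3, 3), (4, 4)]
slope, intercept = fit_design_element_line(stair)
```

The fitted line then serves as the edge of the design object, i.e. an object identifier, even though no single row of cells forms a continuous edge.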
- FIGS. 12A-12C illustrate a textile material configured as a pile fabric that includes a pile structure and a design object integrally formed in the pile structure. In such embodiments, specific piles can be formed from different colored yarns arranged so as to define a portion of the design object 120. In addition, one or more pile floats can be formed into the pile fabric to define a design element, such as an edge of a design object 120. As shown in FIG. 12A, a pile fabric 150 a includes looped piles 152 a and a ground or base fabric 154 a. The looped piles 152 a are selectively placed along a warp and weft direction 2 and 4 (weft direction 4 not shown) so as to define a portion of the design object 120 and an edge that defines a design object identifier. As shown in FIG. 12B, a pile fabric 150 b includes cut piles 152 b and a ground or base fabric 154 b. The cut piles 152 b are selectively placed along a warp and weft direction 2 and 4 (weft direction 4 not shown) so as to define a portion of the design object 120. Floats can define an edge or design object identifier 130. As shown in FIG. 12C, a pile fabric 150 c is configured as a tufted rug that includes cut piles 152 c and a ground or base fabric 154 c. One or more of the cut piles 152 c include colored yarns 156 c that are selectively placed along a warp and weft direction 2 and 4 (not shown) so as to define a portion of the design object 120. The edge of the colored yarns 156 c can define an edge or design object identifier 130. -
FIGS. 14A-14F can also illustrate how various pile yarns in a pile fabric can define the design object 120 and the one or more object identifiers 130. For instance, for a pile fabric, each unit cell in the design illustrations in FIGS. 14A-14F could represent a single pile. The shaded cells 192 can represent specific piles formed from different colored yarns arranged so as to define a portion of the design object. Unshaded cells can represent different yarns or pile floats. Thus, the pile fabric can include pile yarns of a first color or type arranged to define an angled linear design element 134, as shown in FIG. 14A, or a linear design element 132 aligned with the warp direction 2, as shown in FIG. 14D. Furthermore, the pile fabric design can arrange the first color pile yarns to define: two linear elements 132 a that intersect, as illustrated in FIG. 14B; a circular design element 139, as shown in FIG. 14C; a curved element 136, as shown in FIG. 14E; or a complete shape element 138, as shown in FIG. 14F. - In another embodiment, the textile material is a
knit fabric 160 that includes a knit structure 161 and design objects 120 integrally formed into the knit structure 161. FIG. 13 illustrates an exemplary schematic of a knit fabric 160 with a knit structure 161 having a plurality of knit stitches 162 that define the design object 120. The arrangement of the knit stitches 162 can further define the object identifier 130. The knit fabric 160 may be a weft knit, such as a double knit, rib knit, or jacquard knit, formed to define specific arrangements of knit stitches 162. The knit stitches 162 may include tuck, held, or float stitches, and/or inlaid yarns. - Furthermore,
FIGS. 14A-14F also illustrate how various yarn components in a knit structure define the design object 120 and the object identifier 130. For instance, for knit fabrics 160, each unit cell in the design illustrations in FIGS. 14A-14F could represent a knit stitch, and the shaded cells 192 can represent a specific stitch type, such as a tuck, held, or float stitch, or an inlay. The shaded cells 192 thus represent a knit structural component that is used to define a portion of the design object 120. Again, specific yarn types can be inserted or inlaid adjacent to the knit stitches represented by the shaded cells 192 in order to further define the edge or design element, as described above. Accordingly, the knit fabric 160 can define an arrangement of stitches to define: an angled linear design element 134, as shown in FIG. 14A; a linear design element 132 aligned with the warp direction 2, as shown in FIG. 14D; two linear elements 132 a that intersect, as illustrated in FIG. 14B; a circular design element 139, as shown in FIG. 14C; a curved element 136, as shown in FIG. 14E; or a complete shape element 138, as shown in FIG. 14F. The knit fabric may alternatively be a warp knit, such as a tricot or raschel warp knitted fabric. - The woven, knit, and pile fabrics can be formed from a wide range of fiber types and yarn formation systems. For example, the woven, knit, and pile fabrics can be formed from any number of yarn types, such as spun yarns or continuous filament yarns. Spun yarns may include natural fibers, synthetic fibers, or blends of natural and synthetic staple fibers. Natural fibers include cotton, wool, or others. Synthetic fibers may include polyethylene terephthalate (PET), viscose rayon, acrylic, or other fiber types, such as flame resistant fibers, as needed.
Suitable thermoplastic synthetic staple fibers may be mono-component, bi-component, or tri-component type fibers. Such fibers can be splittable to define microfibers. A variety of yarn spinning types can be used, such as ring spun, open end, air-jet, and the like. Spun yarns can therefore include spun cotton yarns and/or spun cotton and polyethylene terephthalate (PET) blended yarns. Continuous filament yarns may include either or both mono-component or bi-component filament types. Continuous filament yarns can include polyethylene terephthalate, polyolefin, polyamide 6, polyamide 6,6, and/or polylactic acid filaments.
- The
textile material 110 may also be a nonwoven fabric having a design object 120 printed thereon. Any suitable printing technique can be used to define the design object. Suitable nonwoven fabrics include melt-spun nonwovens, such as spunbond and meltblown materials. A meltspun nonwoven can include a single spunbond layer, multiple spunbond layers, a single meltblown layer, multiple meltblown layers, or multiple layers of spunbond and meltblown materials. Meltspun nonwovens can be formed with polyethylene terephthalate, polyolefin, polyamide 6, polyamide 6,6, or polylactic acid polymers. Nonwoven fabrics can be formed with mono-component, bi-component, or tri-component fibers. Such fibers can be splittable and further processed to define microfibers. One suitable nonwoven fabric is Evolon® manufactured by Freudenberg & Co. Alternatively, the nonwoven fabrics can be thermally bonded, chemically bonded, and/or mechanically bonded, e.g. via needling, stitch bonding, or hydraulic bonding. In alternative embodiments, the nonwoven fabrics can be embossed to define one or more design objects. - In another embodiment, the
design object 120 can be printed on any one of the aforementioned textile materials, including nonwoven materials. Printing can include digital printing, screen printing, or sublimation printing. For instance, in one example, the textile material can be a woven fabric 140 with the design object 120 printed on its face 112 (and/or back). In another example, the textile material can be a weft knitted fabric 160 with the design object 120 printed on its face 112 (and/or back). In another example, the textile material can be a warp knitted fabric with the design object 120 printed on its face 112 (and/or back). In another example, the textile material can be a nonwoven fabric with the design object 120 printed on its face 112 (and/or back). -
FIGS. 15A-15C illustrate alternative embodiments of the textile article. Each illustrated textile article includes a textile material and one or more design objects. In FIG. 15A, the design object 320 a can be an automobile, such as a race car, and the content in the related augmented reality program is related to race cars. In FIG. 15B, the design object 320 b can be a castle, and the content in the related augmented reality program is related to a castle. In FIG. 15C, the design object 320 c can be an airplane, and the content in the related augmented reality program is related to an airplane. -
FIGS. 16A-16C illustrate a method 200 for implementing an augmented reality environment utilizing the interactive textile article 100 and the computing device 20. The method initiates in block 202 when the user accesses and stores the application 30 on the computing device 20. In block 202, the user can input an electronic access code packaged with the textile article 100 into an interface on the computing device 20. For instance, the user can access a portal via the computing device 20 to download the application 30. In block 204, a scanning device on the computing device 20 scans a portion of an interactive textile article that includes a design object. In one example, scanning the textile includes capturing an image of the textile via a camera. In block 206, the application 30, for instance the image recognition module, identifies one or more design object identifiers of the design object in the image. In block 210, the application compiles a design object depiction based on the one or more design object identifiers contained in the design object. Process control is transferred to block 214. - In
block 214, the first augmented content level is initiated. Process control is transferred to block 220. In block 220 (FIG. 16B), the first augmented content level displays on the screen 90 (FIG. 1) 1) the design object depiction, 2) one or more augmentation elements that are related to the design object, and 3) at least one input element that controls one of the design object depiction and the one or more augmentation elements. - In
block 224, the application determines if the input elements are engaged. If the input elements have not been engaged, process control is transferred to block 228. In block 228, the first augmentation level progresses passively. If the input elements have been engaged, process control is transferred to block 232, where the first augmentation level progresses interactively. For instance, the user can control interaction of the design object depiction with the at least one augmentation element. - In
block 236, the application causes the display of portals or elements, the selection of which causes the display of additional augmentation levels, such as the second augmentation level. In block 240, the application determines if the second augmentation level element has been selected by the user. If the second augmentation level element is not selected by the user, process control is transferred to block 232. The user can progress through the first augmentation level interactively. If the second augmentation level element has been selected by the user, process control is transferred to block 242. - In
block 242, the application initiates the second augmented content level. Process control is transferred to block 244. In block 244, the second augmented content level displays 1) the first design object depiction and/or the second design object depiction, 2) one or more second augmentation elements that are related to the respective design object, and 3) at least one second input element that controls at least one of the design object depiction and the one or more second augmentation elements. Process control is transferred to block 248. - In
block 248, the application determines if the second input elements are engaged. If the second input elements have not been engaged, process control is transferred to block 252. In block 252, the second augmentation level progresses passively. If the second input elements have been engaged, process control is transferred to block 256, where the second augmentation level progresses interactively. For instance, the user can control interaction of the first and/or second design object depiction with the second augmentation elements. - In
block 260, the application causes the display of content change elements 92 (FIG. 1). Selection of a content change element 92 causes the display of additional augmentation levels, such as the third augmentation level. In block 260, the application determines if the third augmentation level element has been selected by the user. If the third augmentation level element is not selected by the user, process control is transferred to block 256 and the user can progress through the second augmentation level interactively. If the third augmentation level element has been selected by the user, process control is transferred to block 274. In block 274, the method progresses through steps similar to the steps illustrated in blocks 242 through 270. For instance, additional augmentation levels, such as the third augmentation level, can be initiated. At each step (blocks 242 and 272) where a subsequent augmented content level is initiated, the application can cause the display of a payment portal to access the next augmented content level. - In another embodiment, multiple users can be networked together so as to interface with each other's augmented reality environments. As described above, each
computing device 20 can run the augmented reality program, and the respective programs can interface with one another so that the augmented content levels run in a common augmented reality environment displayed on each computing device. - Another embodiment of the present disclosure is a method for making an augmented reality system including an interactive textile article. The method includes manufacturing a textile material to include a design object having one or more design object identifiers. Then, the textile material is formed into a textile article, such as by cutting, hemming, and adding binding as needed. The textile article is packaged with one or more electronic access codes as described above. The codes can be tags, alphanumeric codes, QR codes, or hyperlinks. Input of the codes into an interface on the
computing device 20 causes the augmented reality software application 30 to be stored on a memory of the computing device 20. As noted above, the textile article can be manufactured according to the formation systems described above, such that the design object is printed on the textile material or the design object is integrally formed into the textile structure. - While the disclosure is described herein using a limited number of embodiments, these specific embodiments are not intended to limit the scope of the disclosure as otherwise described and claimed herein. The precise arrangement of the various elements and the order of the steps of the articles and methods described herein are not to be considered limiting. For instance, although the steps of the methods are described with reference to a sequential series of reference signs and progression of the blocks in the Figures, the method can be implemented in any particular order, as desired.
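The scan-identify-compile flow of blocks 202 through 210 can be sketched as a lookup from detected design object identifiers to a displayable depiction. The sketch below is a hypothetical illustration only; the identifier strings, catalog contents, and function names are invented for this example and are not taken from the disclosure.

```python
# Hypothetical catalog mapping sets of design object identifiers
# (e.g. detected edges or markers embedded in the textile) to a
# named design object depiction. All entries are illustrative.
CATALOG = {
    frozenset({"edge:hull", "edge:spoiler", "marker:rc01"}): "race_car",
    frozenset({"edge:tower", "edge:gate", "marker:ca01"}): "castle",
}

def identify_design_object(scanned_identifiers):
    """Match identifiers found in a scanned image against the catalog
    (blocks 204-206): return the depiction whose identifiers are all present."""
    found = set(scanned_identifiers)
    for identifiers, depiction in CATALOG.items():
        if identifiers <= found:  # every catalog identifier was detected
            return depiction
    return None

def compile_depiction(depiction_name):
    """Compile the design object depiction together with its related
    augmentation elements (block 210). Element names are invented."""
    augmentations = {
        "race_car": ["track", "pit_crew"],
        "castle": ["drawbridge", "dragon"],
    }
    return {
        "depiction": depiction_name,
        "augmentation_elements": augmentations.get(depiction_name, []),
    }
```

For example, a scan yielding `["edge:hull", "edge:spoiler", "marker:rc01"]` plus background noise would still resolve to the race car depiction, since matching tolerates extra detections.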
Claims (20)
1. A system, comprising:
an interactive textile article including a textile material having a face and a back that is opposed to the face, at least one of the face and the back including a design object, the design object including one or more design object identifiers, wherein the design object is associated with an augmented reality software program configured to include content related to the design object; and
an augmented reality program contained in a memory of a computing device and executable by a processor, the augmented reality program including a plurality of augmented content levels, the augmented reality program when executed by the processor:
a) compiles a design object depiction that is based on the one or more design object identifiers included in the design object; and
b) executes a first augmented content level of the plurality of augmented content levels, the first augmented content level configured to display on a computing device 1) the design object depiction, 2) one or more augmentation elements that are related to the design object, and 3) at least one input element configured to control the design object depiction and the one or more augmentation elements.
2. The system of claim 1 , wherein the textile article is a blanket, bed linen, a comforter, a rug, a carpet, a tapestry, a set of curtains, or a garment.
3. The system of claim 1 , wherein the textile material is a woven fabric that defines a woven structure, wherein the design object is integrally formed into the woven structure.
4. The system of claim 1 , wherein the textile material is a knit fabric that defines a knit structure, wherein the design object is integrally formed into the knit structure.
5. The system of claim 1 , wherein the textile material is a pile fabric including a plurality of pile structures, and the design object is embodied in the plurality of pile structures.
6. The system of claim 1 , wherein the design object is printed onto the textile material.
7. The system of claim 6 , wherein the design object is digitally printed onto the textile material.
8. The system of claim 6 , wherein the design object is sublimation printed onto the textile material.
9. The system of claim 1 , wherein each design object identifier is an edge of the design object.
10. The system of claim 1 , wherein at least one of the design object identifiers is a marker embedded in the textile material.
11. The system of claim 1 , wherein the processor is configured to execute a second augmented content level that includes the design object depiction, one or more second virtual elements, and at least one second input element, and wherein engagement of the at least one second input element causes progression through the second augmented content level.
12. The system of claim 11 , wherein the processor is configured to execute a third augmented content level that includes the design object depiction, one or more third virtual elements, and at least one third input element, and wherein engagement of the at least one third input element causes progression through the third augmented content level.
13. The system of claim 1 , wherein the computing device includes a camera configured to obtain the image of the design object.
14. The system of claim 1 , wherein the augmented reality program is configured to interface with one or more additional augmented reality programs running on one or more additional computing devices.
15. The system of claim 14 , wherein the computing device is a first computing device, and the one or more additional computing devices is a second computing device, and wherein the respective augmented reality programs, when executed, interface the first augmented content level of the first computing device with the first augmented content level running on the second computing device in a common augmented reality environment displayed on the first and second computing devices.
16. A method for displaying augmented reality on a computing device, the method comprising the steps of:
scanning a portion of an interactive textile article that includes a design object so as to identify one or more design object identifiers of the design object;
compiling a design object depiction based on the one or more design object identifiers; and
initiating a first augmented content level so as to display 1) the design object depiction, 2) one or more augmentation elements that are related to the design object, and 3) at least one input element that controls at least one of the design object depiction and the one or more augmentation elements.
17. The method of claim 16 , further comprising the step of controlling interaction of the design object depiction with the one or more augmentation elements.
18. The method of claim 16 , further comprising the step of progressing through a series of displays of the first augmented content level.
19. The method of claim 16 , further comprising the step of initiating a second augmented content level so as to display 1) the design object depiction, 2) one or more second augmentation elements that are related to the design object, and 3) at least one second input element that controls at least one of the design object depiction and the one or more second augmentation elements.
20. The method of claim 16 , further comprising the step of, after progression through the second augmented content level, initiating a third augmented content level so as to display 1) the design object depiction, 2) one or more third augmentation elements that are related to the design object, and 3) at least one third input element that controls at least one of the design object depiction and the one or more third augmentation elements.
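The staged progression recited in claims 16-20 (and in blocks 214 through 274 of the description) amounts to a small state machine: each augmented content level starts out progressing passively, turns interactive when an input element is engaged, and advances only when the user selects the next-level element. The following sketch is a hypothetical illustration under those assumptions; the level contents and class name are invented.

```python
class AugmentedContentSession:
    """Minimal sketch of augmented content level progression. Each level
    displays the design object depiction with its own augmentation
    elements; later levels could sit behind a payment portal, as the
    description suggests. Level data here is purely illustrative."""

    LEVELS = [
        {"level": 1, "augmentation_elements": ["track"]},
        {"level": 2, "augmentation_elements": ["rivals", "weather"]},
        {"level": 3, "augmentation_elements": ["championship"]},
    ]

    def __init__(self, depiction):
        self.depiction = depiction
        self.index = 0          # current augmented content level
        self.mode = "passive"   # passive until an input element is engaged

    def engage_input(self):
        """Engaging an input element switches to interactive progression."""
        self.mode = "interactive"

    def display(self):
        """Return what the current level shows on the screen."""
        level = self.LEVELS[self.index]
        return {
            "depiction": self.depiction,
            "level": level["level"],
            "augmentation_elements": level["augmentation_elements"],
            "mode": self.mode,
        }

    def select_next_level(self):
        """Advance when the user selects the next-level element; each new
        level begins passively again. Returns False at the final level."""
        if self.index + 1 < len(self.LEVELS):
            self.index += 1
            self.mode = "passive"
            return True
        return False
```

A session thus starts at level 1 in passive mode, switches to interactive on `engage_input()`, and resets to passive each time `select_next_level()` succeeds.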
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US15/385,402 US20170103579A1 (en) | 2015-08-31 | 2016-12-20 | Interactive textile article and augmented reality system |
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
IN3334/MUM/2015 | 2015-08-31 | ||
IN3334MU2015 | 2015-08-31 | ||
US14/856,250 US9524589B1 (en) | 2015-08-31 | 2015-09-16 | Interactive textile article and augmented reality system |
US15/385,402 US20170103579A1 (en) | 2015-08-31 | 2016-12-20 | Interactive textile article and augmented reality system |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/856,250 Continuation US9524589B1 (en) | 2015-08-31 | 2015-09-16 | Interactive textile article and augmented reality system |
Publications (1)
Publication Number | Publication Date |
---|---|
US20170103579A1 true US20170103579A1 (en) | 2017-04-13 |
Family
ID=57538660
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/856,250 Active US9524589B1 (en) | 2015-08-31 | 2015-09-16 | Interactive textile article and augmented reality system |
US15/385,402 Abandoned US20170103579A1 (en) | 2015-08-31 | 2016-12-20 | Interactive textile article and augmented reality system |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/856,250 Active US9524589B1 (en) | 2015-08-31 | 2015-09-16 | Interactive textile article and augmented reality system |
Country Status (4)
Country | Link |
---|---|
US (2) | US9524589B1 (en) |
EP (2) | EP3135356A1 (en) |
ES (1) | ES2905219T3 (en) |
WO (1) | WO2017037633A1 (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20190257011A1 (en) * | 2018-02-21 | 2019-08-22 | Welspun India Limited | Soft twist terry article |
Families Citing this family (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10108846B2 (en) * | 2016-09-30 | 2018-10-23 | Autodesk, Inc. | Scanning for materials and scale information |
USD879509S1 (en) * | 2016-11-14 | 2020-03-31 | Welspun India Ltd. | Duvet |
USD879508S1 (en) * | 2016-11-14 | 2020-03-31 | Welspun India Ltd. | Pillow cover |
US11613831B2 (en) * | 2018-04-14 | 2023-03-28 | Ronak Rajendra Gupta | High thread/yarn count woven textile fabric and process of preparation thereof |
IN201821014302A (en) * | 2018-04-14 | 2018-10-05 | ||
US10831261B2 (en) | 2019-03-05 | 2020-11-10 | International Business Machines Corporation | Cognitive display interface for augmenting display device content within a restricted access space based on user input |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20090075075A1 (en) * | 2007-02-14 | 2009-03-19 | High Voltage Graphics, Inc. | Sublimation dye printed textile |
US20100316832A1 (en) * | 2009-04-10 | 2010-12-16 | High Voltage Graphics, Inc. | Flocked article having a woven insert and method for making the same |
US20130004747A1 (en) * | 2009-11-09 | 2013-01-03 | Stephen Schwarz | Textile Composite Article |
US20150002506A1 (en) * | 2013-06-28 | 2015-01-01 | Here Global B.V. | Method and apparatus for providing augmented reality display spaces |
Family Cites Families (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8206219B2 (en) * | 2002-10-30 | 2012-06-26 | Nike, Inc. | Interactive gaming apparel for interactive gaming |
DE102004021499B3 (en) | 2004-04-30 | 2005-12-08 | Rinke Etiketten Karl Rinke Gmbh & Co. Kg | Method for producing a printed label |
US8547401B2 (en) * | 2004-08-19 | 2013-10-01 | Sony Computer Entertainment Inc. | Portable augmented reality device and method |
US7885857B1 (en) * | 2004-11-15 | 2011-02-08 | Kaoru Fukuya | Appearel production method and system |
US8606645B1 (en) * | 2012-02-02 | 2013-12-10 | SeeMore Interactive, Inc. | Method, medium, and system for an augmented reality retail application |
JP6192264B2 (en) * | 2012-07-18 | 2017-09-06 | 株式会社バンダイ | Portable terminal device, terminal program, augmented reality system, and clothing |
WO2014031899A1 (en) * | 2012-08-22 | 2014-02-27 | Goldrun Corporation | Augmented reality virtual content platform apparatuses, methods and systems |
US9224231B2 (en) * | 2012-09-14 | 2015-12-29 | Nagabhushanam Peddi | Augmented reality system indexed in three dimensions |
US9588730B2 (en) * | 2013-01-11 | 2017-03-07 | Disney Enterprises, Inc. | Mobile tele-immersive gameplay |
US20140270335A1 (en) * | 2013-03-14 | 2014-09-18 | Vor Data Systems, Inc. | System and Method for Embedding and Retrieving Covert Data in Overt Media |
US20140340423A1 (en) * | 2013-03-15 | 2014-11-20 | Nexref Technologies, Llc | Marker-based augmented reality (AR) display with inventory management |
BR112016003987B1 (en) * | 2013-09-04 | 2021-10-05 | Biteam Ab | THREE-DIMENSIONAL FABRIC ARTICLE, METHOD FOR PRODUCING A THREE-DIMENSIONAL FABRIC ARTICLE AND APPARATUS FOR PRODUCING A THREE-DIMENSIONAL FABRIC ARTICLE |
-
2015
- 2015-09-16 US US14/856,250 patent/US9524589B1/en active Active
- 2015-12-24 EP EP15202707.4A patent/EP3135356A1/en not_active Ceased
-
2016
- 2016-08-31 EP EP16840941.5A patent/EP3381193B1/en active Active
- 2016-08-31 WO PCT/IB2016/055203 patent/WO2017037633A1/en active Application Filing
- 2016-08-31 ES ES16840941T patent/ES2905219T3/en active Active
- 2016-12-20 US US15/385,402 patent/US20170103579A1/en not_active Abandoned
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20190257011A1 (en) * | 2018-02-21 | 2019-08-22 | Welspun India Limited | Soft twist terry article |
US11021816B2 (en) * | 2018-02-21 | 2021-06-01 | Welspun India Limited | Soft twist terry article |
US20210285134A1 (en) * | 2018-02-21 | 2021-09-16 | Welspun India Limited | Soft twist terry article |
US11702774B2 (en) * | 2018-02-21 | 2023-07-18 | Welspun India Limited | Soft twist terry article |
Also Published As
Publication number | Publication date |
---|---|
EP3135356A1 (en) | 2017-03-01 |
EP3381193A1 (en) | 2018-10-03 |
WO2017037633A1 (en) | 2017-03-09 |
EP3381193B1 (en) | 2021-11-03 |
US9524589B1 (en) | 2016-12-20 |
EP3381193A4 (en) | 2019-07-17 |
ES2905219T3 (en) | 2022-04-07 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US9524589B1 (en) | Interactive textile article and augmented reality system | |
US11550401B2 (en) | Virtual or augmediated topological sculpting, manipulation, creation, or interaction with devices, objects, materials, or other entities | |
JP6660344B2 (en) | Method and system for manufacturing clothing | |
US9550124B2 (en) | Projection of an interactive environment | |
Simonetti Ibañez et al. | Vuforia v1. 5 SDK: Analysis and evaluation of capabilities | |
Prain | Strange material: Storytelling through textiles | |
CN109564703A (en) | Information processing unit, method and computer program | |
Kuusk | Crafting sustainable smart textile services | |
CN107844195B (en) | Intel RealSense-based development method and system for virtual driving application of automobile | |
US10698474B2 (en) | Apparatus and method for designing patterns for wearable items | |
Chu et al. | Game interface using digital textile sensors, accelerometer and gyroscope | |
WO2023141340A1 (en) | A user controlled three-dimensional scene | |
Brown et al. | Knitalong: celebrating the tradition of knitting together | |
Atkin | Nostalgia, Myth, and Memories of Dress | |
KR20190080530A (en) | Active interaction system and method for virtual reality film | |
CN114241132B (en) | Scene content display control method and device, computer equipment and storage medium | |
CN114660918B (en) | Holographic stereoscopic image display device, method, device and medium | |
Chapin | We Got Here Under False Pretenses | |
Shaikh | Tactile Textile |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |