GB2566734A - Wearable device, system and method - Google Patents
- Publication number
- GB2566734A (Application GB1715486.5A)
- Authority
- GB
- United Kingdom
- Prior art keywords
- location
- augmented reality
- operable
- user
- digital asset
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Withdrawn
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/017—Head mounted
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/16—Constructional details or arrangements
- G06F1/1613—Constructional details or arrangements for portable computers
- G06F1/163—Wearable computers, e.g. on a belt
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
- G06T19/006—Mixed reality
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/017—Head mounted
- G02B27/0176—Head mounted characterised by mechanical features
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Computer Hardware Design (AREA)
- Human Computer Interaction (AREA)
- Computer Graphics (AREA)
- Software Systems (AREA)
- Optics & Photonics (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
A wearable device 100 is adapted to be retained on a head of a user. The wearable device has an augmented reality apparatus and communicates with a first communication module 201 associated with a first location so as to determine whether the device 100 is in the first location. When, as a result of the communication, it is determined that the device 100 is in the first location, the augmented reality apparatus provides augmented reality image data of a first digital asset, the first digital asset being associated with the first location. A plurality of communication modules 201 may be provided in different locations such that different digital assets are provided dependent on the location of the device 100. The device may be used in tours of locations such as a museum or theme park.
Description
WEARABLE DEVICE, SYSTEM AND METHOD
The present disclosure relates to a wearable device, system and method. In particular, the wearable device is adapted to be retained on a head of a user and comprises an augmented reality apparatus.
Background
Augmented reality apparatuses in the form of goggles, glasses and headsets are known; one example is the HOLOLENS ® by Microsoft Corporation. These augmented reality apparatuses have been successful in adding interactive, animated augmented reality content and real-time data and information to a scene as viewed by a user of the augmented reality apparatus.
It is an object of the present disclosure to improve on existing augmented reality apparatuses, or at least provide an alternative to them.
Summary
According to the present disclosure there is provided a device, system and method as set forth in the appended claims. Other features of the invention will be apparent from the dependent claims, and the description which follows.
According to a first aspect there may be provided a wearable device adapted to be retained on a head of a user, the wearable device comprising an augmented reality apparatus, wherein the device is operable to communicate with a first communication module associated with a first location so as to determine whether the device is in the first location, and wherein when, as a result of the communication, it is determined that the device is in the first location, the augmented reality apparatus is operable to provide augmented reality image data of a first digital asset, the first digital asset being associated with the first location.
Significantly, the augmented reality apparatus may thus provide augmented reality image data of a first digital asset associated with the first location to the user of the device when the user is in the first location. The wearable device may advantageously be used in an interactive setting where a user moves to different locations. This may be, for example, during a tour in a visitor attraction, museum, zoo or theme park environment. That is, the wearable device may provide or be part of a touring system for the user that enables the user to experience different augmented reality image data as they move through the touring environment. In such examples, a user of the wearable device may be led or directed through different locations, each of which is associated with a communication module. The communication modules are physically separate from the device and may be physically located in the different locations. As the user moves to the different locations, the communication between the device and the communication modules is able to be used to identify which location the user is in. As a result, different digital assets dependent on the location of the user may be provided, and the user thus benefits from an immersive interactive environment. The augmented reality image data for the different digital assets may be pre-stored on the device (e.g. before the tour starts), and, in response to it being determined that the device is in a particular location, the augmented reality apparatus may be operable to display the relevant augmented reality image data for the digital asset associated with that location. In addition or alternatively, the augmented reality image data is transmitted to the device from a server apparatus in response to determining that the device is in a particular location. For example, part of the augmented reality image data may be pre-stored on the device, and the augmented reality image data may be updated and/or supplemented as the tour progresses by data transmission from a server apparatus.
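By way of illustration only, and not forming part of the claimed subject-matter, the following minimal sketch shows one way the selection logic described above could be organised: augmented reality image data pre-stored on the device is keyed by location identifier, with an optional server fetch used to supplement or update it. All class, function and parameter names are illustrative assumptions rather than features of any particular implementation.

```python
# Illustrative sketch only: pre-stored assets keyed by location, with a
# hypothetical server fallback. All names and interfaces are assumptions.
from typing import Callable, Dict, Optional

class AssetStore:
    def __init__(self, preloaded: Dict[str, bytes],
                 fetch_from_server: Optional[Callable[[str], bytes]] = None):
        self._assets = dict(preloaded)      # assets loaded before the tour starts
        self._fetch = fetch_from_server     # optional server-side supplement

    def asset_for_location(self, location_id: str) -> Optional[bytes]:
        asset = self._assets.get(location_id)
        if asset is None and self._fetch is not None:
            # Supplement or update pre-stored content as the tour progresses
            asset = self._fetch(location_id)
            self._assets[location_id] = asset
        return asset

def on_location_determined(location_id: str, store: AssetStore, display) -> None:
    """Called when communication with a module places the device in a location."""
    asset = store.asset_for_location(location_id)
    if asset is not None:
        display(asset)                       # hand the image data to the AR renderer
```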
The wearable device may comprise a headset arranged to retain the augmented reality apparatus. The headset may comprise a band arranged to extend around at least part of the circumference of the head such that the device may be supported on the crown region of the head of the user. The headset may comprise cushioning material disposed around the band of the headset. The headset may comprise at least one strap (or other supporting component) arranged to support the device over the top of the head of the user. This might provide support in addition to, or as an alternative to, the band around the crown of the head. The at least one strap may be constructed of a lightweight, breathable material. The at least one strap may be a breathable mesh strap.
The headset may further comprise an adjustment arrangement for enabling the position and/or the tightness of the headset around the head of the user to be adjusted. The adjustment arrangement may be located on the nape region of the head of the user. This might provide support in addition to, or as an alternative to, the band around the crown of the head. The adjustment arrangement may comprise a rotatable knob such that the position and/or the tightness of the headset may be adjusted by rotating the knob. The adjustment arrangement may be connected to the remainder of the headset by at least one strap.
The headset retains the augmented reality apparatus. This simplifies the process by which a user may wear the wearable device. Positioning an augmented reality apparatus (e.g. augmented reality glasses) over the eyes of the user may be difficult and the user may have to precisely adjust the position of the augmented reality apparatus to gain the benefit of being able to visualise the augmented reality image data. The headset helps mitigate this problem because it is easier for the user to place the headset on their head, and the headset helps ensure that the augmented reality apparatus is correctly positioned for the user to visualise augmented reality image data. In existing systems which do not provide a headset for retaining the augmented reality apparatus, the user may place their fingers over the lenses of the augmented reality apparatus when placing the augmented reality apparatus on their head. This can smudge the lenses, negatively affecting the visualisation of the augmented reality image data. The headset retaining the augmented reality apparatus helps mitigate this problem as the user may hold the headset, rather than the augmented reality apparatus, when placing the device on their head.
The headset may be arranged to distribute the weight of the augmented reality apparatus around the head of the user of the device. Existing augmented reality apparatuses are typically supported on the ears and the bridge of the wearer’s nose. In addition, some existing augmented reality apparatuses are supported around the crown of the head. Both arrangements can be uncomfortable for the user, especially over extended periods of time. The headset helps mitigate this problem by distributing the weight of the augmented reality apparatus around the head of the user (e.g. around the top and/or sides of the head of the user, and not just the crown of the user’s head). This increases the comfort for the user. The headset may be arranged to distribute the weight of the augmented reality apparatus around the head of the user of the device by providing at least one strap (or other supporting components) arranged to support the device over the top of the head of the user. The headset may comprise additional supporting arrangements such as straps to support the headset around the nape of the head of the user.
The headset may provide a synergistic advantage with the feature of the device being operable to communicate with a first communication module associated with a first location so as to determine whether the device is in the first location. In particular, because the load is more evenly distributed around the head, the user is less aware of the presence of the augmented reality apparatus and is therefore less aware that they are wearing it. As the user moves through different locations, augmented reality image data for different digital assets is automatically provided. This provides an immersive environment and further helps the user to be less aware that they are wearing the augmented reality apparatus. The two concepts therefore combine to provide the technical advantage of an immersive and lifelike augmented reality environment for the user.
The headset may be a helmet. The helmet may provide a better or more even distribution of the weight of the augmented reality apparatus around the head of the user in a more robust manner. The helmet may also offer a degree of protection to the user. This may be beneficial when a user is moving around an unfamiliar environment or in an unknown or potentially dangerous situation. The helmet may be a light-weight helmet. The helmet may comprise one or more air holes, such as for providing air circulation to the head of the user. The helmet may have a light-weight, skeletal or ribcage construction. Advantageously, this makes the device easier to wear both over extended periods of time and across different age groups.
The device may be operable to communicate wirelessly with the first communication module. The communication may be automatic, i.e. without user input. The communication may be over a wireless local area network. The first communication module may provide a local WiFi hotspot. Other forms of communication over a local wireless network, such as Bluetooth ®, are within the scope of the present disclosure.
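A minimal sketch of one possible presence test follows, assuming the communication module advertises a known identifier (for example a WiFi SSID or Bluetooth beacon ID) and that a received signal strength above a threshold is taken to mean the device is in that module's location. The threshold value and the scan result format are assumptions made purely for illustration.

```python
# Illustrative sketch: deciding presence in a location from the signal strength
# of that location's communication module. The scan format and threshold are
# assumptions, not part of the disclosure.
RSSI_THRESHOLD_DBM = -65   # assumed value; would be tuned per site

def device_in_location(scan_results: dict, module_id: str) -> bool:
    """scan_results maps module identifiers (e.g. SSIDs or beacon IDs) to RSSI in dBm."""
    rssi = scan_results.get(module_id)
    return rssi is not None and rssi >= RSSI_THRESHOLD_DBM

# Example: the first module is seen strongly, a neighbouring module only weakly
print(device_in_location({"module-1": -52, "module-2": -80}, "module-1"))  # True
```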
The device may comprise a removable liner material arranged to, in use, act as a barrier layer between at least part of the head of the user and the device. Significantly, the removable liner material improves the hygiene of the wearable device. The liner material may be removed and replaced so that different users may hygienically wear the device without requiring cleaning of the device itself. The liner material may be placed inside the device. The removable liner material may be a hairnet. The removable liner material may be a paper liner.
The device may comprise a first power unit for powering the augmented reality apparatus. The first power unit may be coupled to the headset. The augmented reality apparatus may comprise a second power unit. The first power unit may be operable to supplement the second power unit. That is, the first power unit may increase the operating life of the augmented reality apparatus. The first power unit may be a battery. The battery may be a rechargeable battery. The first power unit may be removable from the device. Significantly, the device (e.g. the headset) may be used to house additional components for the augmented reality apparatus. This allows the augmented reality apparatus to have more functionality.
The augmented reality apparatus may be operable to provide a calibration mode for calibrating the augmented reality apparatus. By activating the calibration mode, a calibration screen may be displayed to the user which allows the device to be aligned against an external display or image feature, such as a display on a wall of a building. The headset advantageously aids in this calibration process by supporting tilting, horizontal, forwards and backwards movements of the augmented reality apparatus. That is a mechanical connection between the headset and the augmented reality apparatus may enable adjustment of the augmented reality apparatus during the calibration process.
The first digital asset may be a 2D and/or 3D digital model of an object. The first digital asset may be associated with the appearance of the first location. The first digital asset may be provided to the augmented reality apparatus by a server apparatus. This may be in response to the server apparatus determining that the device is in the first location as a result of the communication between the device and the first communication module. The augmented reality image data provided by the augmented reality apparatus may give the appearance of the 3D digital model being animated. The animations performed by the 3D digital model may depend on the location of the device as determined by communication with the first communication module. For example, the 3D digital model may appear to attack or act aggressively towards the user of the device as the user approaches a location of the 3D digital model. By location of the 3D digital model, we mean a position in the first location where the 3D digital model is determined to be located, such that the augmented reality image data makes it appear that the 3D digital model is in that position. The 3D digital model may, for example, be a 3D digital model of an animal.
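As an illustrative assumption of how such location-dependent animation could be realised, the following sketch selects an animation state for the 3D digital model from the user's distance to the model's assumed position; the threshold distances and state names are hypothetical.

```python
# Illustrative sketch: choosing an animation for the 3D model based on the
# user's distance from the model's assumed position in the location.
import math

def animation_for_distance(user_pos, model_pos, near=2.0, mid=5.0) -> str:
    """Distances in metres; thresholds are illustrative assumptions."""
    dx, dy, dz = (u - m for u, m in zip(user_pos, model_pos))
    distance = math.sqrt(dx * dx + dy * dy + dz * dz)
    if distance < near:
        return "attack"      # aggressive behaviour as the user approaches
    if distance < mid:
        return "alert"
    return "idle"

print(animation_for_distance((0.0, 0.0, 1.5), (0.0, 0.0, 0.0)))  # "attack"
```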
The device may be operable to communicate with a second communication module associated with a second location so as to determine whether the device is in the second location. The communication may be automatic, i.e. without user input. When, as a result of the communication, it is determined that the device is in the second location, the augmented reality apparatus may be operable to provide augmented reality image data of a second digital asset. The second digital asset may be associated with the second location. For example, the first location may be a jungle themed location, and as a result the first digital asset may be a 3D digital model of a snake. The second location may be a Jurassic themed location, and as a result the second digital asset may be a 3D digital model of a dinosaur.
The device may be operable to communicate wirelessly with the second communication module. The communication may be over a local area network. The second communication module may provide a local WiFi hotspot. Other forms of communication over a local wireless network, such as Bluetooth ®, are within the scope of the present disclosure.
The device may comprise a camera apparatus operable to capture media data. The camera apparatus may be a component of the augmented reality apparatus. The camera apparatus may be a separate component to the augmented reality apparatus and may be coupled to the headset. Again, the headset is well suited for supporting additional components. The camera apparatus may be a forward facing camera apparatus.
The device may comprise a hand-held user-manipulatable device for triggering the acquisition of media data by the camera apparatus. The hand-held device may be a remote device wirelessly connected to the camera apparatus. The hand-held device may comprise a push button, wherein in response to the user pressing the push button, the hand-held device may be operable to communicate with the camera apparatus to trigger the acquisition of image data. Different user inputs as received by the hand-held device may result in different image acquisition operations. For example, a user pressing the push button once may result in the camera apparatus acquiring a photo. A user pressing the push button twice in quick succession may result in the camera apparatus acquiring video data. The hand-held device may be attached to the user, such as via a cable around the neck of the user. The hand-held device avoids problems with hand and finger gesture recognition and voice recognition. Both these recognition techniques may be difficult to perform in busy, noisy, and crowded environments, and may be difficult for a user with limited experience or training with the device to perform correctly. For example, a number of people speaking voice commands in the environment may result in a camera apparatus being unintentionally triggered to acquire media data.
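The single-press/double-press behaviour described above could, for example, be implemented by briefly deferring the single-press action so that a second press within a short window can upgrade it to video capture. The sketch below is an illustration under stated assumptions; the 0.4 second window and the callback names are not taken from the disclosure.

```python
# Illustrative sketch: distinguishing a single press (photo) from a double
# press (video) on the hand-held trigger.
import threading

class TriggerHandler:
    """Defers the single-press action briefly so a second press can upgrade it to video."""
    DOUBLE_PRESS_WINDOW_S = 0.4   # assumed value

    def __init__(self, take_photo, start_video):
        self._take_photo = take_photo
        self._start_video = start_video
        self._pending = None

    def on_button_press(self) -> None:
        if self._pending is not None and self._pending.is_alive():
            self._pending.cancel()            # second press within the window
            self._pending = None
            self._start_video()
        else:
            # Single press if no follow-up press arrives before the timer fires
            self._pending = threading.Timer(self.DOUBLE_PRESS_WINDOW_S, self._take_photo)
            self._pending.start()
```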
The device may be operable to communicate with a third communication module associated with a third location so as to determine whether the device is in the third location. When, as a result of the communication, it is determined that the device is in the third location, the device may be operable to transfer media data captured by the camera apparatus to a server apparatus.
The device may be operable to automatically (i.e. without user input) transfer media data captured by the camera apparatus to the server apparatus when it is determined that the device is in the third location. The media data may be image data. The image data may be photo and/or video data.
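The automatic transfer could, for instance, be triggered by a location-change handler of the kind sketched below, which uploads any captured media to a server when the device determines it has entered the relevant location. The endpoint, payload fields and use of the requests library are illustrative assumptions.

```python
# Illustrative sketch: automatically transferring captured media to a server
# when the device determines it has entered the upload (third) location.
import requests  # assumed to be available on the device

UPLOAD_LOCATION_ID = "location-3"              # assumed identifier
SERVER_URL = "https://example.invalid/media"   # placeholder endpoint

def on_location_changed(location_id: str, captured_media: list, device_id: str) -> None:
    if location_id != UPLOAD_LOCATION_ID:
        return
    for item in list(captured_media):          # iterate over a copy so removal is safe
        response = requests.post(
            SERVER_URL,
            files={"media": item["bytes"]},
            data={"device_id": device_id, "filename": item["name"]},
            timeout=30,
        )
        if response.ok:
            captured_media.remove(item)         # keep unsent items for a later retry
```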
The device may be operable to communicate wirelessly with the third communication module. The communication may be automatic, i.e. without user input. The communication may be over a local area network. The third communication module may provide a local WiFi hotspot. Other forms of communication over a local wireless network, such as Bluetooth ®, are within the scope of the present disclosure.
The device may be operable to transmit information comprising at least one of a unique identifier for the device and a location of the device to a server apparatus. The unique identifier may be a software or firmware coded certificate. The server apparatus may thus be able to identify the device and the location of the device.
The device may be operable to communicate with a fourth communication module associated with a fourth location so as to determine whether the device is in the fourth location. The communication may be automatic, i.e. without user input. When, as a result of the communication, it is determined that the device is in the fourth location, the device may be operable to communicate with a server apparatus. Communicating with the server apparatus may comprise receiving software updates from the server apparatus. Communicating with the server apparatus may comprise transmitting status information to the server apparatus. Significantly, the device may be managed, monitored, and/or updated when in the fourth location. The communication with the server apparatus may be triggered automatically when the device is in the fourth location. Thus, the fourth location may act as a software maintenance and digital contents management system.
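One hypothetical form of this maintenance check-in is sketched below: the device reports its status and then polls the server for available updates. The URLs, payload fields and the apply_update helper are assumptions made for illustration only.

```python
# Illustrative sketch: maintenance check-in when the device reaches the
# maintenance (fourth) location - report status, then ask for updates.
import requests

MAINTENANCE_SERVER = "https://example.invalid/maintenance"   # placeholder

def maintenance_check_in(device_id: str, battery_level: float, firmware: str) -> None:
    # Transmit status information to the server apparatus
    requests.post(f"{MAINTENANCE_SERVER}/status", json={
        "device_id": device_id,
        "battery": battery_level,
        "firmware": firmware,
    }, timeout=30)

    # Ask whether a software or content update is available for this device
    reply = requests.get(f"{MAINTENANCE_SERVER}/updates/{device_id}", timeout=30)
    if reply.ok and reply.json().get("update_available"):
        apply_update(reply.json()["package_url"])   # hypothetical helper

def apply_update(package_url: str) -> None:
    ...   # download and install; details are outside the scope of this sketch
```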
The device may be operable to communicate wirelessly with the fourth communication module. The communication may be over a local area network. The fourth communication module may provide a local WiFi hotspot. Other forms of communication over a local wireless network, such as Bluetooth ®, are within the scope of the present disclosure.
The augmented reality apparatus may be removable from and/or attachable to the headset. Alternatively, the augmented reality apparatus may be an integral part of the headset.
The augmented reality apparatus may be in the form of augmented reality goggles or augmented reality glasses. These may be retained in the headset.
The headset may comprise a chin strap for securing the headset to the head of the user. The chin strap may be a quick-release and/or an adjustable chin strap.
The device may comprise a voice recognition system, such that the user may control aspects of the augmented reality apparatus by use of voice commands. The voice recognition system may be part of the augmented reality apparatus.
The device may comprise audio speakers, such that the device may provide two or three dimensional spatial audio. The audio speakers may be part of the augmented reality apparatus. The audio provided may be dependent on the location of the device as determined by communication between the device and the first communication module.
The device may comprise a hand and/or finger gesture recognition system, such that a user may control aspects of the augmented reality apparatus through hand and/or finger gestures. The hand and/or finger gesture recognition system may be part of the augmented reality apparatus.
The device may communicate with the first communication module using a communication component. The communication component may be part of the augmented reality apparatus.
The device may comprise a motion recognition system, such that the device may determine the position and orientation of the user wearing the device. The motion recognition system may be a six degree-of-freedom motion recognition system. The motion recognition system may be part of the augmented reality apparatus.
The device may be operable to communicate with a plurality of communication modules each associated with a different location. The communication may be automatic, i.e. without user input. The device may be operable to communicate with the plurality of communication modules so as to determine the location of the device. The augmented reality apparatus may be operable to provide augmented reality image data of a digital asset associated with the location in which the device is determined to be located.
The augmented reality image data of the digital assets may result in the digital asset being mapped onto the room itself. The augmented reality apparatus may comprise a sensor for performing depth/surface mapping for assisting in the process of mapping the digital asset onto the room. The digital asset may also be mapped onto other users such that the other users may be given a particular appearance, e.g. by modifying the appearance of their clothing.
The wearable device may be operable to determine the location of the device according to a different means than communication with communication modules. That is, the wearable device in some implementations is not required to communicate with communication modules to determine the location of the device. For example, the wearable device may recognise a visual cue within a first location to determine that the wearable device is in the first location. This may comprise the wearable device scanning a barcode or QR code or by recognising a particular image within the first location. Each location may have a different visual cue within it such that the device can determine its location as it moves through the tour. The wearable device recognising visual cues to determine the location of the device may supplement the communication with communication modules to determine the location of the device. For example, if the device is unable to communicate due to a communication fault, the device may still be able to determine its location by scanning a visual cue within the location. The device may alternatively or additionally comprise a user input means for the user to manually identify their location. The user input means may be a touch interface, a motion gesture recognition system, or a voice recognition system. In one or more embodiments, the location may be determined by any one or more of the means discussed above or apparent to the skilled person. It is envisaged that communication with the communication module may provide a better, immersive experience, as it may provide a more seamless transition through locations for the user.
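A minimal sketch of the visual-cue fallback follows, assuming an OpenCV QR code detector is available on the device and that each location's QR code encodes its location identifier directly; the combination logic simply prefers the communication-module result when one is available.

```python
# Illustrative sketch only: QR-code fallback for location determination.
from typing import Optional
import cv2  # OpenCV, assumed available on the device

def location_from_visual_cue(frame) -> Optional[str]:
    """Return a location identifier decoded from a QR code in the camera frame, if any."""
    detector = cv2.QRCodeDetector()
    payload, _, _ = detector.detectAndDecode(frame)
    return payload or None

def determine_location(comm_location: Optional[str], camera_frame) -> Optional[str]:
    # Prefer the communication-module result; fall back to the visual cue,
    # e.g. when communication fails.
    if comm_location is not None:
        return comm_location
    return location_from_visual_cue(camera_frame)
```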
According to a second aspect, there may be provided a system comprising: a first communication module associated with a first location; and a wearable device adapted to be retained on a head of a user, the wearable device comprising an augmented reality apparatus, wherein the device is operable to communicate with the first communication module so as to determine whether the device is in the first location, and wherein when, as a result of the communication, it is determined that the device is in the first location, the augmented reality apparatus is operable to provide augmented reality image data of a first digital asset, the first digital asset being associated with the first location.
The wearable device may comprise a headset arranged to retain the augmented reality apparatus.
The wearable device may be a wearable device according to the first aspect of the disclosure.
The system may comprise a plurality of the wearable devices. Each wearable device may be arranged, in use, to be worn by a different user. The plurality of devices may be operable to display augmented reality image data of the same first digital asset when it is determined that the plurality of wearable devices are in the first location.
The plurality of devices may comprise a first group of devices and a master device. The master device may be operable to display at least one of the status and location of one or more of the first group of devices. The master device may be wearable by a host or leader of a touring environment. The master device can thus be used to track the movement of the first group of devices, whose users may be guests or participants in the tour.
Each of the plurality of devices may be operable to display different augmented reality image data of the same first digital asset dependent on a predetermined setting on the device. The predetermined setting on the device may be a category rating identifying a type of content that is appropriate for viewing on the device.
The system may comprise a plurality of communication modules each associated with a different location. The device may be operable to communicate with the plurality of communication modules so as to determine the location of the device. The augmented reality apparatus may be operable to provide augmented reality image data of a digital asset associated with the location in which the device is determined to be located.
According to a third aspect, there may be provided a method performed by a wearable device adapted to be retained on a head of a user, the wearable device comprising an augmented reality apparatus, the method comprising: communicating with a first communication module associated with a first location so as to determine whether the device is in the first location, and wherein when, as a result of the communication, it is determined that the device is in the first location, the method further comprises providing, by the augmented reality apparatus, augmented reality image data of a first digital asset to the user, the first digital asset being associated with the first location.
The wearable device may comprise a headset arranged to retain the augmented reality apparatus.
The wearable device may be a wearable device according to the first aspect of the disclosure.
The method may comprise communicating with a second communication module associated with a second location so as to determine whether the device is in the second location. When, as a result of the communication, it is determined that the device is in the second location, the method further comprises providing, by the augmented reality apparatus, augmented reality image data of a second digital asset to the user. The second digital asset may be associated with the second location.
The method may comprise communicating with a third communication module associated with a third location so as to determine whether the device is in the third location. When, as a result of the communication, it is determined that the device is in the third location, the method may comprise transferring image data captured by a camera apparatus of the device to a server apparatus. The method may comprise automatically transferring image data captured by the camera apparatus to the server apparatus when it is determined that the device is in the third location.
The method may comprise transmitting information comprising at least one of a unique identifier for the device and a location of the device to a server apparatus. The unique identifier may be a software or firmware coded certificate. The information transmitted to the server apparatus may comprise the location of the device. The server apparatus may thus be able to identify the device and the location of the device.
The method may comprise communicating with a fourth communication module associated with a fourth location so as to determine whether the device is in the fourth location. When, as a result of the communication, it is determined that the device is in the fourth location, the method comprises communicating with a server apparatus. Communicating with the server apparatus may comprise receiving software updates from the server apparatus. Communicating with the server apparatus may comprise transmitting status information to the server apparatus. Significantly, the device may be managed, monitored, and/or updated when in the fourth location. The communication with the server apparatus may be triggered automatically when the device is in the fourth location. Thus, the fourth location may act as a software maintenance and digital contents management system.
The method may comprise communicating with a plurality of communication modules each associated with a different location. The communicating with the plurality of communication modules may be so as to determine the location of the device. The augmented reality apparatus may be operable to provide augmented reality image data of a digital asset associated with the location in which the device is determined to be located.
The method may be or may be part of a touring method. The touring method may enable the user to experience different augmented reality image data as they move through a touring environment. The touring environment may be for educational and/or entertainment purposes.
According to a fourth aspect, there is provided a touring system comprising the system or device of any other aspect. The touring system may enable the user to experience different augmented reality image data as they move through a touring environment. The touring environment may be for educational and/or entertainment purposes.
Although a few preferred embodiments of the present invention have been shown and described, it will be appreciated by those skilled in the art that various changes and modifications might be made without departing from the scope of the invention, as defined in the appended claims.
Brief Description of the Drawings
Examples of the present disclosure will now be described with reference to the accompanying drawings, in which:
Figure 1a shows a view of a wearable device in accordance with aspects of the present disclosure;
Figures 1b to 1d show views of another wearable device in accordance with aspects of the present disclosure;
Figure 2 shows a view of a wearable device communicating with a first communication module associated with a first location in accordance with aspects of the present disclosure;
Figure 3 shows an example arrangement of a plurality of communication modules associated with a plurality of locations;
Figure 4 shows an example system according to aspects of the disclosure;
Figure 5 shows an example system according to aspects of the disclosure; and
Figure 6 shows an example method according to aspects of the disclosure.
Detailed Description
Referring to Figure 1a there is shown a wearable device according to aspects of the present disclosure. The wearable device 100 comprises a headset 101a and an augmented reality apparatus 103. The headset 101a retains the augmented reality apparatus 103. The augmented reality apparatus 103 comprises augmented reality glasses 105. The augmented reality apparatus 103 in this example is the existing HOLOLENS ® by Microsoft Corporation. The HOLOLENS ® is designed to be supported on the crown of the head of the user through use of a relatively narrow band 107. The wearable device 100 of Figure 1a improves on this by providing breathable mesh straps 113 that additionally support the augmented reality apparatus on the top of the head of the user. Further, the headset 101a comprises an adjustment arrangement 115 for enabling the position and/or the tightness of the headset 101a around the head of the user to be adjusted. The adjustment arrangement 115 is located on the nape region of the head of the user. The adjustment arrangement 115 comprises a rotatable knob such that the position and/or the tightness of the headset 101a may be adjusted by rotating the knob. The adjustment arrangement 115 is connected to the remainder of the headset 101a by at least one strap 117.
Referring to Figures 1b-1d, there is shown another wearable device 100 according to aspects of the present disclosure. The wearable device 100 comprises a headset 101b in the form of a helmet 101b and an augmented reality apparatus 103. The augmented reality apparatus 103 comprises augmented reality glasses 105 supported on a relatively narrow band 107. The augmented reality apparatus 103 in this example is the existing HOLOLENS ® by Microsoft Corporation. The HOLOLENS ® is designed to be supported on the crown of the head of the user. The wearable device 100 of Figures 1b-1d improves on this by providing the helmet 101b to retain the augmented reality apparatus 103. In this example the helmet 101b is coupled to the band 107 of the augmented reality apparatus 103. In this way, the augmented reality apparatus 103 is not only supported on the crown of the head of the user, but also supported around and on top of the head of the user. The helmet 101b is a lightweight helmet 101b with an adjustable chin strap 109.
While the example of Figures 1a-1d and the other examples described below include the advantageous arrangement of the headset 101a,b, the headset 101a,b is not required in all aspects of the present disclosure. That is, the device 100 without the headset 101a,b is still able to achieve beneficial technical effects of the present disclosure. For example, the wearable device 100 in some aspects may be augmented reality goggles or glasses without a supportive headset arrangement.
The device 100 comprises a removable liner material (not shown). The removable liner material is positioned within the headset 101a,b such that when the headset 101a,b is worn by a user, the liner material is disposed between the top of the head of the user and the internal surface of the headset 101a,b. The liner material thus acts as a barrier layer between at least part of the head of a user of the device 100 and the headset 101a,b.
The headset 101a,b comprises a first power unit 111 for powering the augmented reality apparatus 103. The first power unit 111 is a rechargeable battery 111 positioned in a docking station located on the rear of the headset 101a,b. The augmented reality apparatus 103 has an internal rechargeable power unit, which may have limited capacity. The rechargeable battery 111 thus increases the operating life of the device 100 meaning that it can be used for longer without requiring recharging. During use, a depleted battery 111 may be quickly replaced with a fully charged battery 111. This could happen while a tour is on-going or between tours. This means that the device 100 has limited down-time.
The augmented reality apparatus 103 comprises a voice recognition system (not shown), audio speakers (not shown), a hand and/or finger gesture recognition system (not shown), a motion recognition system (not shown), and a communication component (not shown). The augmented reality apparatus 103 further comprises a front facing camera apparatus (not shown) to enable the user of the device 100 to record videos and/or take photos of the augmented reality scene as viewed by the user.
Referring to Figure 2, there is shown a view of the wearable device 100, a first communication module 201, and a first location 301. The wearable device 100 is within the first location 301. The first location 301 is a room 301. The room 301 has the first communication module 201 located within it. The first communication module 201 is associated with the first location. The device 100 is operable to communicate with the first communication module 201 associated with the first location 301 so as to determine whether the device 100 is in the first location 301. This may be due to the first communication module 201 having a limited communication range such that the device 100 is only able to communicate with the first communication module 201 when it is within the first location 301. Other techniques for determining whether the device 100 is within the first location 301 as a result of the communication between the first communication module 201 and the device 100 are within the scope of the present disclosure.
When, as a result of the communication, it is determined that the device 100 is in the first location 301, the augmented reality apparatus 103 is operable to provide augmented reality image data of a first digital asset to the user. The first digital asset is associated with the first location 301. For example, the first location 301 may be a room having a certain theme, such as a Jurassic theme. The first digital asset associated with the first location 301 may be a 3D model of a dinosaur. In this way, when a user of the wearable device 100 is in the first location 301, the augmented reality apparatus 103 may provide augmented reality image data of the 3D model of the dinosaur to the user such that the scene viewed by the user is augmented by the appearance of a dinosaur.
Referring to Figure 3, there is shown an example arrangement such as for a museum or theme park with multiple locations. In particular, there is shown a building environment 300 with multiple rooms 301-313. The rooms 301-311 each include a communication module 201-211 associated with the respective room. In Figure 3, the dotted arrows represent routes that may be taken by a user wearing the device 100 (Figures 1a-1d) through the building environment 300. As the user moves through the building environment 300, the device 100 will communicate with the communication modules 201-211 such that the location of the user may be determined. In this way, the augmented reality apparatus 103 of the device 100 may provide augmented reality image data of different digital assets dependent on the location of the user. The effect of this is that the user is presented with different digital assets in an automatic and seamless manner as they transition through the rooms 301-313. A user input is not required to change the digital assets, and thus an immersive experience is provided.
In Figure 3, the user enters the building environment 300 and moves into room 301. Room 301 is a preparation room 301 where the user is provided with a wearable device 100 (Figure 1) that they may wear through the rest of the building environment 300.
The preparation room 301 may include a selection of devices 100 in different sizes, such as in small, medium, and large sizes. In this way, a user may select a device 100 based on their head size, and may wear the device 100 with minimal adjustment or difficulty.
The preparation room 301 may also include a selection of devices 100 with different category ratings. The different category ratings mean that the devices 100 are operable to display different augmented reality image data of the same first digital asset dependent on the category rating of the device 100.
A first category rating for device 100 may only allow child-friendly augmented reality image data to be viewed. For example, if a first digital asset is an animated 3D model of a dangerous creature such as a lion or tiger, the device 100 with the first category rating would be operable to provide augmented reality image data that does not give the impression that the dangerous creature is acting in an aggressive or threatening way towards the user. The first digital asset would not appear to attack the user of the device 100.
A second category rating for device 100 may allow teenage/young-adult friendly augmented reality image data to be viewed. For example, the device 100 with the second category rating may be operable to provide augmented reality image data that makes the dangerous creature appear aggressive to the user of the device 100, but not attack them.
A third category rating for device 100 may allow adult level augmented reality image data to be viewed. For example, the device 100 with the third category rating may be operable to provide augmented reality image data that makes the dangerous creature appear to attack or threaten the user.
A fourth category rating that may supplement any of the first to third category ratings is a premium category rating. The premium category rating enables the user of the device 100 to view extra augmented reality image data when in the different locations, and may also enable the user to have additional experiences in additional rooms of the building environment.
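The category ratings described above lend themselves to a simple lookup from rating to the permitted behaviour of a digital asset, as in the sketch below; the rating names, behaviour labels and premium flag are assumptions used purely to illustrate the idea.

```python
# Illustrative sketch: selecting which variant of the same digital asset's
# behaviour a device shows, based on its category rating.
RATING_BEHAVIOUR = {
    "child": "calm",        # creature never appears aggressive
    "teen":  "aggressive",  # appears aggressive but does not attack
    "adult": "attack",      # may appear to attack or threaten the user
}

def behaviour_for_device(category_rating: str, premium: bool) -> dict:
    behaviour = {"animation": RATING_BEHAVIOUR.get(category_rating, "calm")}
    if premium:
        behaviour["extra_content"] = True   # premium rating unlocks extra image data
    return behaviour

print(behaviour_for_device("teen", premium=True))
# {'animation': 'aggressive', 'extra_content': True}
```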
Before wearing the device 100, a disposable barrier layer may be inserted into the headset 101a,b so as to provide a more hygienic environment. While in the preparation room 301 a calibration procedure may also be performed for calibrating the augmented reality apparatus 103 (Figure 1). This procedure acts to enable the user to view the augmented reality image data at the optimum viewing position. The preparation room 301 comprises a passive or active wall mounted display which the user may use to calibrate the augmented reality apparatus 103. In particular, by activating a calibration mode on the augmented reality apparatus 103, a calibration screen is displayed to the user which allows the device 100 to be aligned against the wall mounted display. The headset 101a,b advantageously aids in this calibration process by supporting (e.g. allowing/facilitating) tilting, horizontal, forwards and backwards movements of the augmented reality apparatus 103.
From the preparation room 301, the user enters a second room 303 with a second communication module 203. The device 100 communicates with the second communication module 203 so as to determine that the device 100 is in the second room 303. As a result, the augmented reality apparatus 103 provides, to the user, augmented reality image data of a second digital asset that is associated with the second room 303. The communication with the second communication module 203 may also be used to identify the device 100. The device 100 may transmit a unique identifier to the second communication module 203. In this way, the movement of the particular device 100 through the building environment 300 may be tracked and monitored.
From the second room 303, the user enters a third room 305 with a third communication module 205. The device 100 communicates with the third communication module 205 so as to determine that the device 100 is in the third room 305. As a result, the augmented reality apparatus 103 provides augmented reality image data of a third digital asset to the user that is associated with the third room 305. The communication with the third communication module 205 may also be used to identify the device 100. In particular, the device 100 may transmit a unique identifier to the third communication module 205.
From the third room 305, the user has the option of entering a fourth room 307 with a fourth communication module 207. The fourth room 307 is a premium room for users with wearable devices 100 that have the premium category rating. The device 100 communicates with the fourth communication module 207 so as to determine that the device 100 is in the fourth room 307. If the device 100 has the premium category rating, the augmented reality apparatus 103 provides augmented reality image data of a fourth digital asset to the user. The fourth digital asset is associated with the fourth room 307. If the device 100 does not have the premium category rating, it may not provide augmented reality image data, or may display an indication to the user that they need to upgrade to a premium category device 100 in order to view the content of the fourth room 307. In another example, only non-premium (standard) augmented reality image data may be provided.
From the third room 305, the user enters a fifth room 309 with a fifth communication module 209. The fifth room 309 is a de-rigging room where the user may remove their device 100. The device 100 communicates with the fifth communication module 209 to determine that the device 100 is in the fifth room 309. As a result, the device 100 transfers any image data captured by a camera apparatus of the device 100 to a server apparatus. In addition, the augmented reality apparatus 103 may provide augmented reality image data to the user indicating that the device 100 should be removed and placed in a certain location in the fifth room 309. The image data taken by the device 100 may include photo or video clips taken by the user during their passage through rooms 301-307. The fifth room 309 is also the location where the devices 100 may be checked for potential problems or faults.
From the fifth room 309, the devices 100 may be transferred to a sixth room 311 with a sixth communication module 211. The sixth room 311 is a storage room where the devices 100 may be stored when not in use. The device 100 is operable to communicate with the sixth communication module 211 associated with the sixth room 311 so as to determine that the device 100 is in the sixth room 311. As a result, the device 100 is operable to communicate with a server apparatus, such as to receive software updates and transmit status information to the server apparatus. When in the sixth room 311, the device 100 may be uniquely identified by a software or firmware coded certificate.
From the fifth room 309, the user may enter a seventh room 313. The seventh room 313 is a shopping environment where they may purchase copies of the image data taken during their passage through rooms 301-307 and uploaded to the server apparatus in the fifth room 309.
It will be appreciated that the present disclosure is not limited to any particular number of locations 301-313. In addition, the locations may not be rooms but may be different environments. These environments may be external spaces, or a mixture of internal and external spaces. The device may communicate with the communication modules of more than one location at a time, such as when transitioning between locations 301-313.
In some examples, a plurality of users each wearing a device 100 (Figure 1) may pass through the building environment 300 at the same time. All of the devices 100 or a subset of the devices 100 may be operable to display augmented reality image data of the same digital assets. This allows groups of users to experience the same content. In addition, each user may opt to view their own digital asset, or be part of a smaller group such as a family. This allows groups of users to see the same digital asset but from their own point of view.
One of the devices 100 may be a master device 100 that is operable to display at least one of the status and location of one or more of the other devices 100 passing through the building environment 300. The master device 100 may be worn by a tour guide such that they may monitor the progress of users through a tour of the building environment 300. The master device 100 also enables the tour guide to experience the same augmented reality experience as the users of the tour.
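One possible way for a master device to aggregate such information is sketched below, keeping only the latest status and last-known location per device; the report fields and class names are assumptions rather than features of any particular implementation.

```python
# Illustrative sketch: a master device aggregating status and last-known
# location reports from the devices in its group.
from dataclasses import dataclass
from typing import Dict

@dataclass
class DeviceReport:
    device_id: str
    location_id: str
    battery: float       # 0.0 to 1.0

class MasterView:
    def __init__(self):
        self._reports: Dict[str, DeviceReport] = {}

    def on_report(self, report: DeviceReport) -> None:
        self._reports[report.device_id] = report   # keep the latest report per device

    def summary(self) -> str:
        lines = [f"{r.device_id}: {r.location_id} ({r.battery:.0%} battery)"
                 for r in self._reports.values()]
        return "\n".join(lines)

view = MasterView()
view.on_report(DeviceReport("headset-07", "room-303", 0.82))
print(view.summary())   # headset-07: room-303 (82% battery)
```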
Referring to Figure 4, there is shown a system in accordance with the second aspect of the invention.
The system comprises a plurality of wearable devices 100 (Figure 1) and communication modules 201 (Figure 2), as above.
The system comprises a server apparatus 401 that is a local client server apparatus 401 installed at or associated with the local site of the client that provides the environment through which the users wearing the devices 100 may pass. This may be, for example, a building environment 300 (Figure 3) through which a user passes as part of an interactive tour provided by the client. The server apparatus 401 is able to receive information from the devices 100 and transmit information to the devices 100 via the communication modules 201.
The server apparatus 401 comprises or is associated with a client management database 403. The client management database 403 is used to collect information on users wearing the devices 100. This information may include the time spent viewing specific features in the different locations through which the user passes. This information may also include how many photos or videos were taken by the user, and any information on the use of the different categories of devices 100.
The server apparatus 401 comprises or is associated with an ecommerce system 407. The ecommerce system 407 may be used by the local client to manage bookings, purchases and groups. The ecommerce system 407 may be managed and installed by the client. The ecommerce system 407 may interact with the local server apparatus 401, such as to manage group bookings, the categorisation of devices 100, and media data (such as images and videos) purchases by users of the devices 100.
The server apparatus 401 comprises or is associated with a digital assets database 409. The digital assets database 409 comprises digital assets that may be provided to the devices 100. For example, in response to determining that a device 100 is in a particular location, the server apparatus 401 may obtain a particular digital asset from the digital asset database 409 and provide the obtained digital asset to the device 100.
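A minimal sketch of this server-side flow follows, under the assumption that the digital assets database can be queried by location identifier and that the retrieved asset is returned for delivery to the reporting device; the names and storage details are illustrative only.

```python
# Illustrative sketch: server-side lookup of the asset associated with the
# location reported by a device, from a hypothetical digital assets database.
from typing import Optional

class DigitalAssetsDatabase:
    def __init__(self, assets_by_location: dict):
        self._assets = assets_by_location

    def asset_for(self, location_id: str) -> Optional[bytes]:
        return self._assets.get(location_id)

def handle_location_report(device_id: str, location_id: str,
                           db: DigitalAssetsDatabase) -> Optional[bytes]:
    """Return the digital asset to push to the reporting device, if any."""
    asset = db.asset_for(location_id)
    # A fuller system might also log the report for analytics and tracking
    return asset

db = DigitalAssetsDatabase({"room-303": b"<3D model of a snake>"})
print(handle_location_report("headset-07", "room-303", db) is not None)  # True
```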
The server apparatus 401 comprises or is associated with a media database 413. The media database 413 may be used to store image data, such as photos and videos, captured by users of the devices 100. The media may be provided to the ecommerce system 407 such that a user may purchase the captured media data.
The server apparatus 401 comprises or is associated with a client analytics database 411. This provides the server apparatus 401 with access to analytical data. The server apparatus 401 may be connected to the internet 415.
Referring to Figure 5, there is shown a system in accordance with the second aspect of the invention. In the system of Figure 5, multiple local client servers 401a-401c (Figure 4) are connected to a central server 501. In this way, the central server 501 may control, manage, and/or monitor the local client servers 401a-401c. Each local client server 401a-401c may be associated with a different client site, such as a different theme park, museum, or zoo.
The central server 501 provides dashboard analytics, allowing the local client servers 401a-401c to access real-time reports on the devices 100 within the local client server 401a-401c environment. This central server 501 enables the identification of problems and faults such as within the local client servers 401a-401c, devices 100, communication modules 201, ecommerce systems 407, and digital assets database 409. The central server 501 provides for content management. The central server 501 enables the digital assets and software for the local clients to be updated remotely and automatically. The central server 501 further provides for licence management, and the certification and management of digital contents and equipment at the local client sites.
The central server 501 comprises or is associated with a client management database 503 for storing client specific information. A digital assets database 507 is also provided for managing and maintaining the digital assets used by the local client sites. The client analytics database 505 stores local client site activity information such that clients have online access to reports and dashboards. The central server 501 also manages the commercial contract arrangements of each local client site. For example, a local client site may have a licence to operate from a certain start date through to a contract end date, with a pre-agreed number of devices 100 and digital assets. The central server 501 will monitor the local client site to ensure that the licence terms are complied with.
Referring to Figure 6, there is shown an example method according to the third aspect of the disclosure.
The method is performed by a wearable device 100 (Figure 1). In step 601 of the method, the wearable device 100 communicates with a first communication module associated with a first location so as to determine whether the device 100 is in the first location. When, as a result of the communication, it is determined that the device 100 is in the first location, step 603 of the method is performed whereby the augmented reality apparatus 103 provides augmented reality image data of a first digital asset, the first digital asset being associated with the first location.
It will be appreciated that reference to “first”, “second”, “third” locations, etc. does not necessarily refer to a particular sequence of locations. That is, the user may visit the third location, for example, before visiting the first or second location. The first, second, third, etc. locations may all be different locations or some or all of the locations may be the same. For example, the different locations may be different regions within one room or internal or outside environment.
At least some of the example embodiments described herein may be constructed, partially or wholly, using dedicated special-purpose hardware. Terms such as ‘component’, ‘module’ or ‘unit’ used herein may include, but are not limited to, a hardware device, such as circuitry in the form of discrete or integrated components, a Field Programmable Gate Array (FPGA) or Application Specific Integrated Circuit (ASIC), which performs certain tasks or provides the associated functionality. In some embodiments, the described elements may be configured to reside on a tangible, persistent, addressable storage medium and may be configured to execute on one or more processors. These functional elements may in some embodiments include, by way of example, components, such as software components, object-oriented software components, class components and task components, processes, functions, attributes, procedures, subroutines, segments of program code, drivers, firmware, microcode, circuitry, data, databases, data structures, tables, arrays, and variables. Although the example embodiments have been described with reference to the components, modules and units discussed herein, such functional elements may be combined into fewer elements or separated into additional elements. Various combinations of optional features have been described herein, and it will be appreciated that described features may be combined in any suitable combination. In particular, the features of any one example embodiment may be combined with features of any other embodiment, as appropriate, except where such combinations are mutually exclusive. Throughout this specification, the term “comprising” or “comprises” means including the component(s) specified but not to the exclusion of the presence of others.
The described and illustrated embodiments are to be considered as illustrative and not restrictive in character, it being understood that only the preferred embodiments have been shown and described and that all changes and modifications that come within the scope of the inventions as defined in the claims are desired to be protected. It should be understood that while the use of words such as “preferable”, “preferably”, “preferred” or “more preferred” in the description suggest that a feature so described may be desirable, it may nevertheless not be necessary and embodiments lacking such a feature may be contemplated as within the scope of the invention as defined in the appended claims. In relation to the claims, it is intended that when words such as “a,” “an,” “at least one,” or “at least one portion” are used to preface a feature there is no intention to limit the claim to only one such feature unless specifically stated to the contrary in the claim. When the language “at least a portion” and/or “a portion” is used the item can include a portion and/or the entire item unless specifically stated to the contrary.
In summary, there is provided a wearable device 100 adapted to be retained on a head of a user, the wearable device comprising an augmented reality apparatus 103. The device 100 communicates with a first communication module 201 associated with a first location so as to determine whether the device 100 is in the first location. When, as a result of the communication, it is determined that the device 100 is in the first location, the augmented reality apparatus 103 provides augmented reality image data of a first digital asset, the first digital asset being associated with the first location. A plurality of communication modules 201 may be provided in different locations such that different digital assets are provided dependent on the location of the device 100.
Attention is directed to all papers and documents which are filed concurrently with or previous to this specification in connection with this application and which are open to public inspection with this specification, and the contents of all such papers and documents are incorporated herein by reference.
All of the features disclosed in this specification (including any accompanying claims, abstract and drawings), and/or all of the steps of any method or process so disclosed, may be combined in any combination, except combinations where at least some of such features and/or steps are mutually exclusive.
Each feature disclosed in this specification (including any accompanying claims, abstract and drawings) may be replaced by alternative features serving the same, equivalent or similar purpose, unless expressly stated otherwise. Thus, unless expressly stated otherwise, each feature disclosed is one example only of a generic series of equivalent or similar features.
The invention is not restricted to the details of the foregoing embodiment(s). The invention extends to any novel one, or any novel combination, of the features disclosed in this specification (including any accompanying claims, abstract and drawings), or to any novel one, or any novel combination, of the steps of any method or process so disclosed.
Claims (22)
1. A wearable device adapted to be retained on a head of a user, the wearable device comprising an augmented reality apparatus wherein the device is operable to communicate with a first communication module associated with a first location so as to determine whether the device is in the first location, and wherein when, as a result of the communication, it is determined that the device is in the first location, the augmented reality apparatus is operable to provide augmented reality image data of a first digital asset, the first digital asset being associated with the first location.
2. A device as claimed in claim 1, wherein the device comprises a removable liner material arranged to, in use, act as a barrier layer between at least part of the head of the user and the device.
3. A device as claimed in claim 1 or 2, wherein the device is arranged to distribute the weight of the augmented reality apparatus around the head of the user.
4. A device as claimed in any preceding claim, wherein the device comprises a power unit for powering the augmented reality apparatus.
5. A device as claimed in any preceding claim, wherein the augmented reality apparatus is operable to provide a calibration mode for calibrating the augmented reality apparatus.
6. A device as claimed in any preceding claim, wherein the first digital asset is a 3D digital model of an object, and preferably wherein the first digital asset is associated with the appearance of the first location.
7. A device as claimed in any preceding claim, wherein the device is operable to communicate with a second communication module associated with a second location so as to determine whether the device is in the second location, and wherein when, as a result of the communication, it is determined that the device is in the second location, the augmented reality apparatus is operable to provide augmented reality image data of a second digital asset, the second digital asset being associated with the second location.
8. A device as claimed in any preceding claim, wherein the device comprises a camera apparatus operable to capture image data, wherein the device is operable to communicate with a third communication module associated with a third location so as to determine whether the device is in the third location, and wherein when, as a result of the communication, it is determined that the device is in the third location, the device is operable to transfer image data captured by the camera apparatus to a server apparatus.
9. A device as claimed in any preceding claim, wherein the device is operable to transmit information comprising at least one of a unique identifier for the device and a location of the device to a server apparatus.
10. A device as claimed in any preceding claim, wherein the device is operable to communicate with a fourth communication module associated with a fourth location so as to determine whether the device is in the fourth location, and wherein, when the device is in the fourth location, the device is operable to communicate with a server apparatus.
11. A device as claimed in any preceding claim, wherein the device comprises a camera apparatus and a hand-held user-manipulatable device for triggering the acquisition of media data by the camera apparatus.
12. A device as claimed in any preceding claim, wherein the device is further operable to recognise a visual cue within the first location so as to determine that the device is within the first location.
13. A device as claimed in any preceding claim, wherein the device comprises a headset arranged to retain the augmented reality apparatus.
14. A device as claimed in claim 13, wherein the headset is a helmet.
15. A device as claimed in any preceding claim, wherein the device is suitable for use in a touring system.
16. A system comprising:
a first communication module associated with a first location; and
a wearable device adapted to be retained on a head of a user, the wearable device comprising an augmented reality apparatus,
wherein the device is operable to communicate with the first communication module so as to determine whether the device is in the first location, and wherein when, as a result of the communication, it is determined that the device is in the first location, the augmented reality apparatus is operable to provide augmented reality image data of a first digital asset, the first digital asset being associated with the first location.
17. A system as claimed in claim 16, further comprising a plurality of the wearable devices each arranged, in use, to be worn by a different user, wherein the plurality of devices are operable to display augmented reality image data of the same first digital asset when it is determined that the plurality of the devices are in the first location.
18. A system as claimed in claim 17, wherein the plurality of devices comprise a first group of devices and a master device, wherein the master device is operable to display at least one of the status and location of one or more of the first group of devices.
19. A system as claimed in claim 17 or claim 18, wherein each of the plurality of devices is operable to display different augmented reality image data of the same first digital asset dependent on a predetermined setting on the device.
20. A touring system comprising the system or device of any preceding claim.
21. A method performed by a wearable device adapted to be retained on a head of a user, the wearable device comprising an augmented reality apparatus, the method comprising:
communicating with a first communication module associated with a first location so as to determine whether the device is in the first location, and wherein when, as a result of the communication, it is determined that the device is in the first location, the method further comprises providing, by the augmented reality apparatus, augmented reality image data of a first digital asset, the first digital asset being associated with the first location.
22. A method of touring through a touring environment using the method of claim 21.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
GB1715486.5A GB2566734A (en) | 2017-09-25 | 2017-09-25 | Wearable device, system and method |
Publications (2)
Publication Number | Publication Date |
---|---|
GB201715486D0 GB201715486D0 (en) | 2017-11-08 |
GB2566734A (en) | 2019-03-27 |
Family
ID=60244427
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
GB1715486.5A Withdrawn GB2566734A (en) | 2017-09-25 | 2017-09-25 | Wearable device, system and method |
Country Status (1)
Country | Link |
---|---|
GB (1) | GB2566734A (en) |
Patent Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH0767056A (en) * | 1993-06-15 | 1995-03-10 | Olympus Optical Co Ltd | Head mount display system |
US20090013052A1 (en) * | 1998-12-18 | 2009-01-08 | Microsoft Corporation | Automated selection of appropriate information based on a computer user's context |
WO2007020591A2 (en) * | 2005-08-15 | 2007-02-22 | Koninklijke Philips Electronics N.V. | System, apparatus, and method for augmented reality glasses for end-user programming |
US20150348591A1 (en) * | 2010-08-26 | 2015-12-03 | Blast Motion Inc. | Sensor and media event detection system |
US20130083062A1 (en) * | 2011-09-30 | 2013-04-04 | Kevin A. Geisner | Personal a/v system with context relevant information |
WO2015103623A1 (en) * | 2014-01-06 | 2015-07-09 | Qualcomm Incorporated | Calibration of augmented reality (ar) optical see-through display using shape-based alignment |
Also Published As
Publication number | Publication date |
---|---|
GB201715486D0 (en) | 2017-11-08 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP6382468B1 (en) | Movie distribution system, movie distribution method, and movie distribution program for distributing movie including animation of character object generated based on movement of actor | |
JP6316387B2 (en) | Wide-area simultaneous remote digital presentation world | |
US10341612B2 (en) | Method for providing virtual space, and system for executing the method | |
JP6420930B1 (en) | Movie distribution system, movie distribution method, and movie distribution program for distributing movie including animation of character object generated based on movement of actor | |
US10455184B2 (en) | Display device and information processing terminal device | |
KR102331780B1 (en) | Privacy-Sensitive Consumer Cameras Coupled to Augmented Reality Systems | |
WO2018100800A1 (en) | Information processing device, information processing method, and computer program | |
US20190073830A1 (en) | Program for providing virtual space by head mount display, method and information processing apparatus for executing the program | |
WO2019216146A1 (en) | Moving picture delivery system for delivering moving picture including animation of character object generated based on motions of actor, moving picture delivery method, and moving picture delivery program | |
US20190005731A1 (en) | Program executed on computer for providing virtual space, information processing apparatus, and method of providing virtual space | |
US20180374275A1 (en) | Information processing method and apparatus, and program for executing the information processing method on computer | |
JP2018163461A (en) | Information processing apparatus, information processing method, and program | |
US20190079298A1 (en) | Method executed on computer for providing contents in transportation means, program for executing the method on computer, contents providing apparatus, and contents providing system | |
JPWO2017064926A1 (en) | Information processing apparatus and information processing method | |
JP6596452B2 (en) | Display device, display method and display program thereof, and entertainment facility | |
WO2022091832A1 (en) | Information processing device, information processing system, information processing method, and information processing terminal | |
JP6919568B2 (en) | Information terminal device and its control method, information processing device and its control method, and computer program | |
GB2566734A (en) | Wearable device, system and method | |
JP6498832B1 (en) | Video distribution system that distributes video including messages from viewing users | |
JP2019075805A (en) | Computer-implemented method for providing content in mobile means, program for causing computer to execute the method, content providing device, and content providing system | |
JP6937803B2 (en) | Distribution A video distribution system, video distribution method, and video distribution program that delivers live video including animation of character objects generated based on the movement of the user. | |
JP6431242B1 (en) | Video distribution system that distributes video including messages from viewing users | |
JP2019198057A (en) | Moving image distribution system, moving image distribution method and moving image distribution program distributing moving image including animation of character object generated based on actor movement | |
JP2019198054A (en) | Video distribution system for distributing video including animation of character object generated on the basis of actor movement and video distribution program | |
TR201701744A2 (en) | ENTERTAINMENT SYSTEM WITH IMAGE TRANSFER WITH DIFFERENT ANGLES |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WAP | Application withdrawn, taken to be withdrawn or refused ** after publication under section 16(1) |