GB2619551A - Computer implemented method and system - Google Patents

Computer implemented method and system

Info

Publication number
GB2619551A
Authority
GB
United Kingdom
Prior art keywords
real-world item
extended reality
notification
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
GB2208501.3A
Other versions
GB202208501D0 (en)
Inventor
Federico Quijada Leyton Pedro
Erwan Damien Uberti David
Preston Stemple Lloyd
Giuseppe Visciglia Aron
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sony Interactive Entertainment Europe Ltd
Original Assignee
Sony Interactive Entertainment Europe Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sony Interactive Entertainment Europe Ltd filed Critical Sony Interactive Entertainment Europe Ltd
Priority to GB2208501.3A priority Critical patent/GB2619551A/en
Publication of GB202208501D0 publication Critical patent/GB202208501D0/en
Publication of GB2619551A publication Critical patent/GB2619551A/en
Pending legal-status Critical Current


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20Input arrangements for video game devices
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/50Controlling the output signals based on the game progress
    • A63F13/53Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/60Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor
    • A63F13/65Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor automatically by game devices or servers from real world data, e.g. measurement in live racing competition
    • A63F13/655Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor automatically by game devices or servers from real world data, e.g. measurement in live racing competition by importing photos, e.g. of the player
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00Manipulating 3D models or images for computer graphics
    • G06T19/006Mixed reality
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/60Methods for processing data by generating or executing the game program
    • A63F2300/66Methods for processing data by generating or executing the game program for rendering three dimensional images
    • A63F2300/6653Methods for processing data by generating or executing the game program for rendering three dimensional images for altering the visibility of an object, e.g. preventing the occlusion of an object, partially hiding an object

Abstract

A method, device and system for representing tracking of a real-world item in extended reality (XR). If tracking of the real-world item is lost, a notification is provided to a user 102 via an extended reality display. The notification may be an XR model of the real-world item corresponding to its last known location and orientation in 3D space. The notification may be semi-transparent and/or may be animated when the item is first detected. Tracking may be conducted using object recognition, or the real-world item may carry XR marker tags. A camera 106 transmits video of the tracked item to a computing device 104, which sends notifications to a display. The display may be worn by the user 102, for example a virtual or augmented reality headset or glasses. In some embodiments the real-world item may be a tool 108 such as a welding torch or a scalpel. The XR device may provide visual cues such as arrows, curves or other indicators to assist the user in completing a task, e.g. manufacturing an object 110 or operating on a patient 110. In other embodiments the real-world item may be a video game controller (122a, 122b, fig. 1b).

Description

COMPUTER IMPLEMENTED METHOD AND SYSTEM
FIELD
The present disclosure relates to notification of loss of tracking of an item in an extended reality context. More particularly, the present disclosure provides a notification based on the lost item in the extended reality context.
BACKGROUND
Extended Reality (including virtual, augmented and assisted realities, among others) has shown itself to be a useful technology in many different technological and sociological areas. Its use in assisting lifesaving surgeries via augmented reality-assisted surgery (ARAS) has led to improved patient outcomes. Other uses include assistance in the training of students, enabling safer environments while still allowing students to investigate their studies in greater depth, thereby resulting in a deeper level of understanding of the topic at hand. Another use is onboarding new employees by showing them the ins and outs of their new roles, their new spaces, and the new processes they will be encountering. An extended reality environment can comprise all the elements perceived by a user enhanced or replaced (and even entirely synthesized, in a virtual reality context) by computer-generated information such as visual models, audio, and other sensory modalities. The experience is provided to the user in a seamless manner such that they perceive any extended reality objects (visual, auditory, etc.) without being distracted or disoriented.
The ability of extended reality to display all manner of information to the user must be balanced, however, as information overload can be a burden on the user.
Tracking the location (and optionally the orientation) of real-world objects is another aspect of extended reality that allows said objects to be represented in an extended reality experience. Such tracking allows the user to interact with their extended reality experience using real-world items, allows the extended reality experience to enhance the real-world object, or both, or some other combination.
SUMMARY OF THE INVENTION
According to a first aspect, the present disclosure proposes a computer implemented method for representing internal state relating to tracking of a real-world item in an extended reality environment, the method comprising the steps: tracking a location of the real-world item; detecting that tracking of the real-world item has been lost; determining a notification based on the detection that the tracking of the real-world item has been lost, and transmitting the notification or data indicative of the notification to an extended reality display device for display to a user.
Advantageously, providing the notification to a user through the context of the extended reality display device (and/or the extended reality environment) reduces the distracting nature of such a notification, thereby enabling the user both to continue the task they are undertaking involving the extended reality experience unhindered and to receive an update as to the internal state of the extended reality experience as it may affect the task.
It can be seen that the technical state and/or condition of an underlying extended reality experience and/or system providing said experience are received, processed, and outputted.
Optionally, the notification is based on the real-world item. Optionally, the notification is an extended reality model. Preferably, the notification is an extended reality model based on the real-world item. More preferably, the notification is a semi-transparent extended reality model substantially matching the real-world item.
Advantageously, providing a less attention-grabbing notification (through use of a diegetically relevant model and/or diegetically related model) does not interfere with the user currently operating the system. This is of particular importance when operating in safety-critical or medical situations. For example, a surgeon using augmented reality to assist them in an operation does not want a full-page notification blocking their vision - the technical features of this invention enable the user to continue their current task as guided by the augmented reality, without the augmented reality display getting in their way, thereby increasing the likelihood of a successful operation of the augmented reality system and of the task they were conducting using the augmented reality system.
Optionally, the location of the extended reality model corresponds to a last known location of the real-world item. Optionally, the location of the extended reality model corresponds to a last known location of the real-world item in 3D space. Alternatively, the location of the extended reality model corresponds to a last known location of the real-world item relative to the viewer.
Optionally, an orientation of the real-world item is tracked in addition to the location.
Preferably, the orientation of the extended reality model corresponds to a last known orientation of the real-world item. Optionally, the orientation of the extended reality model corresponds to a last known orientation of the real-world item in 3D space. Alternatively, the orientation of the extended reality model corresponds to a last known orientation of the real-world item relative to the viewer.
Advantageously, providing a notification that shows the real-world item's last location and/or orientation provides a suggestion to the user of a "known good" location and/or orientation to which they can return the item.
Optionally, upon a redetection of the real-world item and/or a resumption of tracking of the real-world item, the notification is removed and/or hidden. Optionally, upon a redetection of the real-world item and/or a resumption of tracking of the real-world item, a message is transmitted to the extended reality display device for the notification to be removed and/or hidden. Preferably, the redetection of the real-world item and/or the resumption of tracking of the real-world item is conducted once the user moves the real-world item into a location and/or an orientation substantially matching that of the notification.
Optionally, the method further comprises the step of re-locating the real-world item and resuming tracking of the real-world item. Optionally, upon re-locating of the real-world item, a message is transmitted to the extended reality display device for the notification to be removed and/or hidden. Optionally, re-location of the real-world item is conducted when the real-world item is within a threshold distance of the notification.
Advantageously, removing the notification again provides an indication to the user of the internal state of the tracking in an unobtrusive manner, thereby providing similar advantages as set out above.
Optionally, the notification has an associated introduction animation that animates when the real-world item is detected and, upon the redetection of the real-world item and/or the resumption of tracking of the real-world item and/or the real-world item is relocated, the introduction animation does not animate and/or the introduction animation is suppressed.
Advantageously, preventing an introduction animation or other process from playing when the real-world item is introduced reduces a user's distraction.
Optionally, tracking of the real-world item is conducted using object recognition of the real-world item. Alternatively or additionally, tracking of the real-world item is conducted using a marker on the real-world item. Optionally, the real-world item is a video game controller.
Optionally, the notification and/or data indicative of the notification comprises a model associated with the notification and the location. Optionally, the notification and/or data indicative of the notification additionally comprises the position of the notification. Here, location and position refer to the location and position at which the model associated with the notification is to be displayed to the user in the extended reality environment.
Also according to a first aspect, the present disclosure proposes a computing device comprising one or more processors that are associated with a memory, the one or more processors configured with executable instructions which, when executed, cause the computing device to carry out the computer-implemented method of the first aspect.
Also according to a first aspect, the present disclosure proposes a system comprising: a computing device according to the first aspect as described above, a real-world item to be tracked by the computing device, and an extended reality display configurable to be worn by a user.
Also according to a first aspect, the present disclosure proposes a machine-readable storage medium having stored thereon a computer program comprising a set of instructions for causing a machine to perform the method of the first aspect.
The present disclosure also proposes a computer implemented method comprising the steps of: tracking a location of a real-world item; detecting that tracking of the real-world item has been lost; and, based on the detection that the tracking of the real-world item has been lost, providing a notification to a user that the tracking of the real-world item has been lost, wherein the notification is provided to the user via an extended reality display.
Some specific components and embodiments of the disclosed method are now described by way of illustration with reference to the accompanying drawings, in which like reference numerals refer to like features.
BRIEF DESCRIPTION OF THE FIGURES
A more complete understanding of the subject matter may be derived by referring to the detailed description and claims when considered in conjunction with the following figures, wherein like reference numbers refer to similar elements throughout the figures.
Figures 1A and 1B depict example systems that may be used in accordance with embodiments described herein.
Figures 2A and 2B depict example methods according to embodiments described herein.
Figures 3A to 3D depict exemplary views from the point of view of a user using the system in accordance with embodiments described herein.
Figures 4A to 4D depict exemplary views from the point of view of a user using the system in accordance with embodiments described herein.
Figure 5 illustrates a block diagram of one example implementation of a computing device that can be used for implementing the steps indicated in Figure 2A and/or 2B and explained throughout the detailed description.
DETAILED DESCRIPTION
The following detailed description is merely illustrative in nature and is not intended to limit the embodiments of the subject matter or the application and uses of such embodiments. As used herein, the words "exemplary" and "example" mean "serving as an example, instance, or illustration." Any implementation described herein as exemplary or an example is not necessarily to be construed as preferred or advantageous over other implementations.
Furthermore, there is no intention to be bound by any expressed or implied theory presented in the preceding technical field, background, or the following detailed description.
A skilled person will appreciate that the example embodiments described herein may be used with virtual reality, mixed reality, augmented reality, or any other reality systems that exist on the reality-virtuality continuum. The term "extended reality" as used herein refers to any of the above-mentioned realities that exist on the reality-virtuality continuum. A preferable application of the example embodiments is an augmented reality experience as is set out in Figures 3A to 4D.
Starting with Figure 1A, an exemplary system 100 is shown comprising a user 102, a computing device 104, an extended reality visual device 106, a tool 108, and at least one object 110 being worked on. The extended reality visual device is preferably operatively coupled 112 (and more preferably wirelessly) to the computing device. The user wears the extended reality visual device such that their vision is modified by the extended reality visual device, thereby providing an extended reality environment (also described as an extended reality experience herein) to the user. Preferably, the location of the tool is being tracked. More preferably, the location and orientation of the tool are being tracked. Preferably, the computing device is a device capable of providing said extended reality experience to the user.
Preferably, the extended reality visual device 106 comprises a visual input component such as a camera. The visual input component is configured such that it can capture a video (or other visual) stream of where the user is looking. The extended reality visual device is configured to transmit said video stream to the computing device 104 such that the computing device can process the video. As will be described in examples below, this video stream is used for tracking of real-world items such as the tool 108 or object 110.
Preferably, the extended reality visual device 106 is configured to provide visual aspects of the extended reality experience to the user via a display. Optionally, the extended reality visual device is a virtual reality headset (also known as a "VR headset"). For example, the extended reality visual device is any one or more of an Oculus (TM) Rift, Sony (TM) PlayStation (TM) VR headset, or HTC (TM) Vive. Alternatively, the extended reality visual device is an augmented (or mixed) reality headset or glasses. For example, the extended reality visual device is any one or more of Sony (TM) SmartEyeglass, Microsoft (TM) HoloLens 2, Magic Leap (TM) One, or Google (TM) Glass. The extended reality visual device is preferably configured to receive notifications, models and/or references thereof to display on its display such that the wearer of the extended reality visual device can view the notification and/or model. The notifications, models and/or references thereof are received from the computing device 104. If a reference is received, then the extended reality visual device obtains the model from its memory or another location.
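By way of illustration only, the following sketch shows one way the model-by-reference behaviour described above might be realised on the display device side; the class and field names (ModelCache, model_ref, model_data) are hypothetical and are not taken from the patent.

```python
# Minimal sketch, assuming the display device keeps a local store of extended
# reality models keyed by reference. All names here are illustrative assumptions.
class ModelCache:
    """Local store of extended reality models held on the display device."""

    def __init__(self):
        self._models: dict[str, bytes] = {}

    def put(self, ref: str, model_data: bytes) -> None:
        self._models[ref] = model_data

    def resolve(self, notification: dict) -> bytes:
        """Return the model to display for a received notification.

        If the computing device sent the model inline, use it directly;
        if it sent only a reference, obtain the model from local memory.
        """
        if notification.get("model_data") is not None:
            return notification["model_data"]
        return self._models[notification["model_ref"]]
```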
In an example usage of the present system 100, the user 102 is working on the object 110 using the tool 108. The tool is a welding torch, and the object is an item of machinery being worked on. The computing device 104 is providing an extended reality experience via the extended reality visual device 106 to the user. The extended reality experience provided is one to assist the user in manufacturing of the object. The extended reality visual device provides visual cues based on the location of the tool and/or the object to help the user weld (or otherwise work on) the object. The visual cues can include arrows, curves, or other indicators shown about the object and tool to show how and where the welding torch should be used. The welding torch itself may also have related cues to assist the user in how to orient or operate the device.
In a further example usage of the above discussed system 100, the user 102 is a doctor wearing an extended reality visual device 106, operating on a patient (the patient being the object 110) using a scalpel as a tool 108. This example usage is called augmented reality-assisted surgery (ARAS). The computing device 104 is providing, via the extended reality visual device, an augmented reality experience to the doctor. Looking through the extended reality visual device, an extended reality experience is displayed to the doctor, guiding and assisting them in their operation. The extended reality experience can suggest current and next steps (including the location and other sundry information relating to said steps) in the process the doctor is dealing with at the present time.
Turning to Figure 1B, a further exemplary system 120 (similar to the system 100 as described with reference to Figure 1A) is shown comprising a user 102, a computing device 104, an extended reality visual device 106, and at least one controller 122A, 122B. This further exemplary system is configured to provide a gaming-oriented extended reality experience to the user. The visual device and controller(s) are operatively connected 124 to the computing device 104, preferably via a short-range wireless connection such as Bluetooth (TM). Preferably, the computing device is a gaming device.
In both of the example systems 100, 120 of Figures 1A and 1B, a real-world item (or items) is being tracked. As set out below, tracking preferably involves the use of computer vision and/or machine learning for object identification and subsequent tracking. In the first example, the tool 108 and/or the object 110 are real-world items being tracked. In the second example, the controllers 122A, 122B are real-world items being tracked.
Optionally, the computing device 104 is constructed as part of the same unit as the extended reality visual device 106 such that the user 102 is wearing both.
Turning to Figure 2A, an example method 200 of providing tracking, and notification of the internal state of the tracking, to a user in an unobtrusive way is shown. By way of illustration, the example method is described in places as being used with the system 100 described with reference to Figure 1A, to assist the reader in understanding how it can be implemented.
In this present example, the user 102 is wearing an extended reality visual device 106 and is using a surgical tool such as a scalpel 108 to operate on an animal 110. The extended reality visual device is capturing what the user is seeing through a video input and providing the video input to the computing device 104 that is preferably conducting this method 200. The computing device is, in coordination with the extended reality visual device, providing an extended reality experience to the user. The extended reality experience provides cues to the user about how and where the surgical tool should be used during an operation on the animal. The surgical tool is being tracked by the computing device so that it can provide useful extended reality visuals to, for example, show where and at what angle the user should be using their surgical tool.
As used in relation to this method, the terms "locating", "tracking", "identifying" and similar terms are used in relation to the device conducting this method (as opposed to the user or other members of the system). For example, "locating" the real-world item preferably relates to the process of the computing device 104 applying computer vision techniques using the video input from the extended reality visual device 106 to calculate where the real-world item is in 3D space.
As a first step, the method 200 begins with at least one real-world item being tracked 202. The real-world item being tracked depends on the user's use case. As mentioned, the real-world item in this example is a surgical tool such as a scalpel. Optionally, a step of "locating" the real-world item may have been conducted first, prior to tracking it.
Optionally, tracking of an object is conducted using object recognition and/or object detection based on a pre-determined model of the real-world item being tracked. The object recognition preferably uses computer vision and/or machine learning to identify and then subsequently track the item. The pre-determined model can be pre-programmed into the computing device 104 conducting the object tracking, or the pre-determined model is scanned in an initialisation step. Alternatively or additionally, the real-world item comprises extended reality markers (sometimes called "tags" or "augmented reality tags" or "augmented reality markers") such that the real-world item can be identified and tracked. The extended reality markers are the same as or similar to QR codes in that they are 2D and comprise shape(s). Having a 2D shape assists in location and orientation determination. The measured size (in pixels, but then converted to distance) of said 2D shape(s) can be used to determine how far away a marker (and therefore object) is. The measured skew of said 2D shape(s) can be used to determine the orientation of the marker (and therefore object). An example extended reality marker 320 can be seen in Figure 3C affixed to a controller 302.
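To make the size-to-distance and skew-to-orientation relationships above concrete, the following is a minimal sketch using a simple pinhole-camera approximation. The marker size and focal length are assumed values for illustration only; a real system would calibrate these and typically run a full pose-estimation routine rather than this simplified geometry.

```python
import math

def marker_distance_m(apparent_width_px: float,
                      real_width_m: float = 0.05,      # assumed 5 cm marker
                      focal_length_px: float = 800.0) -> float:
    """Estimate distance to a square marker from its apparent width in pixels.

    Pinhole model: apparent_width_px = focal_length_px * real_width_m / distance.
    """
    return focal_length_px * real_width_m / apparent_width_px

def marker_yaw_rad(apparent_width_px: float, apparent_height_px: float) -> float:
    """Very rough yaw estimate from the skew (aspect ratio) of a square marker.

    A square marker viewed head-on projects with equal width and height; as it
    rotates about its vertical axis, the projected width shrinks by cos(yaw).
    """
    ratio = min(apparent_width_px / apparent_height_px, 1.0)
    return math.acos(ratio)
```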
A skilled person will appreciate that these methods of tracking an object in 3D space are exemplary and that other techniques may be used instead or in addition.
Preferably, the tracking is conducted as a part of, or in combination with a "Simultaneous Localization And Mapping" (SLAM) system. SLAM is the computational problem of constructing or updating a map of an unknown environment while simultaneously keeping track of a user's location within the environment. There are several algorithms known for solving it, at least approximately, in tractable time for certain environments. Approximate solution methods include the particle filter, extended Kalman filter, covariance intersection, and GraphSLAM. SLAM algorithms are based on concepts in computational geometry and computer vision, and can be used for virtual reality or augmented reality.
SLAM can be used in some examples to detect objects in the real-world environment local to a user. As the environment is scanned, the system can optionally label the different objects as they are detected. In some examples, locations of objects (real or virtual) or planes in the environment may be stored in a memory of the system for future reference.
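A minimal sketch, under the assumption that labelled object or plane locations are kept in a simple keyed store as described above; the class and method names are illustrative only and not part of the patent.

```python
from typing import Dict, Optional, Tuple

Vec3 = Tuple[float, float, float]

class EnvironmentMap:
    """Stores locations of labelled objects or planes detected while scanning."""

    def __init__(self):
        self._anchors: Dict[str, Vec3] = {}

    def label(self, name: str, position: Vec3) -> None:
        # Record where a detected object or plane (e.g. "table_top") was seen.
        self._anchors[name] = position

    def lookup(self, name: str) -> Optional[Vec3]:
        # Retrieve a stored location for future reference, or None if unknown.
        return self._anchors.get(name)
```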
Preferably, the tracking of the real-world item is conducted by the computing device 104 based on a received video input from the extended reality visual device 106. Alternatively, or additionally, the tracking of the real-world item is based on a further video feed from a further video device that is coupled to the computing device.
Secondly, the tracking of the real-world item is lost 204. As noted, the tracking is preferably conducted through the video input from the extended reality visual device 106. The tracking may be lost for any number of reasons including but not limited to:
* Occlusion of the real-world item from the video input,
* Occlusion of the marker on the real-world item from the video input,
* The real-world item is too far away for it to be identified and/or tracked,
* The user has moved the real-world item out of view of the video input, and/or
* The user has moved the video input (by, for example, turning their head away) such that the real-world item is not in view of the video input.
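The patent does not prescribe how the transition to a "lost" state is decided frame-to-frame. One plausible approach, sketched below purely as an assumption, is to treat tracking as lost only after several consecutive frames without a detection, so that a single dropped frame does not trigger a notification.

```python
class TrackingMonitor:
    """Declares tracking 'lost' only after several consecutive frames without a
    detection (a debouncing assumption, not a requirement of the patent)."""

    def __init__(self, max_missed_frames: int = 15):
        self.max_missed_frames = max_missed_frames
        self.missed = 0
        self.lost = False

    def update(self, detected: bool) -> bool:
        """Feed one frame's detection result; returns True while tracking is lost."""
        if detected:
            self.missed = 0
            self.lost = False
        else:
            self.missed += 1
            if self.missed >= self.max_missed_frames:
                self.lost = True
        return self.lost
```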
With the tracking lost, a notification is then provided 206 to the user. Preferably, the notification or data indicative of the notification is transmitted to the extended reality visual device 106 for display to the user. Thus, it can be seen that this notification is configured to display an internal technical state of the extended reality system and/or computing device 104 providing the extended reality experience. The notification further enables operation of the extended reality experience. Preferably, the notification is provided within the context (also described as diegetically) of the extended reality experience. This way, immersion of the user will not be hindered or lost as much as compared with a non-diegetic notification. A non-diegetic notification could be a pop-up window or similar.
More preferably, the notification provided is based on the real-world item of which the tracking has been lost. Even more preferably, the notification is a model of the real-world item. In this way, the notification is a diegetically related model.
Alternatively or additionally, the notification is a diegetic object in the context of the extended reality experience being currently presented to the user. Preferably, the notification is the extended reality model that was associated with the real-world item. For example, if the real-world item was a controller and the controller was representing an axe within a gaming extended reality experience, then the notification is said axe model. Alternatively, an in-game diegetic notification can be used that is not directly related to the real-world item. For example, a context relevant character or item may be displayed. Thus, the use of a diegetic object, optionally based on the real-world item, will improve a user's immersion and not distract them from the task currently at hand, while still enabling the display of features of the underlying technical system currently implementing the extended reality experience.
Optionally, the notification also comprises a sound. Preferably, the sound is also diegetic within the context of the current extended reality experience.
Preferably, the notification is transparent or semi-transparent such that a user can see both the notification and any object behind it in the extended reality experience. Example transparency can be seen in Figures 3A to 4D. This transparency can also be described as "ghosting", as the notification appears like a ghost in the extended reality experience.
Preferably, the location of the notification is also based on the location of the real-world object. More preferably, the location of the notification is the last known location of the real-world item in 3D space and/or in an absolute sense. Preferably, the location is represented using the Cartesian rectilinear system, the spherical polar system, or the cylindrical system relative to a fixed point or reference. For example, were the last known location of the real-world item to be in the centre of the room the user is currently in, the notification's location will be based on this, and more preferably the notification's location is the same as the last known location of the real-world object such that the notification is also in the centre of the room.
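As a small worked example of the fixed-reference representations mentioned above, the following converts a Cartesian position (relative to some fixed reference point) into spherical polar coordinates; this is standard geometry rather than anything specific to the patent.

```python
import math

def cartesian_to_spherical(x: float, y: float, z: float) -> tuple[float, float, float]:
    """Convert a Cartesian position into spherical polar coordinates (r, theta, phi)."""
    r = math.sqrt(x * x + y * y + z * z)
    theta = math.acos(z / r) if r > 0 else 0.0  # polar angle measured from the +z axis
    phi = math.atan2(y, x)                      # azimuth measured in the x-y plane
    return r, theta, phi
```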
Alternatively, the location of the notification is the last known location of the real-world item relative to the user and/or the user's vision. For example, were the last known location of the real-world item to be on the left of the user's view, then the notification would be provided on the left of the user's view.
Preferably, the orientation of the real-world item is also being tracked in the first step 202.
Preferably, the orientation relates to the pitch, yaw, and roll of the real-world object and/or notification. Thus, when the tracking of the object is lost, the presented notification is based on both the location and the orientation of the lost real-world item.
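A minimal sketch of a pose record combining location with pitch, yaw and roll, as described above; the field names and example values are illustrative assumptions, not terminology from the patent.

```python
from dataclasses import dataclass

@dataclass
class Pose:
    """Location and orientation of the tracked real-world item."""
    x: float
    y: float
    z: float
    pitch: float  # rotation about the lateral axis, in radians
    yaw: float    # rotation about the vertical axis, in radians
    roll: float   # rotation about the longitudinal axis, in radians

# When tracking is lost, the notification can simply be placed at the most
# recently recorded Pose, so that it matches the item's last known location
# and orientation.
last_known_pose = Pose(x=0.0, y=1.2, z=-0.5, pitch=0.0, yaw=0.3, roll=0.0)
```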
Referring to Figure 2B, a further method 220 is shown. The further method comprises a number of the same steps as the method 200 described with reference to Figure 2A and thus the same reference numerals have been used.
In the further method 220, the location of the real-world item is found again 228. The real-world item is found using object detection/recognition algorithms as discussed above, optionally comprising or relating to a SLAM system. This can be considered a part of, or associated with, a redetection of the real-world item, as redetection of the real-world item will result in the location of the real-world item being determined. The tracking of the real-world item will then resume 202.
At the same time, or immediately prior to the tracking 202 of the real-world item resuming, the notification is hidden 230 (or otherwise removed) from the extended reality experience and the system can wait for loss of tracking to occur again 204 thus closing the processing loop.
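The loop below is an illustrative outline of how the steps of Figures 2A and 2B might fit together; detect_item, show_notification and hide_notification are hypothetical callables standing in for the tracking and display operations described in the text, not functions defined by the patent.

```python
def run_tracking_loop(frames, detect_item, show_notification, hide_notification):
    """Track an item across video frames, ghosting it while tracking is lost."""
    last_pose = None
    notification_visible = False
    for frame in frames:
        pose = detect_item(frame)            # returns a pose, or None if not detected
        if pose is not None:
            if notification_visible:
                hide_notification()          # tracking resumed: remove/hide the ghost
                notification_visible = False
            last_pose = pose                 # remember the last known location/orientation
        elif last_pose is not None and not notification_visible:
            show_notification(last_pose)     # tracking lost: show the ghost at the last pose
            notification_visible = True
```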
Thus it can be seen that the method is providing a way to record a "lost" (or conversely, a "found") state associated with an extended reality system and present this internal state to the user in a pleasing, unobtrusive manner.
Optionally, through use of the "lost" or "found" state of the real-world item, other activities that occur when, for example, a real-world item is introduced or identified can be undertaken or not, depending on the context of the activity. For example, the real-world item may have an associated introduction animation that is played whenever the real-world item is recognised.
In an example embodiment, the real-world item is re-detected as part of the above discussed method 220 and the introduction animation is not played, as such an activity would not make sense to a user who has already seen the introduction animation.
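A small sketch of suppressing the introduction animation on re-detection, as described above, assuming the display side exposes some play_intro_animation call; both names are hypothetical.

```python
class ItemPresenter:
    """Plays the introduction animation only on the item's first detection."""

    def __init__(self):
        self._seen_before = False

    def on_item_detected(self, play_intro_animation) -> None:
        if not self._seen_before:
            play_intro_animation()  # first detection: introduce the item to the user
        # On re-detection after lost tracking the animation is suppressed, since
        # the user has already seen the introduction.
        self._seen_before = True
```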
Optionally, by providing the notification in the last known location (and preferably orientation too) of the real-world item, a requirement may be imposed that the real-world object be returned to substantially the same location as it was last known. With this requirement, object detection can be optimized, as the object detection algorithm(s) used need only be activated when the last known location (the location of the notification) is in view of the video input, and/or the object detection algorithm need not operate on video input that is not captured near the notification location. These features result in a lesser amount of video processing being required, which reduces CPU cycles and improves the power efficiency and performance of the computing device 104. How close the real-world object needs to be to the notification is configurable. Optionally, there is a threshold distance between the real-world item location and the notification location within which the real-world object will be recognised. The threshold closeness of the real-world object and the notification can be up to 50 cm, up to 20 cm, up to 10 cm, up to 5 cm, or up to 1 cm. Further, the use of a threshold closeness of the real-world object to the notification can also provide a less obtrusive and/or interrupted extended reality experience for the user, as returning the object to the same last known location may make more sense in the context of the extended reality experience. For example, if the extended reality experience is a sporting game, then the return of a real-world ball associated with the extended reality sports game to the last known position may be a requirement such that the sports game can continue from a last known correct state.
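The gating described above might look roughly like the following; the 2 cm threshold matches the Figure 4 example, while the region-of-interest size and the helper names are assumptions made purely for illustration.

```python
import numpy as np

THRESHOLD_M = 0.02  # e.g. the 2 cm threshold used in the Figure 4 example

def within_threshold(item_pos, notification_pos, threshold_m: float = THRESHOLD_M) -> bool:
    """True if a candidate item location is close enough to the notification location."""
    delta = np.asarray(item_pos, dtype=float) - np.asarray(notification_pos, dtype=float)
    return float(np.linalg.norm(delta)) <= threshold_m

def roi_around(frame: np.ndarray, centre_px, half_size_px: int = 100) -> np.ndarray:
    """Crop the video frame to a square region around the notification's on-screen
    position, so object recognition only runs on pixels near the last known location."""
    h, w = frame.shape[:2]
    cx, cy = int(centre_px[0]), int(centre_px[1])
    x0, x1 = max(cx - half_size_px, 0), min(cx + half_size_px, w)
    y0, y1 = max(cy - half_size_px, 0), min(cy + half_size_px, h)
    return frame[y0:y1, x0:x1]
```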
The extended reality environment shown in Figures 4A-4D shows a system comprising a threshold closeness of approximately 2 cm. The extended reality environment of Figures 3A-3D does not comprise the threshold closeness feature, and the object detection algorithm operating to relocate and/or resume tracking of the real-world item operates continuously across all of the video input received from the extended reality visual device.
Referring to Figures 3A to 3D, images from the point of view of a user wearing an extended reality visual device are shown. There is a computing device coupled (not pictured) to the extended reality visual device such that the computing device can receive video captured by the extended reality visual device and provide extended reality models (or data indicative thereof) for the extended reality visual device to display.
In Figure 3A, the controller 302 is shown in the hands of the user 304. The controller is used here in coordination with the further example system 120 of Figure 1B. A skilled person will appreciate that the tool 108 of the first example Figure 1A could be used instead. The controller comprises an extended reality marker for tracking which cannot be seen as it is occluded by the extended reality experience being displayed over it.
The extended reality experience comprises a number of features that enhance the user's use of the controller 302 by providing hints as to what each button does. For example, the extended reality "ICE" label 306 would trigger an ice or frozen related scene to be displayed. Alternatively, the button labelled "ICE" could instruct a user-controlled character of a game currently being played to perform an ice attack.
In Figure 3B, the user has moved the controller out of their vision and/or the vision of the video input of the extended reality visual device. Neither the user nor the video input device of the extended reality visual device is able to see or track the controller 302. Thus the tracking of the real-world item is lost. A notification 310 that the tracking of the real-world item has been lost is displayed. The notification is in the form of a semi-transparent and/or ghosted 3D model based on the controller. Preferably, the 3D model is substantially the same shape and configuration as the real-world item. The notification is in the same last known location of the real-world item. Preferably, the notification is also in the same last known orientation of the real-world item.
In Figure 3C, the user has moved the controller 302 back into view of their own vision and that of the video input of the extended reality visual device. The image in Figure 3C is taken immediately before the controller has been recognised and, as such, the extended reality experience has not yet been overlaid on the controller and the extended reality marker 320 affixed to the controller can be seen.
In Figure 3D, tracking of the controller 302 has resumed and the extended reality experience is again displayed. The extended reality experience comprises the "ICE" 306, "LIGHTING", "HIT", etc. button enhancements. As can be seen from Figure 3D, the tracking of the controller and the display of the extended reality experience resume when the controller comes within view of the video input of the extended reality visual device. Here, the location of the controller and the location of the notification need not be aligned with each other for detection to reoccur. The notification has also been hidden.
Referring to Figures 4A-4D, a series of images are shown according to the systems and methods described herein, similar to those shown in Figures 3A-3D. As mentioned above, the present example system comprises the optional feature described above relating to requiring the real-world object to be returned to within a threshold distance of the notification for re-detection of the controller to occur. In the present example, the threshold distance is 2 cm - that is, the computing device is configured to only detect objects that are within 2 cm of the notification location, thereby saving processing time and power.
Starting with Figure 4A, the user 304 is holding the controller 302 and an example extended reality experience is being displayed. The extended reality experience comprises a number of descriptors relating to how the controller is used, including the "ICE" tag 306 which may be used to trigger an ice-based attack in a game-related context.
Turning to Figure 4B, the user moves the controller 302 in a way such that the controller can no longer be tracked, and a notification 310 is displayed in the last known location of the controller.
Turning to Figure 4C, the user moves the controller 302 close to the notification 310, but not within the threshold distance of 2 cm. The controller 302 is not detected, as the computing device is saving processing power by not conducting any object recognition for the controller except for objects within the threshold distance.
In Figure 4D, the user has now moved the controller 302 to within the 2 cm threshold distance; the extended reality experience has restarted and the notification has been hidden.
Figure 5 illustrates a block diagram of one example implementation of a computing device 104 that can be used for implementing the steps indicated in Figures 2A and/or 2B. The computing device is associated with executable instructions for causing the computing device to perform any one or more of the methodologies discussed herein. In alternative implementations, the computing device may be connected (e.g., networked) to other machines in a Local Area Network (LAN), an intranet, an extranet, a local wireless network, the Internet, or other appropriate network. The computing device may operate in the capacity of a server or a client machine (or both) in a client-server network environment, or as a peer machine in a peer-to-peer (or distributed) network environment. The computing device may be a personal computer (PC), a tablet computer, a set-top box (STB), a Personal Digital Assistant (PDA), a cellular telephone, a web appliance, a gaming device, a desktop computer, a laptop, a server, or any machine capable of executing a set of instructions (sequential or otherwise) that specify actions to be taken by that machine. Further, while only a single computing device is illustrated, the term "computing device" shall also be taken to include any collection of machines (e.g., computers) that individually or jointly execute a set (or multiple sets) of instructions to perform any one or more of the methodologies discussed herein.
The example computing device 104 includes a processing device 702, a main memory 704 (e.g., read-only memory (ROM), flash memory, dynamic random-access memory (DRAM) such as synchronous DRAM (SDRAM) or Rambus DRAM (RDRAM), etc.), a static memory 706 (e.g., flash memory, static random-access memory (SRAM), etc.), and a secondary memory (e.g., a data storage device 718), which communicate with each other via a bus 730.
Processing device 702 represents one or more general-purpose processors such as a microprocessor, central processing unit, or the like. More particularly, the processing device 702 may be a complex instruction set computing (CISC) microprocessor, reduced instruction set computing (RISC) microprocessor, very long instruction word (VLIW) microprocessor, processor implementing other instruction sets, or processors implementing a combination of instruction sets. Processing device 702 may also be one or more special-purpose processing devices such as an application specific integrated circuit (ASIC), a field programmable gate array (FPGA), a digital signal processor (DSP), network processor, or the like. The processing device 702 is configured to execute the processing logic (instructions 722) for performing the operations and steps discussed herein.
The computing device 104 may further include a network interface device 708. The computing device also may include a video display unit 710 (e.g., a liquid crystal display (LCD) or a cathode ray tube (CRT)), an alphanumeric input device 712 (e.g., a keyboard, touchscreen, or a game controller 122A, 122B), a cursor control device 714 (e.g., a mouse, touchscreen, or a game controller 122A, 122B), and an audio device 716 (e.g., a speaker).
Preferably, the computing device 104 comprises a further interface configured to communicate with other devices, such as the extended reality display device 106, in the systems 100, 120 described herein. The further interface may be the network interface 708 as described above, or a different interface depending on the device being connected to. Preferably, the interface to the extended reality display device is configured such that a video stream from the extended reality display device can be provided to the computing device, and data can be provided from the computing device to the extended reality display device.
As mentioned above, optionally, the computing device 104 is comprised in the same unit as the extended reality display device 106.
The data storage device 718 may include one or more machine-readable storage media (or more specifically one or more non-transitory computer-readable storage media) 728 on which is stored one or more sets of instructions 722 embodying any one or more of the methodologies or functions described herein. The instructions 722 may also reside, completely or at least partially, within the main memory 704 and/or within the processing device 702 during execution thereof by the computer system 104, the main memory 704 and the processing device 702 also constituting computer-readable storage media.
The various methods described above may be implemented by a computer program. The computer program may include computer code arranged to instruct a computer to perform the functions of one or more of the various methods described above. The computer program and/or the code for performing such methods may be provided to an apparatus, such as a computer, on one or more computer readable media or, more generally, a computer program product. The computer readable media may be transitory or non-transitory. The one or more computer readable media could be, for example, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, or a propagation medium for data transmission, for example for downloading the code over the Internet. Alternatively, the one or more computer readable media could take the form of one or more physical computer readable media such as semiconductor or solid-state memory, magnetic tape, a removable computer diskette, a random-access memory (RAM), a read-only memory (ROM), a rigid magnetic disc, and an optical disk, such as a CD-ROM, CD-R/W or DVD.
In an implementation, the modules, components and other features described herein can be implemented as discrete components or integrated in the functionality of hardware components such as ASICS, FPGAs, DSPs or similar devices.
A "hardware component" or "hardware module" is a tangible (e.g., non-transitory) physical component (e.g., a set of one or more processors) capable of performing certain operations and may be configured or arranged in a certain physical manner. A hardware component may include dedicated circuitry or logic that is permanently configured to perform certain operations. A hardware component may be or include a special-purpose processor, such as a field programmable gate array (FPGA) or an ASIC. A hardware component may also include programmable logic or circuitry that is temporarily configured by software to perform certain operations.
Accordingly, the phrase "hardware component" or "hardware module" should be understood to encompass a tangible entity that may be physically constructed, permanently configured (e.g., hardwired), or temporarily configured (e.g., programmed) to operate in a certain manner or to perform certain operations described herein.
In addition, the modules and components can be implemented as firmware or functional circuitry within hardware devices. Further, the modules and components can be implemented in any combination of hardware devices and software components, or only in software (e.g., code stored or otherwise embodied in a machine-readable medium or in a transmission medium).
Unless specifically stated otherwise, as apparent from the following discussion, it is appreciated that throughout the description, discussions utilizing terms such as "determining", "providing", "calculating", "computing," "identifying", "combining", "establishing" , "sending", "receiving", "storing", "estimating", "checking", "obtaining" or the like, refer to the actions and processes of a computer system, or similar electronic computing device, that manipulates and transforms data represented as physical (electronic) quantities within the computer system's registers and memories into other data similarly represented as physical quantities within the computer system memories or registers or other such information storage, transmission or display devices.
The term "comprising" as used in this specification and claims means "consisting at least in part of". \Mien interpreting each statement in this specification and claims that includes the term "comprising", features other than that or those prefaced by the term may also be present. Related terms such as "comprise" and "comprises" are to be interpreted in the same manner.
It is intended that reference to a range of numbers disclosed herein (for example, 1 to 10) also incorporates reference to all rational numbers within that range (for example, 1, 1.1, 2, 3, 3.9, 4, 5, 6, 6.5, 7, 8, 9 and 10) and also any range of rational numbers within that range (for example, 2 to 8, 1.5 to 5.5 and 3.1 to 4.7) and, therefore, all sub-ranges of all ranges expressly disclosed herein are hereby expressly disclosed. These are only examples of what is specifically intended and all possible combinations of numerical values between the lowest value and the highest value enumerated are to be considered to be expressly stated in this application in a similar manner.
As used herein the term "and/or" means "and" or "or", or both.
As used herein "(s)" following a noun means the plural and/or singular forms of the noun.
The singular reference of an element does not exclude the plural reference of such elements and vice-versa.
It is to be understood that the above description is intended to be illustrative, and not restrictive.
Many other implementations will be apparent to those of skill in the art upon reading and understanding the above description. Although the disclosure has been described with reference to specific example implementations, it will be recognized that the disclosure is not limited to the implementations described but can be practiced with modification and alteration within the scope of the appended claims. Accordingly, the specification and drawings are to be regarded in an illustrative sense rather than a restrictive sense. The scope of the disclosure should, therefore, be determined with reference to the appended claims, along with the full scope of equivalents to which such claims are entitled.

Claims (22)

  1. A computer implemented method for representing internal state relating to tracking of a real-world item in an extended reality environment, the method comprising the steps: tracking a location of the real-world item; detecting that tracking of the real-world item has been lost; determining a notification based on the detection that the tracking of the real-world item has been lost, and transmitting the notification or data indicative of the notification to an extended reality display device for display to a user.
  2. A method according to claim 1, wherein the notification is based on the real-world item.
  3. A method according to claim 1 or claim 2, wherein the notification is an extended reality model.
  4. A method according to claim 3, wherein the notification is an extended reality model based on the real-world item.
  5. A method according to claim 4, wherein the notification is a semi-transparent extended reality model substantially matching the real-world item.
  6. A method according to any one or more of claims 3 to 5, wherein the location of the extended reality model corresponds to a last known location of the real-world item.
  7. A method according to claim 6, wherein the location of the extended reality model corresponds to a last known location of the real-world item in 3D space.
  8. A method according to claim 6, wherein the location of the extended reality model corresponds to a last known location of the real-world item relative to the viewer.
  9. A method according to any one or more of claims 5 to 8, wherein an orientation of the real-world item is tracked in addition to the location.
  10. A method according to claim 9, wherein the orientation of the extended reality model corresponds to a last known orientation of the real-world item.
  11. A method according to claim 10, wherein the orientation of the extended reality model corresponds to a last known orientation of the real-world item in 3D space.
  12. A method according to claim 10, wherein the orientation of the extended reality model corresponds to a last known orientation of the real-world item relative to the viewer.
  13. A method according to any one or more of the preceding claims, further comprising the step of: re-locating the real-world item and resuming tracking of the real-world item.
  14. A method according to claim 13, wherein, upon re-locating of the real-world item, a message is transmitted to the extended reality display device for the notification to be removed and/or hidden.
  15. A method according to claim 13 or claim 14, wherein re-location of the real-world item is conducted when the real-world item is within a threshold distance of the notification.
  16. A method according to any one or more of claims 13 to 15, wherein the notification has an associated introduction animation that animates when the real-world item is detected and, upon re-locating of the real-world item, the introduction animation is suppressed.
  17. A method according to any one or more of the preceding claims, wherein tracking of the real-world item is conducted using object recognition of the real-world item.
  18. A method according to any one or more of the preceding claims, wherein tracking of the real-world item is conducted using a marker on the real-world item.
  19. A method according to any one or more of the preceding claims, wherein the real-world item is a video game controller.
  20. A computing device comprising one or more processors that are associated with a memory, the one or more processors configured with executable instructions which, when executed, cause the computing device to carry out the computer-implemented method of any preceding claim.
  21. A system comprising: a computing device according to claim 20, a real-world item to be tracked by the computing device, and an extended reality display configurable to be worn by a user.
  22. A machine-readable storage medium having stored thereon a computer program comprising a set of instructions for causing a machine to perform the method of any one or more of claims 1 to 19.
GB2208501.3A 2022-06-10 2022-06-10 Computer implemented method and system Pending GB2619551A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
GB2208501.3A GB2619551A (en) 2022-06-10 2022-06-10 Computer implemented method and system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
GB2208501.3A GB2619551A (en) 2022-06-10 2022-06-10 Computer implemented method and system

Publications (2)

Publication Number Publication Date
GB202208501D0 GB202208501D0 (en) 2022-07-27
GB2619551A true GB2619551A (en) 2023-12-13

Family

ID=82496357

Family Applications (1)

Application Number Title Priority Date Filing Date
GB2208501.3A Pending GB2619551A (en) 2022-06-10 2022-06-10 Computer implemented method and system

Country Status (1)

Country Link
GB (1) GB2619551A (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20180097975A1 (en) * 2016-09-30 2018-04-05 Sony Interactive Entertainment Inc. Systems and methods for reducing an effect of occlusion of a tracker by people
KR20180113406A (en) * 2017-04-06 2018-10-16 엘지전자 주식회사 Mobile terminal and method for controlling the same
US20200026350A1 (en) * 2018-07-20 2020-01-23 Avegant Corp. Relative Position Based Eye-tracking System
US20220044449A1 (en) * 2020-08-07 2022-02-10 Micron Technology, Inc. Highlighting a tagged object with augmented reality
US20220071729A1 (en) * 2020-02-19 2022-03-10 Globus Medical, Inc. Displaying a virtual model of a planned instrument attachment to ensure correct selection of physical instrument attachment


Also Published As

Publication number Publication date
GB202208501D0 (en) 2022-07-27

Similar Documents

Publication Publication Date Title
US11157725B2 (en) Gesture-based casting and manipulation of virtual content in artificial-reality environments
JP6885935B2 (en) Eye pose identification using eye features
US10712901B2 (en) Gesture-based content sharing in artificial reality environments
JP6979475B2 (en) Head-mounted display tracking
US11003410B2 (en) Augmented reality display sharing
EP3437075B1 (en) Virtual object manipulation within physical environment
US9953214B2 (en) Real time eye tracking for human computer interaction
US20200380784A1 (en) Concealing loss of distributed simultaneous localization and mapping (slam) data in edge cloud architectures
CN114303120A (en) Virtual keyboard
US10783712B2 (en) Visual flairs for emphasizing gestures in artificial-reality environments
EP2371434B1 (en) Image generation system, image generation method, and information storage medium
US20130069931A1 (en) Correlating movement information received from different sources
US11047691B2 (en) Simultaneous localization and mapping (SLAM) compensation for gesture recognition in virtual, augmented, and mixed reality (xR) applications
WO2015026645A1 (en) Automatic calibration of scene camera for optical see-through head mounted display
WO2019118155A1 (en) Detecting the pose of an out-of-range controller
US20220301217A1 (en) Eye tracking latency enhancements
US10254831B2 (en) System and method for detecting a gaze of a viewer
US10896545B1 (en) Near eye display interface for artificial reality applications
US11159645B2 (en) Adaptive backchannel synchronization for virtual, augmented, or mixed reality (xR) applications in edge cloud architectures
US10816341B2 (en) Backchannel encoding for virtual, augmented, or mixed reality (xR) applications in connectivity-constrained environments
US11520409B2 (en) Head mounted display device and operating method thereof
US11043004B2 (en) Resolving region-of-interest (ROI) overlaps for distributed simultaneous localization and mapping (SLAM) in edge cloud architectures
WO2020054760A1 (en) Image display control device and program for controlling image display
Penza et al. Enhanced vision to improve safety in robotic surgery
GB2619551A (en) Computer implemented method and system