WO2023214905A1 - Methods and devices for selective sharing of information in extended reality - Google Patents


Info

Publication number
WO2023214905A1
Authority
WO
WIPO (PCT)
Prior art keywords
information
user
enabled device
extended reality
digital
Prior art date
Application number
PCT/SE2022/050428
Other languages
French (fr)
Inventor
Niklas LINDSKOG
Tommy Arngren
Daniel BERGSTRÖM
Peter ÖKVIST
Patrik Salmela
Original Assignee
Telefonaktiebolaget Lm Ericsson (Publ)
Priority date
Filing date
Publication date
Application filed by Telefonaktiebolaget Lm Ericsson (Publ) filed Critical Telefonaktiebolaget Lm Ericsson (Publ)
Priority to PCT/SE2022/050428 priority Critical patent/WO2023214905A1/en
Publication of WO2023214905A1 publication Critical patent/WO2023214905A1/en


Classifications

    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04L - TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L63/00 - Network architectures or network communication protocols for network security
    • H04L63/10 - Network architectures or network communication protocols for network security for controlling access to devices or network resources
    • H04L63/102 - Entity profiles
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00 - Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/60 - Protecting data
    • G06F21/62 - Protecting access to data via a platform, e.g. using keys or access control rules
    • G06F21/6218 - Protecting access to data via a platform, e.g. using keys or access control rules, to a system of files or objects, e.g. local or distributed file system or database
    • G06F21/6245 - Protecting personal data, e.g. for financial or medical purposes
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06N - COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N20/00 - Machine learning
    • G06N20/20 - Ensemble learning
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06N - COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N5/00 - Computing arrangements using knowledge-based models
    • G06N5/01 - Dynamic search techniques; Heuristics; Dynamic trees; Branch-and-bound
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04L - TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L63/00 - Network architectures or network communication protocols for network security
    • H04L63/10 - Network architectures or network communication protocols for network security for controlling access to devices or network resources
    • H04L63/105 - Multiple levels of security
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04L - TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L9/00 - Cryptographic mechanisms or cryptographic arrangements for secret or secure communications; Network security protocols
    • H04L9/06 - Cryptographic mechanisms or cryptographic arrangements for secret or secure communications; Network security protocols; the encryption apparatus using shift registers or memories for block-wise or stream coding, e.g. DES systems or RC4; Hash functions; Pseudorandom sequence generators
    • H04L9/0643 - Hash functions, e.g. MD5, SHA, HMAC or f9 MAC
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04L - TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L9/00 - Cryptographic mechanisms or cryptographic arrangements for secret or secure communications; Network security protocols
    • H04L9/32 - Cryptographic mechanisms or cryptographic arrangements including means for verifying the identity or authority of a user of the system or for message authentication, e.g. authorization, entity authentication, data integrity or data verification, non-repudiation, key authentication or verification of credentials
    • H04L9/3218 - using proof of knowledge, e.g. Fiat-Shamir, GQ, Schnorr, or non-interactive zero-knowledge proofs
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04L - TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L9/00 - Cryptographic mechanisms or cryptographic arrangements for secret or secure communications; Network security protocols
    • H04L9/32 - Cryptographic mechanisms or cryptographic arrangements including means for verifying the identity or authority of a user of the system or for message authentication, e.g. authorization, entity authentication, data integrity or data verification, non-repudiation, key authentication or verification of credentials
    • H04L9/3263 - involving certificates, e.g. public key certificate [PKC] or attribute certificate [AC]; Public key infrastructure [PKI] arrangements
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04W - WIRELESS COMMUNICATION NETWORKS
    • H04W12/00 - Security arrangements; Authentication; Protecting privacy or anonymity
    • H04W12/08 - Access security
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04W - WIRELESS COMMUNICATION NETWORKS
    • H04W12/00 - Security arrangements; Authentication; Protecting privacy or anonymity
    • H04W12/60 - Context-dependent security
    • H04W12/66 - Trust-dependent, e.g. using trust scores or trust relationships
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00 - Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/30 - Authentication, i.e. establishing the identity or authorisation of security principals
    • G06F21/44 - Program or device authentication
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04L - TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L67/00 - Network arrangements or protocols for supporting network services or applications
    • H04L67/01 - Protocols
    • H04L67/131 - Protocols for games, networked simulations or virtual reality

Definitions

  • the technology disclosed herein relates generally to the field of protecting information distribution in an Extended Reality, XR, environment, and in particular to means and methods for selective sharing of information.
  • Extended reality is a term referring to all real-and-virtual environments and to human-machine interactions generated by computer technology and wearables.
  • Virtual Reality (VR), Augmented Reality (AR) and Mixed Reality (MR) are examples of XR, where the “X” of XR is a variable indicating any current or future spatial computing technology.
  • An objective of the present disclosure is to address and improve various aspects for information sharing in an XR environment.
  • a particular objective is to ensure that an XR enabled device only shares information that has been allowed to be shared, as determined, for instance, by the user.
  • Another particular objective is to provide means and methods for an efficient information sharing, while also protecting information from unauthorized sharing.
  • Still another objective is to enable safe information sharing between different users who use separate information delivery systems without any implicit trust therebetween.
  • Yet another objective is to reduce unnecessary rendering processing and processing load in any conveying network, by allowing XR objects to render only content that will be allowed by a user.
  • a method in an Extended Reality, XR, enabled device for selectively sharing information with an XR object.
  • the method is performed in the XR enabled device and comprises performing a handshake procedure with the XR object; receiving, from the XR object, a request for information related to the user of the XR enabled device; determining, by a digital representation of the user, which requested information to send; and providing, to the XR object, information determined to be allowable for sharing with the XR object.
  • a computer program for an Extended Reality, XR, enabled device for selectively sharing information with an XR object.
  • the computer program comprises computer program code, which, when executed on at least one processor on the XR enabled device causes the XR enabled device to perform the method according to the first aspect.
  • a computer program product comprising a computer program according to the second aspect and a computer readable means on which the computer program is stored.
  • an XR enabled device for selectively sharing information with an XR object.
  • the XR enabled device is configured to perform a handshake procedure with the XR object; receive, from the XR object, a request for information related to the user of the XR enabled device; determine, by a digital representation of the user, which requested information to send; and provide, to the XR object, information determined to be allowable for sharing with the XR object.
  • these aspects enable an improved granularity of a user’s privacy in an XR-environment.
  • the digital representation of the user determines what type of information is relevant to provide in a certain scenario and adapts it according to various parameters, such as, for instance, personal preferences, type of information (e.g., permanence, first/second person damage if misused, etc.) and past behavior of external entities.
  • these aspects are suitable for situations wherein information presented in an XR device, e.g. smart glasses, is needed and there also is a risk of cognitive overload.
  • the user may be unable or unsuited to manually interact with the content in the XR environment, e.g., when driving a car or performing a task where both hands are occupied.
  • Figure 1 illustrates schematically a user device in which embodiments of the present disclosure may be implemented.
  • Figure 2 illustrates an exemplary implementation of a selective disclosure of information using VCs.
  • Figure 3 is a flow chart of embodiments for interaction between user device and a known digital object.
  • Figures 4 and 5 are flowcharts of various embodiments of a method.
  • Fig. 6 is a schematic diagram showing functional units of a device according to an embodiment.
  • Fig. 7 is a schematic diagram showing functional modules of a device according to an embodiment.
  • Fig. 8 shows one example of a computer program product comprising computer readable means according to an embodiment.
  • a digital twin is used as a representation of a user in terms of information subsets.
  • the user’s digital twin communicates with the digital objects to steer the content of the digital objects towards the user’s preferences.
  • the digital objects may request certain information subsets about the user.
  • Such information subsets may comprise details ranging from mood and current intent, to gender and age, to medical conditions, allergies, BMI, credit rating, and so on.
  • Fig. 1 illustrates schematically a user device 1 in which embodiments of the present disclosure may be implemented.
  • a user has a user device 1 for interacting with an XR environment; herein the user device 1 is interchangeably denoted “XR enabled device 1”.
  • the user is in an XR environment where different digital objects 51, 52,..., 5n (interchangeably denoted “XR objects 51, 52,..., 5n” herein) want to interact and/or present information to the user.
  • the user interacts with these digital objects using the XR-enabled device 1, which may, for instance, be a near-eye display (such as XR headset, smart glasses, smart lenses, etc.), user equipment (mobile phone, tablet, laptop, etc.), hologram projector or the like.
  • a digital twin (DT) 2 representation of a user is used.
  • DTs typically model entities in the physical environment, such as non-human physical objects (vehicles, buildings, cellular communication networks, logical or physical clusters of IoT sensors, and many more), as well as digital copies/instances representing, e.g., a combination of a physical person’s bio-data attributes, her associated devices and their respective digital states.
  • Consider a person carrying her smartphone, which holds data such as mail, contact list, calendar entries, purchase patterns, etc., and some installed applications; together these constitute at least a partial copy of the person’s behavior and relations, e.g., purchase behavior.
  • If the person also carries a smartwatch that provides sensor information on heart rate and blood pressure, and possibly also, e.g., smart glasses, these may further reveal where the user looks/gazes, perhaps in conjunction with iris detection.
  • a DT 2 of the physical person may be considered as all digital states associated with user’s biometry and her associated smart devices.
  • the DT 2 is, in the digital domain, a digital copy of a physical asset, system or device.
  • the DT 2 may also be considered as the above-mentioned set of digital states in combination with an ML model trained on user sensor states, etc., that may “respond to a question” in the same way as the physical user would respond (optimally; in practice at least sufficiently similarly), given the current set of sensor states.
  • the digital twin representation of the user may utilize information subsets 3a comprising digital representations of physical and mental traits of the user.
  • An information subset 3a may comprise one or more information elements.
  • the information subset(s) 3a may in turn be used as input to a decision engine 2a, comprising software or reconfigurable logic executing, for instance, a series of manually configured if-else-then statements. These statements may be stored as static rule set 2d or as a dynamic rule set 2c, which may be utilized by the decision engine 2a.
  • Static indicates that the rules are based on permanent traits of the user, while dynamic indicates that, for instance, the current context or cognitive load may alter the rule.
  • an incrementally configured machine learning (ML) model based on decision trees, such as, for instance, Random Forest, may be used as the decision engine 2a.
  • Other ML models may also be used, which are described later in relation to various embodiments; a minimal rule-based sketch of the decision engine 2a is given below.
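  • As a purely illustrative, non-authoritative aside, the rule-based decision engine described above may be sketched as follows in Python; all names (Request, STATIC_RULES, DYNAMIC_RULES) and thresholds are hypothetical assumptions, not taken from the disclosure:

```python
# Hypothetical sketch of a decision engine 2a combining a static and a
# dynamic rule set; all names and thresholds are illustrative assumptions.
from dataclasses import dataclass
from typing import Callable

@dataclass
class Request:
    subset_type: str       # e.g. "medical", "consumption"
    object_trust: float    # trust level of the requesting XR object, 0.0-1.0
    cognitive_load: float  # current user state, 0.0-1.0

# Static rules reflect permanent traits of the user: e.g. never share
# medical data with objects below a fixed trust level.
STATIC_RULES: list[Callable[[Request], bool]] = [
    lambda r: not (r.subset_type == "medical" and r.object_trust < 0.9),
]

# Dynamic rules reflect the current context: e.g. share nothing while
# the user's cognitive load is high.
DYNAMIC_RULES: list[Callable[[Request], bool]] = [
    lambda r: r.cognitive_load < 0.8,
]

def decision_engine(request: Request) -> bool:
    """Return True if the requested information subset may be shared."""
    return all(rule(request) for rule in STATIC_RULES + DYNAMIC_RULES)

print(decision_engine(Request("medical", 0.95, 0.2)))  # True
print(decision_engine(Request("medical", 0.50, 0.2)))  # False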
  • the DT 2 may further comprise pre-defined parts, containing permanent traits of the user, and it may receive input, e.g., regarding the current context and cognitive load of the user.
  • Information subsets may be assigned different attributes. Two attributes which may define these information subsets are permanence and type.
  • the information subsets 3a may be permanent, semi-permanent or temporary. Permanent information comprises personal identifiers, chronic medical conditions, etc.; semi-permanent information comprises slow-changing attributes such as consumption habits, age, weight and height; temporary information comprises current intent, hunger, susceptibility to new information, cognitive load, etc.
  • “Type”, on the other hand, is a classification of what the information subset contains information about, e.g., medical-related, work- related, consumption-related, travel-related.
  • Further categories may be “Sensitivity”, which may range from information sharable with anyone, e.g., “user likes carrots”, to highly sensitive personal information, e.g., “user has chronic disease X”.
  • Another category may be “Exposure cost”, which may comprise a “cost” (e.g., an estimated amount of money, time, etc.) for rectifying unwanted exposure of information; a sketch of these attributes as a data structure follows below.
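  • As a hypothetical illustration only, the attributes just described (permanence, type, sensitivity, exposure cost) could be modelled as the following data structure; the field names are assumptions for the sketch, not part of the disclosure:

```python
# Hypothetical data-structure sketch of an information subset 3a and its
# attributes; field names are illustrative assumptions.
from dataclasses import dataclass
from enum import Enum

class Permanence(Enum):
    PERMANENT = "permanent"            # e.g. personal identifiers
    SEMI_PERMANENT = "semi-permanent"  # e.g. consumption habits, age
    TEMPORARY = "temporary"            # e.g. current intent, cognitive load

@dataclass
class InformationSubset:
    elements: dict[str, str]  # one or more information elements
    permanence: Permanence
    info_type: str            # e.g. "medical", "work", "consumption"
    sensitivity: float        # 0.0 (sharable with anyone) .. 1.0 (highly sensitive)
    exposure_cost: float      # estimated cost of rectifying unwanted exposure

shoe_size = InformationSubset(
    elements={"shoe_size": "42"},
    permanence=Permanence.SEMI_PERMANENT,
    info_type="consumption",
    sensitivity=0.1,
    exposure_cost=0.0,
)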
  • the user’s DT 2 may be requested to provide one or more of the information subsets 3a (each information subset comprising a single information element or several information elements) relating to the user and may further store trust levels and authorization types associated with certain information subsets.
  • the user device 1 may comprise a storage 3 in which such information is stored.
  • the storage 3 may, for instance, comprise information subsets 3a comprising data, type, permanence level etc.; object information 3b relating to digital objects 51, 52,..., 5n, e.g., trust, authorization profile, cryptographic keys etc.; and user object relations 3c, e.g., user IDs, previously shared information etc.
  • the processing devices are shown as being included in the user device 1, but one or more of them may be external to the XR enabled device 1.
  • the XR enabled device 1 is illustrated as comprising the storage 3, but in other embodiments, the storage 3 may be external, e.g., an external device or cloud storage, from which the XR enabled device 1 obtains (e.g. receives or requests) the information.
  • the DT 2 and/or the XR rendering component could be external to the XR enabled device 1.
  • the main function of the XR enabled device 1 is to provide XR content to the user, while utilizing, for instance, cloud services for the DT 2, ML models, data storage, etc.
  • digital objects 51, 52,..., 5n within the XR environment may project information to the user directly, possibly after passing through an information filter.
  • Examples of such digital objects 51, 52,..., 5n may, for instance, comprise objects of common interest, such as stores, tourist attractions and billboards, but also objects of interest to the specific individuals such as, for instance, vehicles, buildings and industrial equipment.
  • the digital objects 51, 52,..., 5n may also comprise other individuals in the XR environment.
  • the user device 1 is enabled to interact with the digital objects 51, 52,..., 5n in the XR environment. Such interaction may begin with a handshake where the digital object 51, 52,..., 5n supplies an identifier, a cryptographic proof of the identifier, and an object type.
  • the digital object information is stored, for instance, in the user device 1 together with relation information, such as e.g., previous interactions with the digital object 51, 52,..., 5n .
  • the DT 2 supplies an identifier and may optionally also supply cryptographic proof of the identifier as well as additional information, such as susceptibility to content and which types of information subsets the digital object 51, 52,..., 5n may request (according to the object’s authorization profile, as will be explained later).
  • the identification supplied by the DT 2 may either be persistent or session-unique, a difference being that the latter makes it difficult for the digital object 51, 52,..., 5n to correlate information subsets previously received from the user device 1.
  • the handshake may be initiated by either party and the information may be sent as a single message or divided into several handshake messages.
  • the digital object 51, 52,..., 5n may request information subsets 3a in the categories specified by the user and may also supply content to the user.
  • the information subsets 3a may also be sent as a part of the client’s handshake message.
  • the DT 2 may create a random identifier. Depending on the trust value (described next), the DT 2 may, for subsequent sessions with the object, either choose to generate a new (random) identifier or to keep the same identity. The latter will allow the digital objects 51, 52,..., 5n to inter-connect previously received segments and thereby increase the knowledge of the user.
  • Each digital object 51, 52,..., 5n previously interacted with has a trust level.
  • Each digital object 51, 52,..., 5n may further have a “start value” for its trust level, which may be determined based on how the digital object 51, 52,..., 5n is trusted by other users. In other embodiments, the start value may be a standardized value.
  • the trust level may change over time based on how relevant shown content turns out to be with respect to the user preferences, how relevant the information that the digital object 51, 52,..., 5n requested actually was for the context, how well known the digital object 51, 52,..., 5n is to the user, how other users rate the digital object 51, 52,..., 5n, etc.
  • Such update procedure may, for instance, be done manually or by rules defined by the DT 2.
  • the trust value determines to what extent the digital object 51, 52,..., 5n is trusted to handle data regarding the user. The more permanent an information subset 3a is, the higher the level of trust needed for the user device 1 to supply the information.
  • the trust level may be correlated against pre-set threshold values, specified in the rule sets within the DT 2, where certain thresholds must be exceeded for certain information to be released. These thresholds may, for instance, be determined manually by the user or be incrementally determined by a machine learning model.
  • the thresholds may, in addition to the trust, further take the current context as input.
  • the context may be the current environment in which the interaction takes place as well as the state the user is in, including, for instance, the current cognitive load of the user. Certain contexts may, for example, require an increased trust level before sending a certain information subset; a small sketch of such context-adjusted thresholds follows below.
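  • Purely as an illustrative sketch (the adjustment values and context keys are assumptions, not taken from the disclosure), a context-dependent trust threshold could look like this:

```python
# Hypothetical sketch: a per-subset trust threshold that is raised in
# certain contexts; all adjustment values are illustrative assumptions.
def required_threshold(base_threshold: float, context: dict) -> float:
    threshold = base_threshold
    if context.get("cognitive_load", 0.0) > 0.7:  # user is heavily loaded
        threshold += 0.10
    if context.get("environment") == "public":    # sensitive surroundings
        threshold += 0.05
    return min(threshold, 1.0)

def may_release(object_trust: float, base_threshold: float, context: dict) -> bool:
    """True if the object's trust exceeds the context-adjusted threshold."""
    return object_trust >= required_threshold(base_threshold, context)

print(may_release(0.80, 0.75, {"cognitive_load": 0.2}))  # True
print(may_release(0.80, 0.75, {"cognitive_load": 0.9,
                               "environment": "public"}))  # False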
  • the associated DT 2 may provide none, some (selected) or all information subsets requested by the object 51, 52,..., 5n.
  • the information may be stored in the storage 3, or the user device 1 may, as noted earlier, be enabled to retrieve such information from an external device.
  • the information may, e.g., comprise personal identification, current intent, shopping habits, personal interests, medical information, etc.
  • the aspect of trust vs. information may be understood in the context of, e.g., a second party obtaining information on shoe size, which may only require a low trust level, whereas information related to medical records, etc., may require a high trust level.
  • the user may be presented with XR content (also known as overlay content) received from the external digital objects 51, 52,..., 5n.
  • the digital object 51, 52,..., 5n may increase its trust level by acting in compliance with previously stated information policy rules (and vice versa), by, for instance, requesting only user-related data according to the rules, and providing the user with relevant content rendering.
  • the digital object 51, 52,..., 5n may further belong to a certain entity, e.g. a certain vendor, owner, franchise, etc. In such a case, the digital object 51, 52,..., 5n may share its trust value with other digital objects 51, 52,..., 5n from the same entity.
  • each digital object 51, 52,..., 5n may have an authorization profile determining which information subset it may inquire about.
  • the authorization profile may restrict which information types a digital object 51, 52,..., 5n may request. For instance, a digital object 51, 52,..., 5n in a coffee shop is not authorized to access medical-related information regardless of how high the trust for the digital object 51, 52,..., 5n is.
  • the authorization profile determines what information subsets the digital object 51, 52,..., 5n may request from the user device 1 while the trust level determines what information subsets the DT 2 will supply to the digital object 51, 52,..., 5n.
  • a parameter indicating susceptibility to new content can specify how much content the digital object 51, 52,..., 5n is allowed to supply in response to the information subsets received. If the current cognitive load of the user is high, the information supplied by the DT 2 may reflect that only minimal or no information is of interest at the moment.
  • the DT 2 may lower the trust value for the digital object 51, 52,..., 5n. If a digital object 51, 52,..., 5n repeatedly sends content which is not relevant for the user or requests information which is not relevant or explicitly outside its authorization profile, the trust value may reach a level in which no information is ever supplied to the digital object 51, 52,..., 5n.
  • For a digital object 51, 52,..., 5n to show that it has a certain trust level and authorization for a user, the digital object 51, 52,..., 5n must supply cryptographic proof of having an expected identity.
  • a classic way of establishing such trust is through Public Key Infrastructure (PKI) and asymmetric signatures, where a trusted party endorses the certificate belonging to the key, and a proof of having the key, in the form of a signature, is a proof of identity.
  • VCs, described with reference to Figure 2, are one possible way to utilize asymmetric keys for establishing trust as well as for sharing selected information with a peer.
  • other methods, like a pre-shared symmetric key or a zero-knowledge proof scheme, could also be used; a minimal sketch of a pre-shared-key proof is given below.
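  • As a non-authoritative illustration of the pre-shared symmetric key alternative mentioned above, the digital object could prove possession of the key via an HMAC over a fresh challenge; key distribution is assumed to happen out of band:

```python
# Hypothetical sketch of a pre-shared-key identity proof; the key names
# and message flow are illustrative assumptions.
import hashlib
import hmac
import os

psk = os.urandom(32)  # pre-shared key known to both parties

challenge = os.urandom(16)  # fresh nonce issued by the user device
object_response = hmac.new(psk, challenge, hashlib.sha256).digest()

# The user device recomputes the expected value and compares in
# constant time to confirm the object's identity.
expected = hmac.new(psk, challenge, hashlib.sha256).digest()
print("identity confirmed:", hmac.compare_digest(object_response, expected))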
  • Verifiable Credentials are a standardized way of defining credentials on the web which is cryptographically secure, privacy-respecting and machine-verifiable.
  • a VC is in many ways very similar to a PKI certificate.
  • a VC (similar to a PKI cert) has a set of tamper-evident claims related to the holder of the VC (similar to PKI cert attributes) and credential metadata including data that cryptographically proves who issued the VC (similar to a Certificate Authority, CA, signature).
  • the credential metadata has an identifier and a set of properties such as the issuer, expiry date and time, a public key (similar to a PKI cert public key) for verification purpose, or a revocation mechanism.
  • When dealing with VCs there are issuers, holders, and verifiers, as well as a verifiable data registry. Each issuer, holder and verifier associates a set of public keys to their identifier.
  • the issuer’s public keys are made publicly available, so that the verifier can validate a VC presented by the holder as having been produced by the issuer.
  • the issuer, holder, and verifier use a verifiable data registry which maintains identifiers and schemas.
  • the schemas describe the data structure and content of the VC.
  • the issuer can freely define the schema, i.e., the content of the VC, and the verifier can, by fetching the schema, parse the content. This is one of the big differences compared to PKI certificates, which have a more rigid structure and content.
  • the content carried by the VC can freely be defined and the verifier can, using the schema, understand the content.
  • the verifiable data registry could be a trusted database, decentralized database, government ID database, blockchain, or some other secure and accessible service.
  • An issuer is an organization or entity which issues digitally signed VCs to holders.
  • the holder stores the VC.
  • the holder could be a user or a device.
  • a verifier is an organization or entity which verifies the VC held by the holder.
  • For example, a university (issuer) may issue university certificates (VCs) to students (holders), which a company (verifier) can later verify.
  • a Verifiable Presentation, VP, is basically the VC signed by the VC holder, i.e., signed with the VC holder’s secret key.
  • the VP signature may also cover an authentication challenge received from the verifier, thereby acting as an authentication.
  • the verification scheme is also often extended with zero-knowledge proofs, where the verifier requests the holder to prove that they are in possession of a claim asserting some property. For example, the verifier could request the holder to prove that she is above 18 years of age. Such zero-knowledge proof is constructed such that the holder does not have to reveal the VC itself to the verifier nor their actual age. In the case of the age example, this would allow the student to demonstrate their age being over 18 without revealing their actual age to the verifier.
  • VCs also support selective data disclosure, wherein the holder may choose to provide only a subset of the information of the VC to a verifier, while withholding information that she does not want the verifier to see.
  • a VC is much like a public key certificate; it contains a provable identifier (public key), some information about the holder, and is signed by the issuer (e.g., CA).
  • A VC is, however, more dynamic with regard to what can be stored in it, and the possibility of having zero-knowledge proofs and support for selective data disclosure is taken advantage of according to the present teachings, as will be described in more detail in the following.
  • Fig. 2 illustrates an exemplary implementation of a selective disclosure of information using Verifiable Credentials (VCs) according to the present teachings.
  • the VC 7 comprises a digital identity of the user, ID (e.g. public key, or hash of public key) and a signature Sig generated by the issuer. Data that the issuer has verified is presented in the VC 7 as hash values Hash_A, Hash_B, Hash_ C of the data.
  • If the holder wants to reveal data A (e.g., the holder’s name) to the verifier, he can do so by providing the VC 7 and data A to the digital object 51, 52,..., 5n.
  • the verifier can then generate hash(A), check it against Hash_A, and verify that the signature of the VC 7 is correct, thereby showing that the value A has been verified by the issuer; a simplified sketch of this hash-based selective disclosure follows below.
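  • The following is a simplified, hypothetical sketch of the hash-based selective disclosure described above; note that practical schemes salt each claim hash so that low-entropy values cannot be guessed, and that the VC as a whole would carry the issuer’s signature (omitted here, and all names are illustrative assumptions):

```python
# Simplified, hypothetical sketch of hash-based selective disclosure.
import hashlib
import os

def salted_hash(value: str, salt: bytes) -> str:
    return hashlib.sha256(salt + value.encode()).hexdigest()

# Issuer side: verified claims are salted, hashed and placed in the VC,
# which the issuer then signs (signature handling omitted).
claims = {"A_name": "Alice", "B_phone": "+46...", "C_allergy": "peanuts"}
salts = {k: os.urandom(16) for k in claims}
vc = {"holder_id": "holder-public-key",
      "hashes": {k: salted_hash(v, salts[k]) for k, v in claims.items()}}

# Holder side: reveal only claim A (value plus its salt); Hash_C is
# withheld entirely, so the verifier never learns such data even exists.
revealed = {"A_name": (claims["A_name"], salts["A_name"])}

# Verifier side: recompute the hash and compare against the signed VC.
for key, (value, salt) in revealed.items():
    assert salted_hash(value, salt) == vc["hashes"][key]
print("revealed claim matches the issuer-verified credential")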
  • a VC 7 is a static piece of data containing information about the holder that the issuer has verified and signed. If the user wants to update the VC 7, e.g., change phone number (e.g., a value represented by Hash_B in Figure 2), the user must contact the issuer and request a new updated VC 7 with the new phone number.
  • the issuer needs to verify the identity of the one that requested the update of the VC 7 (i.e., needs to ensure that it is really the VC holder that wants to update his own VC 7) and that the new phone number is correct; an issuer would not want to carelessly issue VCs without verification, much like a CA would not want to issue certificates without verifying that the content is correct. This is thus not a lightweight and fast process.
  • An advantage is that the holder can generate, update, and expand a self-issued VC 7 whenever he likes, as it only entails signing the VC data structure using his own private key; in a yet simpler solution, it is not even signed separately, since when presenting the VC 7 as a VP, the VC 7 is signed by the holder via the VP signature.
  • This also means that the holder is enabled to do selective disclosure of information on multiple levels; he can choose which data is presented as plain data and which as, for instance, a hash of the data to the verifier (as described earlier), and in addition he can choose which data to provide even at hash level to the verifier.
  • For instance, not sending Hash_C to the verifier means that the verifier does not even know that such data is available at/about the holder. It is noted that the provided example of hash values is only one (simplified) exemplary implementation of selective data disclosure; there are various other ways in which selective data disclosure may be implemented.
  • the VC contains a pointer/identifier of the schema of the VC.
  • the issuer would generate the VC and associated schema and publish the schema to the registry.
  • the issuer/holder would also generate the schema. If the issuer/holder has the right to publish the schema to the registry it could do that.
  • the issuer/holder could provide the schema to the verifier together with, or as part of, the VP. The verifier could then use the received schema to interpret the received VC.
  • the verifier can provide the desired schema to be used by the holder.
  • the holder then generates a VC based on the received schema and signs the VC/VP using its private key.
  • the holder can provide additional assurance of the data contained in the VC if the data has also been issued by a “trusted” issuer in a separate VC(s) by also including that part of the data and their respective “trusted” VCs in the VP provided to the verifier, requesting the use of the specific schema.
  • the verifier receives data according to the desired schema and can potentially also verify at least some of the data has been issued by a trusted issuer.
  • One exemplary implementation of selective information sharing according to various embodiments is to utilize the earlier described Verifiable Credentials, VCs.
  • an exemplary implementation according to the present teachings is described next:
  • the DT 2 stores VCs 7 of the user device 1.
  • These VCs 7 may, e.g., comprise a government-issued VC 7 with official name, social security number, etc.
  • the VCs 7 may also be issued by various service providers, e.g., by the store that the user/holder is visiting; the VCs 7 may then contain data such as customer number and name, which the store system can map to purchasing habits and other information about the user collected by the store.
  • the VCs 7 could also be issued by a financial institution, e.g. related to a credit card which can be used for making purchases.
  • the DT 2 can also hold self-signed VCs 7 containing, e.g., temporary data (mood, cognitive load, etc.) or data where no obvious verifier exists (e.g., hobbies).
  • the DT 2 uses the information of the VCs 7 but may also have additional information about the user and policies or algorithms instructing how to behave in certain situations.
  • the DT 2 uses VCs 7 (or VPs) to present information subsets about the user to the digital objects 51, 52,..., 5n.
  • the digital object 51, 52,..., 5n can provide its VCs 7 to the DT 2 and thereby share information about itself.
  • With VCs 7/VPs, user authentication can also be provided, since the VP, which is signed by the VC holder (i.e., the DT 2 or the digital object 51, 52,..., 5n), may comprise a challenge issued by the peer (i.e., the digital object 51, 52,..., 5n or the DT 2, respectively).
  • When the DT 2 and a digital object 51, 52,..., 5n are about to interact, they exchange nonces, which are then carried in the signed VPs, thereby resulting in mutual authentication. This interaction then acts as the handshake described earlier; a minimal sketch follows below.
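  • A minimal, non-authoritative sketch of this nonce exchange is given below; the signed payloads stand in for full VPs (which would also carry the VC content), and the key setup is an assumption for the example:

```python
# pip install cryptography
# Hypothetical sketch of mutual authentication via exchanged nonces
# carried in VP-style signatures.
import os
from cryptography.hazmat.primitives.asymmetric import ed25519

dt_key = ed25519.Ed25519PrivateKey.generate()   # key of the DT 2
obj_key = ed25519.Ed25519PrivateKey.generate()  # key of the digital object

# Each party sends the peer a fresh nonce...
dt_nonce, obj_nonce = os.urandom(16), os.urandom(16)

# ...and each returns a VP-style signature covering the peer's nonce.
dt_signature = dt_key.sign(obj_nonce)
obj_signature = obj_key.sign(dt_nonce)

# Verifying the peer's signature over one's own nonce authenticates the
# peer; doing this in both directions yields mutual authentication.
dt_key.public_key().verify(dt_signature, obj_nonce)   # object verifies DT
obj_key.public_key().verify(obj_signature, dt_nonce)  # DT verifies object
print("mutual authentication succeeded")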
  • the digital object 51, 52,..., 5n would typically share the same set of data with any interacting DT 2, so in the following the focus is on how the DT 2 can limit what data is shared with the digital object 51, 52,..., 5n. However, the digital object 51, 52,..., 5n could utilize the same approach. Initially the DT 2 can use a VC 7 where all or at least most data is blinded by default, e.g. only provided as hash values as described earlier.
  • the DT 2 may then choose to provide the additional data matching the hash values in order to let the digital object 51, 52,..., 5n verify the data about the holder/DT 2.
  • the DT 2 may choose to share certain data based on own policy.
  • the digital object 51, 52,..., 5n may request certain additional data and the DT 2 may then choose which parts of the requested data to share. Sharing the data could be based on initially sharing all data as hashed values and later providing the actual data for verification by the digital object 51, 52,..., 5n.
  • the digital object 51, 52,..., 5n may provide a VC scheme to the DT 2, and the DT 2 may then create a VC 7 with the indicated scheme, possibly leaving some data fields blank if the DT 2 chooses not to reveal the data to the digital object 51, 52,..., 5n.
  • Some of the data that may be shared with the digital object 51, 52,..., 5n could be dynamic data that is not issued by a “trusted” issuer but rather generated at the time of answering the digital object 51, 52,..., 5n. This could, for instance, be the mood of the holder.
  • the digital object 51, 52,..., 5n may choose to put different levels of confidence in data received from the DT 2 based on who has issued the data; data issued by a trusted issuer, which can be verified by the digital object 51, 52,..., 5n, would naturally be more reliable, while “self-signed” data from the DT 2 would carry lesser confidence.
  • data of permanence “Permanent” as described earlier would be issued by a trusted issuer, while temporary data is typically self-signed by the DT 2.
  • In some embodiments, the zero-knowledge proof aspect of the VC 7 could be used, for instance to establish that the DT 2 is allowed to get age-restricted material, e.g., age 18+ content, without revealing the actual age of the DT 2 holder.
  • Other uses of the zero-knowledge proof include proof-of-permission without revealing identity, e.g., showing that you may buy a prescription drug without exposing your identity to a digital pharmacy. Proof-of-access without revealing user identity is another application, where the digital object 51, 52,..., 5n can verify that you are allowed access without needing an identifying membership number or the like.
  • the DT 2 may comprise a manual override function.
  • Such override function may, for instance, be activated when a defined trust level for a particular information subset exceeds a certain threshold value. In such situation, the user may need to actively allow the information subset at hand in order for it to be transmitted by the DT 2.
  • Such authorization may take place by interacting with the XR environment by the user using e.g., voice or movement.
  • the authorization may be performed outside of the XR environment in a two-factor authentication fashion, e.g., the user clicking on an emailed link or using an authentication app on his mobile phone.
  • the manual override may give the user right to supply requested information segments which the digital object 51, 52,..., 5n does not have the trust and/or authorization to receive.
  • the overall stress level or condition of the user, or any other physical trait may also be considered when limiting the scope of the information subset.
  • Fig. 3 is a flow chart of embodiments for interaction between an XR enabled user device 1 and a known digital object 51, 52,..., 5n.
  • the flow starts in box 50, and continues to box 51, in which the XR enabled user device 1 detects a digital object 51, 52,..., 5n in an XR environment. Flow continues to box 52.
  • the user device 1 receives an object identifier and cryptographic proof from the digital object 51, 52,..., 5n.
  • the DT 2 determines that it has seen this digital object 51, 52,..., 5n before, i.e., that it is a previously known digital object 51, 52,..., 5n. Flow continues to decision box 53.
  • In decision box 53 it is determined whether or not the user device 1 is to remain anonymous towards the digital object 51, 52,..., 5n. If no, flow continues to box 54, in which the user device 1 sends a known user identifier to the digital object 51, 52,..., 5n. If yes, flow continues to box 55, wherein the user device 1 sends a random user identifier to the digital object 51, 52,..., 5n, and the user device 1 keeps its anonymity towards the digital object 51, 52,..., 5n. From either box 54 or 55, flow continues to box 56.
  • the XR enabled user device 1 receives a request for an information subset from the digital object 51, 52,..., 5n.
  • the information may be any type of information.
  • the digital object 51, 52,..., 5n may request information on shopping habits, and if allowed to receive such information, it may project relevant goods for sale, i.e. goods that might be of interest for the user as determined by the digital object 51, 52,..., 5n based on the received information. From box 56, flow continues to decision box 57.
  • In decision box 57 it is determined (in the user device 1) whether the digital object 51, 52,..., 5n is allowed to receive information, and/or what type of information to give away. If the digital object 51, 52,..., 5n is authorized to receive the requested information, flow continues to decision box 60; if not, flow continues to box 58.
  • In box 58, the user device 1 rejects the request for information from the digital object 51, 52,..., 5n, as it was determined (in decision box 57) that the object’s authorization profile would not allow it to receive the requested information subset.
  • In decision box 60 it is determined whether or not a trust value of the digital object 51, 52,..., 5n exceeds a threshold set for the requested information subset. That is, the digital object 51, 52,..., 5n needs to have a certain trust value in order to receive some particular information. If this trust value exceeds the set threshold for the particular information, flow continues to box 61, in which the user device 1 provides the requested information subset to the digital object 51, 52,..., 5n. It is noted that if the digital object 51, 52,..., 5n requests multiple pieces of information, it might be that only part of the requested information will actually be shared with the digital object 51, 52,..., 5n. A compact sketch of this decision flow follows below.
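  • As a compact, hypothetical rendering of boxes 56 to 61 of Fig. 3 (the types, thresholds and sample data are assumptions for the sketch only):

```python
# Hypothetical sketch of the decision flow of Fig. 3, boxes 56-61.
from dataclasses import dataclass

@dataclass
class DigitalObject:
    trust: float                     # established trust value
    authorization_profile: set[str]  # information types it may request

def handle_request(obj: DigitalObject, info_type: str,
                   thresholds: dict[str, float], data: dict[str, str]):
    """Return the requested subset, or None if the request is rejected."""
    if info_type not in obj.authorization_profile:  # decision box 57 -> box 58
        return None                                 # not authorized
    if obj.trust < thresholds.get(info_type, 1.0):  # decision box 60
        return None                                 # trust below threshold
    return data.get(info_type)                      # box 61: supply subset

coffee_shop = DigitalObject(trust=0.6, authorization_profile={"consumption"})
thresholds = {"consumption": 0.5, "medical": 0.95}
data = {"consumption": "prefers oat milk", "medical": "allergy X"}
print(handle_request(coffee_shop, "consumption", thresholds, data))  # shared
print(handle_request(coffee_shop, "medical", thresholds, data))      # None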
  • Fig. 4 is a flow chart over steps of a method 40 in the XR enabled device 1 in accordance with the present disclosure.
  • a user in possession of the XR enabled device 1, for instance, wearing it on her head is moving in an XR environment.
  • the user device 1 detects an XR object 51, 52,..., 5n in the XR environment.
  • the detection may, for instance, occur due to geographical proximity between the XR enabled device 1 and the XR object 51, 52,..., 5n, e.g., established by a positioning system, near-field communication, Bluetooth, WiFi or the like.
  • the detection may occur by the XR enabled device 1 receiving the object from another user or entity.
  • the detection may occur by the XR enabled device having a camera recognizing a location, e.g., identifying a coffee shop logo or reading a QR (Quick Response) code or other out-of-band mechanism.
  • the user device 1 receives information from the XR object 51, 52,..., 5n, for instance, receives an identifier, type and cryptographic proof, and in step 42 the user device 1 determines, based on the received information, that it has not seen the XR object 51, 52,..., 5n before.
  • In step 43, the user device 1 stores the information received in step 41, e.g., in a database, and then selects an authorization profile.
  • the selection may be based on any type of selection criteria, e.g., any of the herein given examples.
  • In step 44, the user device 1 establishes a trust level. This may, for instance, be done by an external request to, e.g., a database, or by choosing a standardized trust level value, e.g., as stored within the user device 1, and in particular in the DT 2 thereof.
  • In step 45, the user device 1 may inform the XR object 51, 52,..., 5n about which information subsets it may access.
  • the XR object 51, 52,..., 5n may then request information from any of these information subsets.
  • Fig. 5 is a flow chart over steps of a method in an XR enabled device in accordance with the present disclosure.
  • the method 20 is performed in the XR enabled device 1 for selectively sharing information with an XR object 51, 52,..., 5n.
  • the information to be shared is, in some embodiments, determined by the user. In other embodiments, the information allowed to be shared is determined by policies related to the XR enabled user device 1. For instance, a company owning the user device may impose restrictions on information sharing, e.g., prohibiting some information to be shared.
  • a user wears the XR enabled device 1, e.g., a head-worn gadget or glasses, when moving in an XR environment, e.g. Virtual Reality (VR), Augmented Reality (AR) or Mixed Reality (MR).
  • Examples include: a user interacting with other users and digital objects in a metaverse scenario; a user in a work environment using XR capabilities when performing tasks such as, e.g., maintenance, construction or plumbing; a user participating in an XR game interacting with both virtual and real objects; a visitor interacting with XR-based reenactments of historical events; or a user in a grocery store, using XR capabilities for finding a suitable path, special deals, recommendations, etc.
  • the method 20 comprises performing 22 a handshake procedure with the XR object 51, 52,..., 5n.
  • the handshake procedure may be a known procedure, in which two devices are to establish a connection, for instance comprising setting up various parameters for communication, rules for data sharing, etc., as described earlier.
  • the method 20 comprises receiving 24, from the XR object 51, 52,..., 5n, a request for information related to the user of the XR enabled device 1.
  • the information may comprise information subsets defining the user, comprising for instance one or more of: health related information, consumption related information, work related information, social information, travel related information, current intent, gender, age, medical conditions, allergies, loyalty program membership, political information, gender-interaction preferences, religion, personal preferences, and identity related information (for instance: name, social security number, temporary identity, temporary pseudonym, alternative e-mail entries, alternative addresses, temporary card number and credit rating). It is realized that essentially any type of information may be used for defining a particular user.
  • the method 20 comprises determining 26, by a digital representation 2 of the user, which requested information to send.
  • the digital representation 2 of the user is a specific one for each user.
  • the information to send may be all of the requested information, or only some or none at all.
  • This step comprises determining what specific information that is indeed - if any - allowable for sharing with the specific XR object 51, 52,..., 5n at hand.
  • the digital representation 2 of the user may be required to determine allowability of requested information separately for each XR object 51, 52,..., 5n, as the rules for sharing information may differ between the XR objects 51, 52,..., 5n.
  • the method 20 comprises providing 28, to the XR object 51, 52,..., 5n, information that has been determined in the earlier step to be allowable for sharing with the XR object 51, 52,..., 5n.
  • the method provides many advantages. By letting a digital twin of the user communicate with its surrounding digital environment, the method enables the requesting of information that is relevant for a user in a VR/XR environment, and the user is hence less likely to be burdened by content which is irrelevant for the current context or task. This in turn reduces the risk of the user experiencing a cognitive overload. Further, the method may also reduce the amount of data sent by the XR objects 51, 52,..., 5n, thereby providing optimized radio usage.
  • the determining 26 may be based on one or more of: information subsets defining the user, rules for information distribution, and level of trust relating to the XR object.
  • the rules for information distribution may comprise rules relating to one or more of: XR object identity, XR object type, XR object owner, and a trusted party having verified correctness of attributes.
  • each information subset may have one or both of: individually specified required trust level and individually specified sharing principles.
  • the method may comprise updating the trust levels and sharing principles based on behavior of the XR object 51, 52,..., 5n.
  • the behavior of the XR object may, for instance, comprise previous interaction history, previous rule compliance, out-of-band information (e.g. public social media information), and reviews related to object/company trust and behavior; a small sketch of such a trust update follows below.
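  • A minimal, hypothetical sketch of such a trust update is given below; the step sizes are arbitrary assumptions, not values from the disclosure:

```python
# Hypothetical sketch: nudge a digital object's trust level after an
# interaction, based on rule compliance and content relevance.
def update_trust(trust: float, rule_compliant: bool, relevance: float) -> float:
    """Return the updated trust, clamped to [0.0, 1.0]."""
    delta = 0.05 * relevance if rule_compliant else -0.10
    return max(0.0, min(1.0, trust + delta))

trust = 0.50
trust = update_trust(trust, rule_compliant=True, relevance=0.8)   # 0.54
trust = update_trust(trust, rule_compliant=False, relevance=0.0)  # 0.44
print(round(trust, 2))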
  • the digital representation 2 of the user may search through such a database for information relevant for determining which information is allowable for sharing.
  • the digital representation 2 of the user may, for instance, comprise one of: a machine learning model, a behavioral database or a software defined representation. While decision tree models might be the most intuitive choice for constructing an information distribution mechanism in the digital representation 2, other machine learning models may be considered as well.
  • a neural network may be considered, trained on a dataset containing objects having a certain trust, a certain authorization and an information subset that has been requested.
  • Using a supervised approach (i.e., training with pre-defined labels of what is correct) or a semi-supervised approach (i.e., beginning with pre-defined labels of what is correct and continuing with unlabeled data), such a model may learn the behavior desired by the user; an illustrative decision-tree-based sketch follows below.
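  • As a non-authoritative illustration of the decision-tree approach mentioned earlier, a Random Forest could be trained on the user's past sharing decisions; the feature encoding below is an assumption for the sketch, not specified by the disclosure:

```python
# pip install scikit-learn
# Hypothetical sketch: a Random Forest learning the user's sharing behavior.
from sklearn.ensemble import RandomForestClassifier

# Features: [object trust, authorized for this type (0/1), subset sensitivity]
# Labels reflect the user's past decisions (1 = shared, 0 = declined).
X = [[0.90, 1, 0.2], [0.40, 1, 0.2], [0.90, 0, 0.2],
     [0.95, 1, 0.9], [0.50, 1, 0.8], [0.70, 1, 0.3]]
y = [1, 0, 0, 1, 0, 1]

model = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y)
print(model.predict([[0.8, 1, 0.3]]))  # e.g. array([1]): share the subset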
  • the performing 22 the handshake procedure comprises the digital representation 2 of the user providing an identifier for use in an XR session and one or more parameters related to the information subsets that the XR object 51, 52,..., 5n requests.
  • the digital representation 2 of the user further provides information on how the XR enabled device 1 prefers the XR object 51, 52,..., 5n to act.
  • the DT 2 may, for instance, instruct the XR object 51, 52,..., 5n on how much content to supply, what content the user is interested in, etc.
  • the method 20 comprises, prior to the performing 22 of the handshake procedure, a step wherein the XR enabled device 1 detects the XR object 51, 52,..., 5n. Such detection may, for instance, occur automatically, e.g., when the XR enabled device 1 is sufficiently close to the XR object.
  • the digital representation 2 of the user is located within the XR enabled device 1.
  • the method 20 then comprises obtaining the information related to the user from a storage within the XR enabled device 1.
  • the digital representation 2 of the user is instead located externally of the XR enabled device 1 and the result of the determination 26 on which requested information to send is then received by the XR enabled device 1 from the digital representation of the user.
  • the digital representation 2 of the user may obtain the information related to the user from a storage external to the XR enabled device 1.
  • the digital representation 2 may control information subsets which belong to another digital representation. This might be needed, for instance, in a situation with a child and a parent.
  • a digital representation A may receive a request for information subset I from an object O.
  • the needed trust level exceeds what the digital representation A may decide upon by itself, and it must therefore receive clearance from another digital representation B.
  • the request from O is forwarded to representation B, which in turn reports its decision to digital representation A; a minimal sketch of such delegated clearance follows below.
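  • Purely as an illustrative sketch of this delegation (class and attribute names are hypothetical, and the real decision logic would of course apply the rule sets described earlier rather than a single limit):

```python
# Hypothetical sketch of delegated clearance between digital representations.
class DigitalRepresentation:
    def __init__(self, name: str, decision_limit: float, guardian=None):
        self.name = name
        self.decision_limit = decision_limit  # highest trust level decidable alone
        self.guardian = guardian              # e.g. a parent's representation B

    def clear(self, required_trust: float) -> bool:
        if required_trust <= self.decision_limit:
            return True  # within own mandate (real logic would apply rule sets)
        if self.guardian is not None:
            # Forward the request and report back the guardian's decision.
            return self.guardian.clear(required_trust)
        return False

representation_b = DigitalRepresentation("B (parent)", decision_limit=0.9)
representation_a = DigitalRepresentation("A (child)", decision_limit=0.3,
                                         guardian=representation_b)
print(representation_a.clear(0.7))  # True: cleared by representation B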
  • a user may be represented by multiple digital representations.
  • For instance, a private digital representation and a work digital representation, which may comprise different rules and trust relations.
  • a person’s professional digital representation in duty as a police officer may, during her operation of a vehicle, hold some principles of interactions and associated trust within a law enforcement information infrastructure, whereas the same physical person potentially operating the same vehicle along same streets but in the role of a civilian may not have access to such law enforcement information infrastructure.
  • a service technician engaged in a service task in a hazardous environment, the user context, cognitive load etc. could escalate the rule settings to ensure that only information relevant for the task is displayed.
  • Such rule setting would thus alleviate the service technician from receiving anything but the most important information, which in turn may reduce, for instance, the cognitive load of the technician.
  • the method as described in different embodiments may be used for sharing information regarding, for instance, allergies, and thereby creating a personalized menu for the user.
  • certain hardware, equipment or device may be associated with implicit roles and hence use specific digital representations; for example, the use of a certain physical object that implicitly carries some specific level of user authentication (badge, ID card, gun, device (smartwatch), etc.) may assign a certain digital representation to the user/wearer, etc.
  • the XR enabled device 1 is one or more of: a near-eye display (XR headset, smart glasses, smart lenses, etc.), user equipment (mobile phone, tablet, laptop, etc.), a heads-up display (smart glass projection screen, such as a windshield or window), a hologram projector or the like.
  • Fig. 6 schematically illustrates, in terms of a number of functional units, the components of a user device 1 according to an embodiment.
  • Processing circuitry 110 is provided using any combination of one or more of a suitable central processing unit (CPU), multiprocessor, microcontroller, digital signal processor (DSP), etc., capable of executing software instructions stored in a computer program product 330 (as in Fig. 8), e.g., in the form of a storage medium 130.
  • the processing circuitry 110 may further be provided as at least one application specific integrated circuit (ASIC), or field programmable gate array (FPGA).
  • the processing circuitry 110 is configured to cause the user device 1 to perform a set of operations, or actions, as disclosed herein.
  • the storage medium 130 may store the set of operations, and the processing circuitry 110 may be configured to retrieve the set of operations from the storage medium 130 to cause the user device 1 to perform the set of operations.
  • the set of operations may be provided as a set of executable instructions.
  • the processing circuitry 110 is thereby arranged to execute embodiments of the methods as disclosed herein.
  • the storage medium 130 may also comprise persistent storage, which, for example, can be any single one or combination of magnetic memory, optical memory, solid state memory or even remotely mounted memory.
  • the user device 1 may further comprise a communications interface 120 for communications with other entities, functions, nodes, and devices, over suitable interfaces.
  • the communications interface 120 may comprise one or more transmitters and receivers, comprising analogue and digital components.
  • the processing circuitry 110 controls the general operation of the user device 1, e.g., by sending data and control signals to the communications interface 120 and the storage medium 130, by receiving data and reports from the communications interface 120, and by retrieving data and instructions from the storage medium 130.
  • Other components, as well as the related functionality, of the user device 1 are omitted in order not to obscure the concepts presented herein.
  • Fig. 7 is a schematic diagram showing functional modules of a user device according to an embodiment.
  • the user device 1 of Fig. 7 comprises a number of functional modules: a perform module 210 configured to perform a handshake procedure with the XR object 51, 52,..., 5n, a receive module 220 configured to receive, from the XR object 51, 52,..., 5n, a request for information related to the user of the XR enabled device 1, a determine module 230 configured to determine, based on output from a digital representation 2 of the user, which requested information to send, and a provide module 240 configured to provide, to the XR object 51, 52,..., 5n, information determined to be allowable for the XR object 51, 52,..., 5n.
  • the user device 1 of Fig. 7 may further comprise a number of optional functional modules (not illustrated), such as any of an update module, configured to update the trust levels and sharing principles based on behavior of the XR object 51, 52,..., 5n, a monitor module 250 configured to monitor data relating to the user, a process module 260 configured to process data, a provide module 270 configured to provide processed data to a policy entity, and an update module 280 configured to update policies based on processed data.
  • each such functional module may be implemented in hardware or in software.
  • one or more or all functional modules may be implemented by the processing circuitry 110, possibly in cooperation with the communications interface 120 and the storage medium 130.
  • the processing circuitry 110 may thus be arranged to fetch instructions, as provided by a functional module, from the storage medium 130 and to execute these instructions, thereby performing any actions of the user device 1 as disclosed herein.
  • Fig. 8 shows one example of a computer program product comprising computer readable means according to an embodiment.
  • a computer program 320 can be stored, which computer program 320 can cause the processing circuitry 110 and thereto operatively coupled entities and devices, such as the communications interface 120 and the storage medium 130, to execute methods according to embodiments described herein.
  • the computer program 320 and/or computer program product 330 may thus provide means for performing any actions of the user device 1 as disclosed herein.
  • the computer program product 330 is illustrated as an optical disc, such as a CD (compact disc) or a DVD (digital versatile disc) or a Blu-Ray disc.
  • the computer program product 330 could also be embodied as a memory, such as a random-access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM), or an electrically erasable programmable read-only memory (EEPROM) and more particularly as a non-volatile storage medium of a device in an external memory such as a USB (Universal Serial Bus) memory or a Flash memory, such as a compact Flash memory.
  • RAM random-access memory
  • ROM read-only memory
  • EPROM erasable programmable read-only memory
  • EEPROM electrically erasable programmable read-only memory
  • while the computer program 320 is here schematically shown as a track on the depicted optical disc, the computer program 320 can be stored in any way which is suitable for the computer program product 330.
  • An Extended Reality, XR, enabled device 1 for selectively sharing information with an XR object 51, 52,..., 5n, is provided.
  • the XR enabled device 1 is configured to: perform a handshake procedure with the XR object 51, 52,..., 5n; receive, from the XR object 51, 52,..., 5n, a request for information related to the user of the XR enabled device 1; determine, by a digital representation 2 of the user, which requested information to send; and provide, to the XR object 51, 52,..., 5n, information determined to be allowable for sharing with the XR object 51, 52,..., 5n.
  • the Extended Reality, XR, enabled device 1 is configured to determine information to be allowable for sharing based on one or more of: information subsets defining the user, rules for information distribution, and level of trust relating to the XR object.
  • the information subsets defining the user comprise one or more of: health related information, consumption related information, work related information, social information, travel related information, current intent, gender, age, medical conditions, allergies, loyalty program membership, political information, gender-interaction preferences, religion, personal preferences, identity related information and credit rating.
  • the rules for information distribution comprise rules relating to one or more of: XR object identity, XR object type, XR object owner, and a trusted party having verified correctness of attributes.
  • each information subset has one or both of: individually specified required trust level and individually specified sharing principles.
  • the Extended Reality, XR, enabled device 1 is configured to update the trust levels and sharing principles based on behavior of the XR object 51, 52,..., 5n.
  • the digital representation 2 of the user comprises one of: a machine learning model, a behavioral database or software defined representation.
  • the XR enabled device 1 is configured to perform the handshake procedure by the digital representation 2 of the user providing an identifier for use in an XR session and one or more parameters related to the information subsets that the XR object 51, 52,..., 5n requests and on how the XR enabled device 1 prefers the XR object 51, 52,..., 5n to act.
  • the XR enabled device 1 is configured to, prior to the performing of the handshake procedure, detect the XR object 51, 52,..., 5n.
  • in one set of embodiments, the digital representation 2 of the user is located within the XR enabled device 1, while in a different set of embodiments, the digital representation 2 of the user is located external to the XR enabled device 1, in which case the result of the determining 26 which requested information to send is received from the digital representation of the user.
  • the XR enabled device 1 is configured to obtain the information related to the user from a storage in the XR enabled device 1.
  • the XR enabled device 1 is configured to obtain the information related to the user from a storage external to the XR enabled device 1.
  • the XR enabled device 1 is one or more of: near-eye display (XR headset, smart glasses, smart lenses etc.), user equipment (mobile phone, tablet, laptop, etc.), heads-up display (smart glass projection screen, such as windshield or window), hologram projector or the like. It is noted that the XR enabled device 1 may be constructed in many different forms.

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Security & Cryptography (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Computing Systems (AREA)
  • Software Systems (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Computer Hardware Design (AREA)
  • Health & Medical Sciences (AREA)
  • Mathematical Physics (AREA)
  • Artificial Intelligence (AREA)
  • Evolutionary Computation (AREA)
  • Medical Informatics (AREA)
  • Data Mining & Analysis (AREA)
  • Bioethics (AREA)
  • General Health & Medical Sciences (AREA)
  • Computational Linguistics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Databases & Information Systems (AREA)
  • Power Engineering (AREA)
  • Storage Device Security (AREA)

Abstract

A method in an Extended Reality, XR, enabled device is provided for selectively sharing information with an XR object. The method is performed in the XR enabled device and comprises: performing a handshake procedure with the XR object; receiving, from the XR object, a request for information related to the user of the XR enabled device; determining, by a digital representation of the user, which requested information to send; and providing, to the XR object, information determined to be allowable for sharing with the XR object. A corresponding XR enabled device, computer program and computer program products are also provided.

Description

Methods and devices for selective sharing of information in Extended Reality
Technical field
The technology disclosed herein relates generally to the field of protecting information distribution in an Extended Reality, XR, environment, and in particular to means and methods for selective sharing of information.
Background
Extended reality (XR) is a term referring to all real-and-virtual environments and to human-machine interactions generated by computer technology and wearables. Virtual Reality (VR), Augmented Reality (AR) and Mixed Reality (MR) are examples of XR, where the "X" of XR is a variable indicating any current or future spatial computing technology.
There is a vast amount of information that can be received by a user interacting in an XR environment. Much of this information may be irrelevant for the user, and the excess information may affect the user experience negatively. In many use cases XR devices are needed for presenting the information to the user, but since the user interaction for XR devices is limited there may be a risk of cognitive overload. Therefore, some information filtering may be necessary; but manual filtering, such as, for instance, removal of certain content in XR, is often cumbersome.
Several parties, such as vendors, advertisers etc., may request information about the user moving in the XR environment and it may be difficult for the user to interact with each new object of such parties in order to decide what to share with whom. This situation is similar to choosing cookies for every new web site visited. What information the user wishes to share may further be affected by the current environment, user’s context, user’s cognitive load etc.
Furthermore, in a situation where not all parties are part of the same information sharing system, such as when interacting with different digital objects in an XR setting, the user does not have the possibility of knowing beforehand which objects they will interact with. There is thus a need for an efficient way for a user in an XR environment to share information with different XR providers.
Summary
An objective of the present disclosure is to address and improve various aspects of information sharing in an XR environment. A particular objective is to ensure that an XR enabled device only shares information that has been allowed to be shared, as determined, for instance, by the user. Another particular objective is to provide means and methods for efficient information sharing, while also protecting information from unauthorized sharing. Still another objective is to enable safe information sharing between different users who use separate information delivery systems without any implicit trust therebetween. Yet another objective is to reduce unnecessary rendering processing and processing load in any conveying network, by allowing rendering of only such content as will be allowed by a user. These objectives and others are achieved by the methods, devices, computer programs and computer program products according to the appended independent claims, and by the embodiments according to the dependent claims.
According to a first aspect there is presented a method in an Extended Reality, XR, enabled device for selectively sharing information with an XR object. The method is performed in the XR enabled device and comprises performing a handshake procedure with the XR object; receiving, from the XR object, a request for information related to the user of the XR enabled device; determining, by a digital representation of the user, which requested information to send; and providing, to the XR object, information determined to be allowable for sharing with the XR object.
According to a second aspect there is presented a computer program for an Extended Reality, XR, enabled device for selectively sharing information with an XR object. The computer program comprises computer program code, which, when executed on at least one processor on the XR enabled device causes the XR enabled device to perform the method according to the first aspect.
According to a third aspect there is presented a computer program product comprising a computer program according to the second aspect and a computer readable means on which the computer program is stored. According to a fourth aspect there is presented an XR enabled device for selectively sharing information with an XR object. The XR enabled device is configured to perform a handshake procedure with the XR object; receive, from the XR object, a request for information related to the user of the XR enabled device; determine, by a digital representation of the user, which requested information to send; and provide, to the XR object, information determined to be allowable for sharing with the XR object.
Advantageously, these aspects enable an improved granularity of a user's privacy in an XR environment. The digital representation of the user determines what type of information is relevant to provide in a certain scenario and adapts it according to various parameters, such as, for instance, personal preferences, type of information (e.g., permanence, first/second person damage if misused, etc.) and past behavior of external entities.
Advantageously, these aspects are suitable for situations wherein information presented in an XR device, e.g. smart glasses, is needed and there is also a risk of cognitive overload. In one aspect, the user may be unable or unsuited to manually interact with the content in the XR environment, e.g., when driving a car or performing a task where both hands are occupied.
Other objectives, features and advantages of the enclosed embodiments will be apparent from the following detailed disclosure, from the attached dependent claims as well as from the drawings.
Generally, all terms used in the claims are to be interpreted according to their ordinary meaning in the technical field, unless explicitly defined otherwise herein. All references to "a/an/the element, apparatus, component, means, module, action, etc." are to be interpreted openly as referring to at least one instance of the element, apparatus, component, means, module, action, etc., unless explicitly stated otherwise. The actions of any method disclosed herein do not have to be performed in the exact order disclosed, unless explicitly stated.
Brief description of the drawings
Figure 1 illustrates schematically a user device in which embodiments of the present disclosure may be implemented.
Figure 2 illustrates an exemplary implementation of a selective disclosure of information using VCs.
Figure 3 is a flow chart of embodiments for interaction between user device and a known digital object.
Figures 4 and 5 are flowcharts of various embodiments of a method.
Fig. 6 is a schematic diagram showing functional units of a device according to an embodiment.
Fig. 7 is a schematic diagram showing functional modules of a device according to an embodiment.
Fig. 8 shows one example of a computer program product comprising computer readable means according to an embodiment.
Detailed description
The inventive concept will now be described more fully hereinafter with reference to the accompanying drawings, in which certain embodiments of the inventive concept are shown. This inventive concept may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein; rather, these embodiments are provided by way of example so that this disclosure will be thorough and complete, and will fully convey the scope of the inventive concept to those skilled in the art. Like numbers refer to like elements throughout the description. Any action or feature illustrated by dashed lines should be regarded as optional.
Briefly, according to the present teachings a digital twin is used as a representation of a user in terms of information subsets. Instead of digital objects of an XR environment delivering content directly to the user, the user's digital twin communicates with the digital objects to steer the content of the digital objects towards the user's preferences. In order to achieve this, the digital objects may request certain information subsets about the user. Such information subsets may comprise details ranging from mood and current intent, to gender and age, to medical conditions, allergies, BMI and credit rating, and so on.
Fig. 1 illustrates schematically a user device 1 in which embodiments of the present disclosure may be implemented. According to various aspects of the present teachings, a user has a user device 1 for interacting with an XR environment; herein the user device 1 is interchangeably denoted "XR enabled device 1". The user is in an XR environment where different digital objects 51, 52,..., 5n (interchangeably denoted "XR objects 51, 52,..., 5n" herein) want to interact with and/or present information to the user. The user interacts with these digital objects using the XR enabled device 1, which may, for instance, be a near-eye display (such as an XR headset, smart glasses, smart lenses, etc.), user equipment (mobile phone, tablet, laptop, etc.), hologram projector or the like.
According to an aspect, a digital twin (DT) 2 representation of a user is used. In a very broad aspect, DTs typically model entities in the physical environment, such as non-human physical objects (vehicles, buildings, cellular communication networks, logical or physical clusters of IoT sensors, and many more), as well as digital copies/instances representing, e.g., a combination of a physical person's bio-data attributes, her associated devices and their respective digital states.
In the latter case, consider a person carrying her smartphone, which holds data such as mail, a contact list, calendar entries, purchase patterns, etc., and some installed applications, all of which together constitute at least a partial copy of the person's behavior, relations and, e.g., purchase behavior. If the person also carries a smartwatch that provides sensor information on heart rate and blood pressure, and possibly also, e.g., smart glasses, these may further reveal where the user looks/gazes, perhaps in conjunction with iris-detection. A DT 2 of the physical person may be considered as all digital states associated with the user's biometry and her associated smart devices. The DT 2 is, in the digital domain, a digital copy of a physical asset, system or device.
In a further aspect, the DT 2 may also be considered as the above-mentioned set of digital states in combination with an ML model trained on user sensor states, etc., that may "respond to a question" in the same way as the physical user would respond (optimally; in practice at least sufficiently similarly), given the current set of sensor states.
The digital twin representation of the user may utilize information subsets 3a comprising digital representations of physical and mental traits of the user. An information subset 3a may comprise one or more information elements. The information subset(s) 3a may in turn be used as input to a decision engine 2a, comprising software or reconfigurable logic executing, for instance, a series of manually configured if-else-then statements. These statements may be stored as a static rule set 2d or as a dynamic rule set 2c, which may be utilized by the decision engine 2a. Static indicates that the rules are based on permanent traits of the user, while dynamic indicates that, for instance, the current context or cognitive load may alter the rule. In other embodiments, an incrementally configured machine learning (ML) model based on decision trees, such as, for instance, Random Forest, may be used as the decision engine 2a. Other ML models may also be used, which are described later in relation to various embodiments. The DT 2 may further comprise pre-defined parts, containing permanent traits of the user, and it may receive input, e.g., regarding the current context and cognitive load of the user. A minimal sketch of such a rule-based decision engine is given below.
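The following Python sketch illustrates one way the decision engine 2a could evaluate static and dynamic rule sets; the names Request, Rule and DecisionEngine, and the default-deny policy, are illustrative assumptions rather than anything mandated by the disclosure.

    # Minimal sketch of a rule-based decision engine (hypothetical structures).
    from dataclasses import dataclass, field
    from typing import Callable, Optional

    @dataclass
    class Request:
        subset: str          # e.g. "shopping_habits"
        object_trust: float  # trust level of the requesting XR object
        context: dict = field(default_factory=dict)  # e.g. {"cognitive_load": 0.8}

    # A rule maps a request to a decision (True = share), or None if it abstains.
    Rule = Callable[[Request], Optional[bool]]

    class DecisionEngine:
        def __init__(self, static_rules: list[Rule], dynamic_rules: list[Rule]):
            self.static_rules = static_rules    # based on permanent traits of the user
            self.dynamic_rules = dynamic_rules  # context/cognitive-load dependent

        def decide(self, request: Request) -> bool:
            # Dynamic rules may alter the outcome, so they are consulted first.
            for rule in self.dynamic_rules + self.static_rules:
                verdict = rule(request)
                if verdict is not None:
                    return verdict
            return False  # assumed default-deny when no rule matches

    # Example rules:
    never_share_medical: Rule = lambda r: False if r.subset == "medical" else None
    deny_when_overloaded: Rule = (
        lambda r: False if r.context.get("cognitive_load", 0) > 0.7 else None)

A manually configured engine of this kind corresponds to the if-else-then statements mentioned above; an ML-based engine would replace the rule loop with a trained model.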
Information subsets may be assigned different attributes. Two attributes which may define these information subsets are permanence and type. The information subsets 3a may be permanent, semi-permanent or temporary. Permanent information comprises personal identifiers, chronic medical conditions, etc.; semi-permanent information comprises slow-changing attributes such as consumption habits, age, weight and height; while temporary information comprises current intent, hunger, susceptibility to new information, cognitive load, etc. "Type", on the other hand, is a classification of what the information subset contains information about, e.g., medical-related, work-related, consumption-related or travel-related.
Another example of a category may be "Sensitivity", which may range from information sharable with anyone, e.g., "user likes carrots", to highly sensitive personal information, e.g., "user has chronic disease X". Still another exemplary category may be "Exposure cost", which may comprise a "cost" (e.g., an estimated amount of money, time, etc.) for rectifying unwanted exposure of information. These attributes could be modelled as sketched below.
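As a sketch only, the attributes named above (permanence, type, sensitivity, exposure cost) could be carried alongside the information elements of a subset 3a; the field names and the numeric scales are assumptions introduced for illustration.

    # Hypothetical model of an information subset 3a and its attributes.
    from dataclasses import dataclass
    from enum import Enum

    class Permanence(Enum):
        PERMANENT = "permanent"            # e.g. personal identifiers, chronic conditions
        SEMI_PERMANENT = "semi-permanent"  # e.g. consumption habits, age, weight
        TEMPORARY = "temporary"            # e.g. current intent, cognitive load

    @dataclass
    class InformationSubset:
        name: str             # e.g. "allergies"
        elements: list[str]   # one or more information elements
        permanence: Permanence
        type: str             # e.g. "medical-related", "consumption-related"
        sensitivity: int      # assumed scale: 0 = sharable with anyone .. 10 = highly sensitive
        exposure_cost: float  # estimated cost of rectifying unwanted exposure

    likes_carrots = InformationSubset(
        "preferences", ["user likes carrots"],
        Permanence.SEMI_PERMANENT, "consumption-related",
        sensitivity=0, exposure_cost=0.0)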
The user’s DT 2 may be requested to provide one or more of the information subsets 3a (each information subset comprising a single information element or several information elements) relating to the user and may further store trust levels and authorization types associated with certain information subsets. The user device 1 may comprise a storage 3 in which such information is stored. The storage 3 may, for instance, comprise information subsets 3a comprising data, type, permanence level etc.; object information 3b relating to digital objects 51, 52,..., 5n, e.g., trust, authorization profile, cryptographic keys etc.; and user object relations 3c, e.g., user IDs, previously shared information etc.
In the illustrated case, the processing devices are shown as being included in the user device 1, but one or more of them may be external to the XR enabled device 1. For instance, the XR enabled device 1 is illustrated as comprising the storage 3, but in other embodiments, the storage 3 may be external, e.g., an external device or cloud storage, from which the XR enabled device 1 obtains (e.g. receives or requests) the information. Likewise, the DT 2 and/or the XR rendering component could be external to the XR enabled device 1. In such cases, wherein the processing is performed externally, the main function of the XR enabled device 1 (e.g., smart glasses) is to provide XR content to the user, utilizing, for instance, cloud services for the DT 2, ML models, data storage, etc.
In a contemporary XR environment, digital objects 51, 52,..., 5n within the XR environment may project information to the user directly, possibly after passing through an information filter. Examples of such digital objects 51, 52,..., 5n may, for instance, comprise objects of common interest, such as stores, tourist attractions and billboards, but also objects of interest to the specific individuals such as, for instance, vehicles, buildings and industrial equipment. The digital objects 51, 52,..., 5n may also comprise other individuals in the XR environment.
The user device 1 is enabled to interact with the digital objects 51, 52,..., 5n in the XR environment. Such interaction may begin with a handshake where the digital object 51, 52,..., 5n supplies an identifier, a cryptographic proof of the identifier, and an object type. The digital object information is stored, for instance, in the user device 1 together with relation information, such as, e.g., previous interactions with the digital object 51, 52,..., 5n. The DT 2 supplies an identifier and may optionally also supply a cryptographic proof of the identifier and additional information, such as susceptibility to content and which types of information subsets the digital object 51, 52,..., 5n may request (according to the object's authorization profile, as will be explained later). The identifier supplied by the DT 2 may either be persistent or session-unique, a difference being that the latter makes it difficult for the digital object 51, 52,..., 5n to correlate information subsets previously received from the user device 1. The handshake may be initiated by either party and the information may be sent as a single message or divided into several handshake messages.
When this handshake procedure is completed, the digital object 51, 52,..., 5n may request information subsets 3a in the categories specified by the user and may also supply content to the user. In some embodiments, the information subsets 3a may also be sent as a part of the client's handshake message. A sketch of the exchange follows.
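The following sketch shows one plausible shape of the two handshake messages under the stated assumptions; the message layout, field names and the helper functions object_hello and dt_hello are illustrative, not part of the disclosure.

    # Hypothetical handshake messages between a digital object and the DT.
    import secrets

    def object_hello(object_id: str, proof: bytes, object_type: str) -> dict:
        # The object supplies identifier, cryptographic proof and object type.
        return {"id": object_id, "proof": proof, "type": object_type}

    def dt_hello(persistent_id: str, allowed_categories: list[str],
                 susceptibility: float, anonymous: bool) -> dict:
        # A session-unique identifier prevents the object from correlating
        # information subsets received in earlier sessions.
        session_id = secrets.token_hex(16) if anonymous else persistent_id
        return {
            "id": session_id,
            "requestable_subsets": allowed_categories,  # per authorization profile
            "susceptibility": susceptibility,  # how much content the DT accepts
        }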
When interacting with a digital object 51, 52,..., 5n for the first time, the DT 2 may create a random identifier. Depending on the trust value (described next), the DT 2 may, for subsequent sessions with the object, either choose to generate a new (random) identifier or to keep the same identity. The latter will allow the digital object 51, 52,..., 5n to inter-connect previously received segments and thereby increase its knowledge of the user.
Each digital object 51, 52,..., 5n previously interacted with has a trust level. Each digital object 51, 52,..., 5n may further have a "start value" for the trust level, which may be determined based on how the digital object 51, 52,..., 5n is trusted by other users. In other embodiments, the start value may be a standardized value. The trust level may be changed over time based on how relevant content is shown to be with respect to the user preferences, how relevant the information that the digital object 51, 52,..., 5n requested actually was for the context, how well-known the digital object 51, 52,..., 5n is to the user, how other users rate the digital object 51, 52,..., 5n, etc. Such an update procedure may, for instance, be done manually or by rules defined by the DT 2.
The trust value determines to what extent the digital object 51, 52,..., 5n is trusted to handle data regarding the user. The more permanent an information subset 3a is, the higher the level of trust needed for the user device 1 to supply the information.
The trust level may be correlated against pre-set threshold values, specified in the rule sets within the DT 2, where certain thresholds must be exceeded for certain information to be released. These thresholds may, for instance, be determined manually by the user or be incrementally determined by a machine learning model. The thresholds may, in addition to the trust, further take the current context as input. The context may be the current environment in which the interaction takes place as well as the state the user is in, including, for instance, the current cognitive load of the user. Certain contexts may, for example, require an increased trust level before sending a certain information subset, as in the sketch below.
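A minimal sketch of the threshold comparison, assuming illustrative threshold values and an assumed context adjustment (a "public place" flag raising the bar); none of the concrete numbers are from the disclosure.

    # Hypothetical per-subset trust thresholds; more permanent data -> higher bar.
    BASE_THRESHOLDS = {
        "current_intent": 0.2,   # temporary
        "shopping_habits": 0.5,  # semi-permanent
        "medical": 0.9,          # permanent
    }

    def release_allowed(subset: str, object_trust: float, context: dict) -> bool:
        threshold = BASE_THRESHOLDS.get(subset, 1.0)  # default-deny unknown subsets
        # Certain contexts may require an increased trust level.
        if context.get("public_place"):
            threshold = min(1.0, threshold + 0.1)
        return object_trust >= threshold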
Depending on the level of trust that the user has towards the requesting object 51, 52,..., 5n, the associated DT 2 may provide none, some (selected) or all of the information subsets requested by the object 51, 52,..., 5n. The information may be stored in the storage 3, or the user device 1 may, as noted earlier, be enabled to retrieve such information from an external device. The information may, e.g., comprise personal identification, current intent, shopping habits, personal interests, medical information, etc. The aspect of trust vs. information may, for example, be understood in the context of a second party obtaining information on shoe size, which may only require a low trust level, whereas information related to medical records, etc. may require a high trust level.
Based on the provided preferences, in terms of what information the user has trusted the digital objects 51, 52,..., 5n to receive, the user may be presented with XR content (also known as overlay content) received from the external digital objects 51, 52,..., 5n.
In some embodiments, the digital object 51, 52,..., 5n may increase its trust level by acting in compliance with previously stated information policy rules (and vice versa), by, for instance, requesting only user-related data according to the rules and providing the user with relevant content rendering.
The digital object 51, 52,..., 5n may further belong to a certain entity, e.g. a certain vendor, owner, franchise, etc. In such a case, the digital object 51, 52,..., 5n may share its trust value with other digital objects 51, 52,..., 5n from the same entity.
Furthermore, each digital object 51, 52,..., 5n may have an authorization profile determining which information subsets it may inquire about. The authorization profile may restrict which information types a digital object 51, 52,..., 5n may request. For instance, a digital object 51, 52,..., 5n in a coffee shop is not authorized to access medical-related information regardless of how high the trust for the digital object 51, 52,..., 5n is.
Hence, the authorization profile determines what information subsets the digital object 51, 52,..., 5n may request from the user device 1 while the trust level determines what information subsets the DT 2 will supply to the digital object 51, 52,..., 5n. As a part of the information subsets supplied from the DT 2, a parameter indicating susceptibility to new content can specify how much content the digital object 51, 52,..., 5n is allowed to supply in response to the information subsets received. If the current cognitive load of the user is high, the information supplied by the DT 2 may reflect that only minimal or no information is of interest at the moment. If a digital object 51, 52,..., 5n does not comply with the supplied information, the DT 2 may lower the trust value for the digital object 51, 52,..., 5n. If a digital object 51, 52,..., 5n repeatedly sends content which is not relevant for the user or requests information which is not relevant or explicitly outside its authorization profile, the trust value may reach a level in which no information is ever supplied to the digital object 51, 52,..., 5n.
For a digital object 51, 52,..., 5n to show that it has a certain trust level and authorization for a user, the digital object 51, 52,..., 5n must supply cryptographical proof of having an expected identity. A classic way of establishing such trust is through Public Key Infrastructure (PKI) and asymmetric signatures, where a trusted party endorses the certificate belonging to the key, and a proof of having the key in form of signature is a proof of identity. VCs, described with reference to figure 2, are one possible way to utilize asymmetric keys for establishing trust as well as for sharing selected information with a peer. In lieu of using an asymmetric key to establish trust, other methods like a pre-shared symmetric key or a zero-knowledge proof scheme could also be used.
Verifiable Credentials (VCs) are a standardized way of defining credentials on the web which are cryptographically secure, privacy respecting and machine verifiable. A VC is in many ways very similar to a PKI certificate. A VC (similar to a PKI cert) has a set of tamper-evident claims related to the holder of the VC (similar to PKI cert attributes) and credential metadata, including data that cryptographically proves who issued the VC (similar to a Certificate Authority, CA, signature). The credential metadata has an identifier and a set of properties such as the issuer, expiry date and time, a public key (similar to a PKI cert public key) for verification purposes, or a revocation mechanism.
When dealing with VCs there are issuers, holders, and verifiers, as well as a verifiable data registry. Each issuer, holder and verifier associates a set of public keys to their identifier. The issuer's public keys are made publicly available, so that the verifier can validate a VC presented by the holder as having been produced by the issuer.
The issuer, holder, and verifier use a verifiable data registry which maintains identifiers and schemas. The schemas describe the data structure and content of the VC. The issuer can freely define the schema, i.e., content of VC, and the verifier can, by fetching the schema, parse the content. This is one of the big differences compared to PKI certificates, which have more rigid structure and content. With VC the content carried by the VC can freely be defined and the verifier can, using the schema, understand the content. The verifiable data registry could be a trusted database, decentralized database, government ID database, blockchain, or some other secure and accessible service.
An issuer is an organization or entity which issues digitally signed VCs to holders. The holder stores the VC. The holder could be a user or a device. A verifier is an organization or entity which verifies the VC held by the holder. For example: a university (issuer) issues university certificates (VC) to its students (holders). A company (verifier) at which a student applies for a job verifies that the VC has been issued by the university. When a holder of the digitally signed VC presents her VC to a verifier, it may be done in the format of a verifiable presentation (VP). A VP is basically the VC signed by the VC holder, i.e., with the VC holder's secret key; the VP signature may also cover an authentication challenge received from the verifier, thereby acting as an authentication.
The verification scheme is also often extended with zero-knowledge proofs, where the verifier requests the holder to prove that they are in possession of a claim asserting some property. For example, the verifier could request the holder to prove that she is above 18 years of age. Such a zero-knowledge proof is constructed such that the holder does not have to reveal the VC itself to the verifier, nor their actual age. In the case of the age example, this would allow the student to demonstrate their age being over 18 without revealing their actual age to the verifier. VCs also support selective data disclosure, wherein the holder may choose to provide only a subset of the information of the VC to a verifier and withhold information that she does not want the verifier to see. In summary, a VC is much like a public key certificate; it contains a provable identifier (public key), some information about the holder, and is signed by the issuer (e.g., CA). The difference is that a VC is more dynamic with regards to what can be stored in it, with the possibility of having zero-knowledge proofs and support for selective data disclosure; this is taken advantage of according to the present teachings, as will be described in more detail in the following.
Fig. 2 illustrates an exemplary implementation of a selective disclosure of information using Verifiable Credentials (VCs) according to the present teachings. The VC 7 comprises a digital identity of the user, ID (e.g. a public key, or hash of a public key), and a signature Sig generated by the issuer. Data that the issuer has verified is presented in the VC 7 as hash values Hash_A, Hash_B, Hash_C of the data. When the user uses the VC 7 towards a verifier, it can provide the VC 7 and then the verifier only learns the ID of the holder. However, if the holder wants to reveal data A (e.g., the holder name) to the verifier, he can do it by providing the VC 7 and data A to the digital object 51, 52,..., 5n. The verifier can then generate hash(A) and verify with Hash_A that the signature of the VC 7 is correct, thereby showing that the value A has been verified by the issuer. A simplified sketch of this mechanism follows.
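As the text itself notes, the hash-value example is a simplification; the sketch below mirrors it directly. Production VC suites use salted commitments rather than plain unsalted hashes, and the issuer signature is omitted here; the structure and names are illustrative.

    # Simplified sketch of hash-based selective disclosure (unsalted hashes).
    import hashlib

    def h(value: str) -> str:
        return hashlib.sha256(value.encode()).hexdigest()

    # Issuer side: the VC carries the holder ID and hashes of verified data,
    # plus an issuer signature over the whole structure (omitted here).
    vc = {"id": "holder-public-key", "claims": {
        "Hash_A": h("Alice Example"),     # e.g. holder name
        "Hash_B": h("+46-70-000-00-00"),  # e.g. phone number
        "Hash_C": h("1990-01-01"),        # may be withheld entirely if desired
    }}

    # Holder reveals only data A; verifier recomputes and matches the hash.
    def verify_disclosure(vc: dict, claim: str, revealed: str) -> bool:
        return vc["claims"].get(claim) == h(revealed)

    assert verify_disclosure(vc, "Hash_A", "Alice Example")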
An issue with using VCs 7 is that the VC 7 is a static piece of data containing information about the holder that the issuer has verified and signed. If the user wants to update the VC 7, e.g., change phone number (e.g., a value represented by Hash_B in figure 2), it requires that the user contacts the issuer and requests a new, updated VC 7 with the new phone number. However, before the issuer can do that, the issuer needs to verify the identity of the one that requested the update of the VC 7 (i.e., needs to ensure that it is really the VC holder that wants to update his own VC 7) and that the new phone number is correct; an issuer would not want to carelessly issue VCs without verification, much like a CA would not want to issue certificates without verifying that the content is correct. This is thus not a lightweight and fast process.
When updating the VC 7 it is in many cases necessary to use the "heavy" alternative described above, wherein the issuer verifies the holder identity and provides the information. However, there are also types of information and use cases for the VC 7 where such highly verified information is not required. In these cases, the holder could even generate and issue the VC 7 itself, a self-issued VC, much like a self-signed certificate. This means that the receiver would not get any trust in the information (e.g., about the holder) besides the holder providing the information and claiming (without proof) it to be correct. An advantage is that the holder can generate, update, and expand its self-issued VC 7 whenever he likes, as it only entails signing the VC data structure using his own private key, or, in a yet simpler solution, not even signing it, since when presenting the VC 7 as a VP, the VC 7 is signed by the holder via the VP signature. This also means that the holder is enabled to do selective disclosure of information on multiple levels; he can choose which data is presented as data and which as, for instance, a hash of the data to the verifier (as described earlier), and in addition he can choose which data to provide even at hash level to the verifier, for instance not sending Hash_C to the verifier, meaning that the verifier does not even know there is such data available at/about the holder. It is noted that the provided example of hash values is only one (simplified) exemplary implementation of selective data disclosure; there are various other ways that the selective data disclosure may be implemented.
For the verifier to be able to interpret the VC 7, the verifier needs to get the matching schema. The VC contains a pointer/identifier of the schema of the VC. Traditionally, the issuer would generate the VC and associated schema and publish the schema to the registry. For self-issued VCs, the issuer/holder would also generate the schema. If the issuer/holder has the right to publish the schema to the registry, it could do that. However, as the self-issuing approach makes it possible to make rapid changes to the VC and thereby the schema, this might not be optimal. Instead, the issuer/holder could provide the schema to the verifier together with, or as part of, the VP. The verifier could then use the received schema to interpret the received VC.
Yet another approach is that the verifier can provide the desired schema to be used by the holder. The holder then generates a VC based on the received schema and signs the VC/VP using its private key. The holder can provide additional assurance of the data contained in the VC, if the data has also been issued by a "trusted" issuer in separate VC(s), by also including that part of the data and their respective "trusted" VCs in the VP provided to the verifier requesting the use of the specific schema. Thus, the verifier receives data according to the desired schema and can potentially also verify that at least some of the data has been issued by a trusted issuer. An additional possibility with such self-issued VCs is that the issuer/holder could generate a new public key pair for its VC whenever it wants, and, e.g., use a different public key (digital ID) towards a web service for each new session and thereby get better privacy. Of course, when needing to provide more "reliable" data, the actual identity, or other information about the holder, would be revealed at least to some degree through the use of non-self-issued VCs.
One exemplary implementation of selective information sharing according to various embodiments is to utilize the earlier described verifiable credentials, VCs. An exemplary scenario according to the present teachings is described next:
The DT 2 stores VCs 7 of the user device 1. These VCs 7 may, e.g., comprise a government-issued VC 7 with official name, social security number, etc. As another example, the VCs 7 may be issued by various service providers, e.g., by the store that the user/holder is visiting; the VCs 7 may then contain data such as customer number and name, which the store system can map to purchasing habits and other information about the user collected by the store. The VCs 7 could also be issued by a financial institution, e.g. related to a credit card which can be used for making purchases. As a final example, the DT 2 can hold self-signed VCs 7 containing, e.g., temporary data (mood, cognitive load, etc.) or data where no obvious verifier exists (e.g., hobbies).
The DT 2 uses the information of the VCs 7 but may also have additional information about the user and policies or algorithms instructing how to behave in certain situations.
When the DT 2 interacts with the digital objects 51, 52,..., 5n, the DT 2 uses VCs 7 (or VPs) to present information subsets about the user to the digital objects 51, 52,..., 5n. Similarly, the digital object 51, 52,..., 5n can provide its VCs 7 to the DT 2 and thereby share information about itself. Using VCs 7/VPs, user authentication can also be provided, since the VP, which is signed by the VC holder, i.e. the DT 2/digital object 51, 52,..., 5n, may comprise a challenge issued by the peer, i.e. the digital object 51, 52,..., 5n /DT 2. As an example, when the DT 2 and a digital object 51, 52,..., 5n are about to interact, they exchange nonces, which are then carried in the signed VPs, thereby resulting in mutual authentication. This interaction then acts as the handshake described earlier, as sketched below.
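A minimal sketch of the nonce exchange, assuming Ed25519 signatures via the Python cryptography package and a byte-string stand-in for the VC; the VP structure (signature over VC bytes plus peer nonce) is an illustrative assumption.

    # Sketch: each side signs the peer's nonce as part of its VP.
    import os
    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
    from cryptography.exceptions import InvalidSignature

    dt_key, obj_key = Ed25519PrivateKey.generate(), Ed25519PrivateKey.generate()
    dt_nonce, obj_nonce = os.urandom(16), os.urandom(16)  # exchanged in the clear

    def make_vp(key: Ed25519PrivateKey, vc_bytes: bytes, challenge: bytes) -> bytes:
        # The VP signature covers the VC and the challenge nonce from the peer.
        return key.sign(vc_bytes + challenge)

    vp_sig = make_vp(dt_key, b"<DT VC>", obj_nonce)

    # The object verifies that the DT holds the key bound to its VC.
    # (The object side is symmetric, using obj_key and dt_nonce.)
    try:
        dt_key.public_key().verify(vp_sig, b"<DT VC>" + obj_nonce)
        authenticated = True
    except InvalidSignature:
        authenticated = False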
The digital object 51, 52,..., 5n would typically share the same set of data with any interacting DT 2, so in the following the focus is on how the DT 2 can limit what data is to be shared with the digital object 51, 52,..., 5n. However, the digital object 51, 52,..., 5n could utilize the same approach. Initially, the DT 2 can use a VC 7 where all or at least most data is blinded by default, e.g. only provided as hash values as described earlier. After authentication, when the DT 2 knows who the digital object 51, 52,..., 5n is/belongs to and the DT 2 has evaluated what data to share with the digital object 51, 52,..., 5n, the DT 2 may choose to provide the additional data matching the hash values in order to let the digital object 51, 52,..., 5n verify the data about the holder/DT 2. The DT 2 may choose to share certain data based on its own policy. Alternatively, the digital object 51, 52,..., 5n may request certain additional data and the DT 2 may then choose which parts of the requested data to share. Sharing the data could be based on initially sharing all data as hashed values and later providing the actual data for verification by the digital object 51, 52,..., 5n. Alternatively, the digital object 51, 52,..., 5n may provide a VC schema to the DT 2, and the DT 2 may then create a VC 7 with the indicated schema, possibly leaving some data fields blank if the DT 2 chooses not to reveal the data to the digital object 51, 52,..., 5n.
Some of the data that may be shared with the digital object 51, 52,..., 5n could be dynamic data, that is not issued by a "trusted" issuer but rather generated at the time of answering the digital object 51, 52,..., 5n. This could, for instance, be the mood of the holder. The digital object 51, 52,..., 5n may choose to put different levels of confidence in data received from the DT 2 based on who has issued the data; data issued by a trusted issuer, which can be verified by the digital object 51, 52,..., 5n, would naturally be more reliable, while "self-signed" data by the DT 2 would have lesser confidence. Typically, data of permanence "Permanent", as described earlier, would be issued by a trusted issuer, while temporary data is typically self-signed by the DT 2.
In addition to using selective data disclosure, as has been described, the zero-knowledge proof aspect of the VC 7 could also be used. This could be used to establish that the DT 2 is allowed to get age-restricted material, e.g., age 18+ content, without revealing the actual age of the DT 2 holder. Other uses of the zero-knowledge proof include proof-of-permission without revealing identity, e.g., showing that you may buy a prescription drug without exposing your identity to a digital pharmacy. Proof-of-access without revealing user identity is another application, where the digital object 51, 52,..., 5n can verify that you are allowed access without needing an identifying membership number or the like.
In some embodiments, the DT 2 may comprise a manual override function. Such an override function may, for instance, be activated when a defined trust level for a particular information subset exceeds a certain threshold value. In such a situation, the user may need to actively allow the information subset at hand in order for it to be transmitted by the DT 2. Such authorization may take place by the user interacting with the XR environment using, e.g., voice or movement. As another example, the authorization may be performed outside of the XR environment in a two-factor authentication fashion, e.g., the user clicking on an emailed link or using an authentication app on his mobile phone.
In other embodiments, the manual override may give the user right to supply requested information segments which the digital object 51, 52,..., 5n does not have the trust and/or authorization to receive.
In still other embodiments, the overall stress level or condition of the user, or any other physical trait, may also be considered when limiting the scope of the information subset.
Fig. 3 is a flow chart of embodiments for interaction between an XR enabled user device 1 and a known digital object 51, 52,..., 5n. The flow starts in box 50, and continues to box 51, in which the XR enabled user device 1 detects a digital object 51, 52,..., 5n in an XR environment. Flow continues to box 52.
In box 52, the user device 1 (and in particular the DT 2 thereof) receives an object identifier and cryptographic proof from the digital object 51, 52,..., 5n. The DT 2 determines that it has seen this digital object 51, 52,..., 5n before, i.e., that it is a previously known digital object 51, 52,..., 5n. Flow continues to decision box 53.
In decision box 53, it is determined whether or not the user device 1 is to remain anonymous for the digital object 51, 52,..., 5n. If no, flow continues to box 54, in which the user device 1 sends a known user identifier to the digital object 51, 52,..., 5n. If yes, flow continues to box 55, wherein the user device 1 sends a random user identifier to the digital object 51, 52,..., 5n, and the user device 1 keeps its anonymity towards the digital object 51, 52,..., 5n. From either box 54 or 55, flow continues to box 56.
In box 56, the XR enabled user device 1 receives a request for an information subset from the digital object 51, 52,..., 5n. As described earlier, the information may be any type of information. For instance, the digital object 51, 52,..., 5n may request information on shopping habits, and if allowed to receive such information, it may project relevant goods for sale, i.e. goods that might be of interest for the user as determined by the digital object 51, 52,..., 5n based on the received information. From box 56, flow continues to decision box 57.
In decision box 57, it is determined (in the user device 1) whether the digital object 51, 52,..., 5n is allowed to receive information, and/or what type of information to give away. If the digital object 51, 52,..., 5n is authorized to receive the requested information, flow continues to decision box 60; if not, flow continues to box 58.
In box 58, the user device 1 rejects the request for information from the digital object 51, 52,..., 5n, as it was determined (in decision box 57) that the object's authorization profile would not allow it to receive the requested information subset.
In decision box 60, it is determined whether or not a trust value of the digital object 51, 52,..., 5n exceeds a threshold set for the requested information subset. That is, the digital object 51, 52,..., 5n needs to have a certain trust value in order to receive some particular information. If this trust value exceeds the set threshold for the particular information, flow continues to box 61, in which the user device 1 provides the requested information subset to the digital object 51, 52,..., 5n. It is noted that if the digital object 51, 52,..., 5n requests multiple pieces of information, it might be that only part of the requested information will actually be shared with the digital object 51, 52,..., 5n. This may, for instance, be the case if the digital object 51, 52,..., 5n has a trust level not allowing some particular information to be shared. From box 61, flow continues to box 59. If, on the other hand, the outcome of decision box 60 is "No", i.e., the digital object 51, 52,..., 5n does not reach the threshold set for the requested information subset, flow continues to box 58, in which the user device 1 rejects the request for information. From box 58, flow continues to box 59. In box 59, the user device 1 may update the trust for the handled digital object 51, 52,..., 5n. Flow then continues to decision box 62, wherein it is determined whether there are further requests pending. If yes, flow returns to box 56; if no, flow ends in box 63. The core of this flow can be sketched as follows.
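A compact sketch of boxes 56 to 61 of Fig. 3, under the assumption that the object and subset are simple records carrying the attributes named earlier; handle_request and update_trust are illustrative helper names, and the trust update body is a placeholder.

    # Hypothetical condensation of the Fig. 3 decision flow.
    def update_trust(trust_db: dict, obj) -> None:
        # Placeholder for box 59: e.g. raise trust on compliant requests.
        pass

    def handle_request(obj, subset, trust_db: dict, thresholds: dict):
        """Returns the information elements to share, or None on rejection."""
        if subset.type not in obj.authorization_profile:    # decision box 57
            decision = None                                 # box 58: reject
        elif trust_db[obj.id] >= thresholds[subset.name]:   # decision box 60
            decision = subset.elements                      # box 61: provide
        else:
            decision = None                                 # box 58: reject
        update_trust(trust_db, obj)                         # box 59
        return decision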
Fig. 4 is a flow chart over steps of a method 40 in the XR enabled device 1 in accordance with the present disclosure. A user in possession of the XR enabled device 1, for instance wearing it on her head, is moving in an XR environment. In step 41, the user device 1 detects an XR object 51, 52,..., 5n in the XR environment. The detection may, for instance, occur due to geographical proximity between the XR enabled device 1 and the XR object 51, 52,..., 5n, e.g., by a positioning system, near-field communication, Bluetooth, WiFi or the like. As another example, the detection may occur by the XR enabled device 1 receiving the object from another user or entity. As still another example, the detection may occur by the XR enabled device having a camera recognizing a location, e.g., identifying a coffee shop logo, or reading a QR (Quick Response) code or other out-of-band mechanism.
Next, the user device 1 receives information from the XR object 51, 52,..., 5n, for instance, receives an identifier, type and cryptographic proof, and in step 42 the user device 1 determines, based on the received information, that it has not seen the XR object 51, 52,..., 5n before.
In step 43, the user device 1 then stores the information received in step 41, e.g., in a database, and then selects an authorization profile. The selection may be based on any type of selection criteria, e.g., any of the herein given examples.
In step 44, the user device 1 establishes a trust level. This establishing may, for instance, be made by an external request to, e.g., a database, or by choosing a standardized trust level value, e.g., as stored within the user device and in particular in the DT 2 thereof.
In step 45, which is an optional step, the user device 1 may inform the XR object 51, 52,..., 5n, about which information subsets it may access. The XR object 51, 52,..., 5n, may then request information from any of these information subsets.
Fig. 5 is a flow chart over steps of a method in an XR enabled device in accordance with the present disclosure. The method 20 is performed in the XR enabled device 1 for selectively sharing information with an XR object 51, 52,..., 5n. The information to be shared is, in some embodiments, determined by the user. In other embodiments, the information allowed to be shared is determined by policies related to the XR enabled user device 1. For instance, a company owning the user device may impose restrictions on information sharing, e.g., prohibiting some information from being shared. A user wears the XR enabled device 1, e.g., a head worn gadget or glasses, when moving in an XR environment, e.g. Virtual Reality (VR), Augmented Reality (AR), Mixed Reality (MR), in essence any mix of real and virtual environment. Illustrative examples of such situations: a user interacting with other users and digital objects in a metaverse scenario; a user in a work environment using XR capabilities when performing tasks such as, e.g., maintenance, construction or plumbing; a user participating in an XR game interacting with both physical and virtual objects; a visitor interacting with XR-based reenactments of historical events; a user in a grocery store using XR capabilities for finding a suitable path, special deals or recommendations, etc.
The method 20 comprises performing 22 a handshake procedure with the XR object 51, 52,..., 5n. The handshake procedure may be a known procedure, in which two devices are to establish a connection, for instance comprising setting up various parameters for communication, rules for data sharing, etc., as described earlier.
The method 20 comprises receiving 24, from the XR object 51, 52,..., 5n, a request for information related to the user of the XR enabled device 1. The information may comprise information subsets defining the user, comprising for instance one or more of: health related information, consumption related information, work related information, social information, travel related information, current intent, gender, age, medical conditions, allergies, loyalty program membership, political information, gender-interaction preferences, religion, personal preferences, and identity related information (for instance: name, social security number, temporary identity, temporary pseudonym, alternative e-mail entries, alternative addresses, temporary card number and credit rating). It is realized that essentially any type of information may be used for defining a particular user.
The method 20 comprises determining 26, by a digital representation 2 of the user, which requested information to send. The digital representation 2 of the user is a specific one for each user. The information to send may be all of the requested information, some of it, or none at all. This step comprises determining what specific information, if any, is allowable for sharing with the specific XR object 51, 52,..., 5n at hand. The digital representation 2 of the user may be required to determine allowability of requested information separately for each XR object 51, 52,..., 5n, as the rules for sharing information may differ between the XR objects 51, 52,..., 5n.
The method 20 comprises providing 28, to the XR object 51, 52,..., 5n, information that has been determined in the earlier step to be allowable for sharing with the XR object 51, 52,..., 5n.
The method provides many advantages. By letting a digital twin of the user communicate with its surrounding digital environment, the method enables the requesting of information that is relevant for a user in a VR/XR environment, and the user is hence less likely to be burdened by content which is irrelevant for the current context or task. This in turn reduces the risk of the user experiencing a cognitive overload. Further, the method may also reduce the amount of data sent by the XR objects 51, 52,..., 5n, thereby providing an optimized radio usage.
In various embodiments, the determining 26 may be based on one or more of: information subsets defining the user, rules for information distribution, and level of trust relating to the XR object. In further embodiments, the rules for information distribution may comprise rules relating to one or more of: XR object identity, XR object type, XR object owner, and a trusted party having verified correctness of attributes.
In still other embodiments, each information subset may have one or both of: individually specified required trust level and individually specified sharing principles.
In variations of the above set of embodiments, the method may comprise updating the trust levels and sharing principles based on behavior of the XR object 51, 52,..., 5n. The behavior of the XR object may, for instance, comprise previous interaction history, previous rule compliance, out-of-band information (e.g. public social media information), and reviews related to object/company trust and behavior. As a particular example, companies that have been blacklisted by, e.g., the National Board for Consumer Disputes (Allmänna Reklamationsnämnden, ARN, in Sweden) may be entered into a list of blacklisted companies in a trust-level database. The digital representation 2 of the user may search through such a database for information relevant for determining which information is allowable for sharing. A sketch of such a trust update follows.
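A minimal sketch of one possible trust update rule combining the behavior signals named above; the weights, the clamping to [0, 1] and the blacklist pinning are illustrative assumptions, not values from the disclosure.

    # Hypothetical trust update based on observed object behavior.
    def update_trust(current: float, compliant: bool, relevant_content: bool,
                     owner_blacklisted: bool) -> float:
        if owner_blacklisted:       # e.g. listed by a consumer-dispute board
            return 0.0
        delta = 0.0
        delta += 0.02 if compliant else -0.10         # rule compliance dominates
        delta += 0.01 if relevant_content else -0.03  # content relevance
        return max(0.0, min(1.0, current + delta))

Violations push trust down faster than compliance raises it, reflecting that an object which repeatedly misbehaves may reach a level where no information is ever supplied to it.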
The digital representation 2 of the user may, for instance, comprise one of: a machine learning model, a behavioral database or a software defined representation. While decision tree models might be the most intuitive choice for constructing an information distribution mechanism in the digital representation 2, other machine learning models may be considered as well. For instance, a neural network may be considered, which neural network is trained on a dataset containing objects having a certain trust, authorization and an information subset that they have requested. By using a supervised approach, i.e., training with pre-defined labels of what is correct, or a semi-supervised approach, i.e., beginning with pre-defined labels of what is correct and continuing with unlabeled data, such a model may learn the behavior desired by the user. A minimal training sketch is given below.
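The sketch below trains a Random Forest, the decision-tree ensemble named earlier, as the sharing decider, using scikit-learn. The feature encoding (object trust, an authorization flag, subset sensitivity), the toy data and the labels are all illustrative assumptions.

    # Sketch: supervised training of a decision-tree ensemble as decider.
    from sklearn.ensemble import RandomForestClassifier

    # Each row: [object trust, object authorized for subset, subset sensitivity]
    X = [[0.9, 1, 2], [0.2, 1, 8], [0.7, 0, 1], [0.8, 1, 5]]
    y = [1, 0, 0, 1]  # pre-defined labels: 1 = user allowed sharing, 0 = denied

    model = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y)

    # The trained model can then answer new sharing requests.
    print(model.predict([[0.85, 1, 3]]))  # e.g. array([1]), i.e. share
                                          # (outcome depends on the toy data)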
In various embodiments, the performing 22 of the handshake procedure comprises the digital representation 2 of the user providing an identifier for use in an XR session and one or more parameters related to the information subsets that the XR object 51, 52,..., 5n requests. The digital representation 2 of the user further provides information on how the XR enabled device 1 prefers the XR object 51, 52,..., 5n to act. The DT 2 may, for instance, instruct the XR object 51, 52,..., 5n on how much content to supply, what content the user is interested in, etc.
In various embodiments, the method 20 comprises, prior to the performing 22 of the handshake procedure, a step wherein the XR enabled device 1 detects the XR object 51, 52,..., 5n. Such detection may, for instance, occur automatically, e.g., when the XR enabled device 1 is sufficiently close to the XR object.
In various embodiments, the digital representation 2 of the user is located within the XR enabled device 1. The method 20 then comprises obtaining the information related to the user from a storage within the XR enabled device 1.
In various other embodiments, the digital representation 2 of the user is instead located externally of the XR enabled device 1, and the result of the determination 26 on which requested information to send is then received by the XR enabled device 1 from the digital representation of the user. The digital representation 2 of the user may obtain the information related to the user from a storage external to the XR enabled device 1.
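The two placements may be contrasted as in the sketch below: an in-device representation reads user data from a local storage, whereas for an external representation the device only receives the result of the determination 26. The interfaces shown are illustrative assumptions, and the actual allowability logic is deliberately simplified.

```python
# Illustrative sketch of the two placements of the digital representation.
from typing import Callable, Protocol

class DigitalRepresentation(Protocol):
    def determine(self, object_id: str, subsets: list[str]) -> list[str]: ...

class LocalDT:
    """Digital representation located within the XR enabled device."""
    def __init__(self, storage: dict[str, str]):
        self.storage = storage  # storage within the XR enabled device
    def determine(self, object_id: str, subsets: list[str]) -> list[str]:
        # Simplified stand-in for the actual allowability determination.
        return [s for s in subsets if s in self.storage]

class RemoteDTClient:
    """Device-side client of a digital representation located externally."""
    def __init__(self, send: Callable[[dict], list[str]]):
        self.send = send  # e.g., a secure network call in practice
    def determine(self, object_id: str, subsets: list[str]) -> list[str]:
        return self.send({"object": object_id, "subsets": subsets})

local: DigitalRepresentation = LocalDT({"allergies": "peanuts"})
remote: DigitalRepresentation = RemoteDTClient(lambda req: ["allergies"])
print(local.determine("obj-51", ["allergies", "age"]))   # -> ['allergies']
print(remote.determine("obj-51", ["allergies", "age"]))  # -> ['allergies']
```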
In an embodiment, the digital representation 2 may control information subsets which belong to another digital representation. This might be needed, for instance, in a situation with a child and a parent. In such cases, a digital representation A may receive a request for an information subset I from an object O. However, if the needed trust level exceeds what the digital representation A may decide upon by itself, it must receive clearance from another digital representation B. The request from O is forwarded to representation B, which in turn reports its decision to digital representation A.
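A minimal sketch of this delegation flow, with all names and trust values invented for illustration, could be:

```python
# Hedged sketch: representation A forwards a request it cannot clear
# itself to a guardian representation B (e.g., a parent's).
class Representation:
    def __init__(self, name: str, max_trust: int, guardian=None):
        self.name = name
        self.max_trust = max_trust  # highest level it may decide on alone
        self.guardian = guardian    # e.g., a parent's representation

    def decide(self, subset: str, required_trust: int) -> str:
        if required_trust <= self.max_trust:
            return f"{self.name} grants '{subset}'"
        if self.guardian is not None:  # forward to representation B
            return self.guardian.decide(subset, required_trust)
        return f"{self.name} denies '{subset}'"

parent = Representation("parent-DT", max_trust=5)
child = Representation("child-DT", max_trust=1, guardian=parent)
print(child.decide("location", required_trust=3))  # cleared by parent-DT
```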
In a further embodiment, a user may be represented by multiple digital representations, for example a private digital representation and a work digital representation, which may comprise different rules and trust relations. As a descriptive illustration, a person's digital representation while on duty as a police officer may, during her operation of a vehicle, hold certain principles of interaction and associated trust within a law enforcement information infrastructure, whereas the same physical person, potentially operating the same vehicle along the same streets but in the role of a civilian, may not have access to such law enforcement information infrastructure.
As another example, for a service technician engaged in a service task in a hazardous environment, the user context, cognitive load, etc. could escalate the rule settings to ensure that only information relevant for the task is displayed. Such a rule setting would thus spare the service technician from receiving anything but the most important information, which in turn may reduce, for instance, the cognitive load of the technician.
As still another example, for a user having an XR enabled device in a restaurant environment, the method as described in different embodiments may be used for sharing information regarding, for instance, allergies, thereby enabling a personalized menu for the user.
In another aspect, different digital representations may be associated with a certain device or devices. In line with the earlier civilian/police officer example, certain hardware, equipment or devices may be associated with implicit roles and hence use specific digital representations; for example, the use of a certain physical object that implicitly carries some specific level of user authentication (badge, ID card, gun, device (smartwatch), etc.) may assign a certain digital representation to the user/wearer.
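As a simple illustrative sketch, the selection of a digital representation from such role-carrying objects could be a lookup of the kind below; the mapping and names are invented for illustration.

```python
# Hedged sketch: selecting among multiple digital representations based
# on role-carrying objects worn or carried by the user.
ROLE_OBJECTS = {"police_badge": "work-DT", "smartwatch": "private-DT"}

def select_representation(worn_objects: list[str]) -> str:
    for obj in worn_objects:  # higher-privilege objects listed first
        if obj in ROLE_OBJECTS:
            return ROLE_OBJECTS[obj]
    return "private-DT"       # default digital representation

print(select_representation(["police_badge"]))  # -> work-DT
print(select_representation(["glasses"]))       # -> private-DT
```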
In different embodiments, the XR enabled device 1 is one or more of: near-eye display (XR headset, smart glasses, smart lenses, etc.), user equipment (mobile phone, tablet, laptop, etc.), heads-up display (smart glass projection screen, such as a windshield or window), hologram projector or the like.
Fig. 6 schematically illustrates, in terms of a number of functional units, the components of a user device 1 according to an embodiment. Processing circuitry 110 is provided using any combination of one or more of a suitable central processing unit (CPU), multiprocessor, microcontroller, digital signal processor (DSP), etc., capable of executing software instructions stored in a computer program product 330 (as in Fig. 8), e.g., in the form of a storage medium 130. The processing circuitry 110 may further be provided as at least one application specific integrated circuit (ASIC), or field programmable gate array (FPGA).
Particularly, the processing circuitry 110 is configured to cause the user device 1 to perform a set of operations, or actions, as disclosed herein. For example, the storage medium 130 may store the set of operations, and the processing circuitry 110 may be configured to retrieve the set of operations from the storage medium 130 to cause the user device 1 to perform the set of operations. The set of operations may be provided as a set of executable instructions. The processing circuitry 110 is thereby arranged to execute embodiments of the methods as disclosed herein.
The storage medium 130 may also comprise persistent storage, which, for example, can be any single one or combination of magnetic memory, optical memory, solid state memory or even remotely mounted memory.
The user device 1 may further comprise a communications interface 120 for communications with other entities, functions, nodes, and devices, over suitable interfaces. As such, the communications interface 120 may comprise one or more transmitters and receivers, comprising analogue and digital components. The processing circuitry 110 controls the general operation of the user device 1, e.g., by sending data and control signals to the communications interface 120 and the storage medium 130, by receiving data and reports from the communications interface 120, and by retrieving data and instructions from the storage medium 130. Other components, as well as the related functionality, of the user device 1 are omitted in order not to obscure the concepts presented herein.
Fig. 7 is a schematic diagram showing functional modules of a user device according to an embodiment. The user device 1 of Fig. 7 comprises a number of functional modules: a perform module 210 configured to perform a handshake procedure with the XR object 51, 52,..., 5n, a receive module 220 configured to receive, from the XR object 51, 52,..., 5n, a request for information related to the user of the XR enabled device 1, a determine module 230 configured to determine, based on output from a digital representation 2 of the user, which requested information to send, and a provide module 240 configured to provide, to the XR object 51, 52,..., 5n, information determined to be allowable for the XR object 51, 52,..., 5n. The user device 1 of Fig. 7 may further comprise a number of optional functional modules (not illustrated), such as any of an update module configured to update the trust levels and sharing principles based on behavior of the XR object 51, 52,..., 5n, a monitor module 250 configured to monitor data relating to the user, a process module 260 configured to process data, a provide module 270 configured to provide processed data to a policy entity, and an update module 280 configured to update policies based on the processed data. In general terms, each such functional module may be implemented in hardware or in software. Preferably, one or more or all functional modules may be implemented by the processing circuitry 110, possibly in cooperation with the communications interface 120 and the storage medium 130. The processing circuitry 110 may thus be arranged to fetch instructions, as provided by a functional module, from the storage medium 130 and to execute these instructions, thereby performing any actions of the user device 1 as disclosed herein.
Fig. 8 shows one example of a computer program product comprising computer readable means according to an embodiment. On this computer readable means 340, a computer program 320 can be stored, which computer program 320 can cause the processing circuitry 110 and thereto operatively coupled entities and devices, such as the communications interface 120 and the storage medium 130, to execute methods according to embodiments described herein. The computer program 320 and/or computer program product 330 may thus provide means for performing any actions of the user device 1 as disclosed herein.
In the example of Fig. 8, the computer program product 330 is illustrated as an optical disc, such as a CD (compact disc) or a DVD (digital versatile disc) or a Blu-ray disc. The computer program product 330 could also be embodied as a memory, such as a random-access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM), or an electrically erasable programmable read-only memory (EEPROM) and more particularly as a non-volatile storage medium of a device in an external memory such as a USB (Universal Serial Bus) memory or a Flash memory, such as a compact Flash memory. Thus, while the computer program 320 is here schematically shown as a track on the depicted optical disc, the computer program 320 can be stored in any way which is suitable for the computer program product 330.
An Extended Reality, XR, enabled device 1 for selectively sharing information with an XR object 51, 52,..., 5n, is provided. The XR enabled device 1 is configured to:
- perform a handshake procedure with the XR object 51, 52,..., 5n,
- receive, from the XR object 51, 52,..., 5n, a request for information related to the user of the XR enabled device 1,
- determine, based on a digital representation 2 of the user, which requested information to send, and to
- provide, to the XR object 51, 52,..., 5n, information determined to be allowable for the XR object 51, 52,..., 5n.
In an embodiment, the Extended Reality, XR, enabled device 1 is configured to determine information to be allowable for sharing based on one or more of: information subsets defining the user, rules for information distribution, and level of trust relating to the XR object.
In a variation of the above embodiment, the information subsets defining the user comprise one or more of: health related information, consumption related information, work related information, social information, travel related information, current intent, gender, age, medical conditions, allergies, loyalty program membership, political information, gender-interaction preferences, religion, personal preferences, identity related information and credit rating.
In various embodiments, the rules for information distribution comprise rules relating to one or more of: XR object identity, XR object type, XR object owner, and a trusted party having verified correctness of attributes.
In various embodiments, each information subset has one or both of: individually specified required trust level and individually specified sharing principles.
In various embodiments, the Extended Reality, XR, enabled device 1 is configured to update the trust levels and sharing principles based on behavior of the XR object 51, 52,..., 5n.
In various embodiments, the digital representation 2 of the user comprises one of: a machine learning model, a behavioral database or software defined representation.
In various embodiments, the XR enabled device 1 is configured to perform the handshake procedure by the digital representation 2 of the user providing an identifier for use in an XR session and one or more parameters related to the information subsets that the XR object 51, 52,..., 5n requests and on how the XR enabled device 1 prefers the XR object 51, 52,..., 5n to act.
In various embodiments, the XR enabled device 1 is configured to, prior to the performing of the handshake procedure, detect the XR object 51, 52,..., 5n.
In various embodiments, the digital representation 2 of the user is located within the XR enabled device 1, while in a different set of embodiments, the digital representation 2 of the user is located external to the XR enabled device 1, in which case the result of the determining 26 of which requested information to send is received from the digital representation of the user.
In various embodiments, the XR enabled device 1 is configured to obtain the information related to the user from a storage in the XR enabled device 1.
In various embodiments, the XR enabled device 1 is configured to obtain the information related to the user from a storage external to the XR enabled device 1. In various embodiments, the XR enabled device 1 is one or more of: near-eye display (XR headset, smart glasses, smart lenses, etc.), user equipment (mobile phone, tablet, laptop, etc.), heads-up display (smart glass projection screen, such as a windshield or window), hologram projector or the like. It is noted that the XR enabled device 1 may be constructed in many different forms.
The invention has mainly been described herein with reference to a few embodiments. However, as is appreciated by a person skilled in the art, other embodiments than the particular ones disclosed herein are equally possible within the scope of the invention, as defined by the appended patent claims.

Claims

1. A method (20) in an Extended Reality, XR, enabled device (1) for selectively sharing information with an XR object (51, 52,..., 5n), the method (20) being performed in the XR enabled device (1) and comprising:
- performing (22) a handshake procedure with the XR object (51, 52,..., 5n),
- receiving (24), from the XR object (51, 52,..., 5n), a request for information related to the user of the XR enabled device (1),
- determining (26), by a digital representation (2) of the user, which requested information to send, and
- providing (28), to the XR object (51, 52,..., 5n), information determined to be allowable for sharing with the XR object (51, 52,..., 5n).

2. The method (20) as claimed in claim 1, wherein the determining (26) comprises determining information to be allowable for sharing based on one or more of: information subsets defining the user, rules for information distribution, and level of trust relating to the XR object.

3. The method (20) as claimed in claim 2, wherein the information subsets defining the user comprise one or more of: health related information, consumption related information, work related information, social information, travel related information, current intent, gender, age, medical conditions, allergies, loyalty program membership, political information, gender-interaction preferences, religion, personal preferences, identity related information and credit rating.

4. The method (20) as claimed in claim 2 or 3, wherein the rules for information distribution comprise rules relating to one or more of: XR object identity, XR object type, XR object owner, a trusted party having verified correctness of attributes.

5. The method (20) as claimed in any of claims 2 - 4, wherein each information subset has one or both of: individually specified required trust level and individually specified sharing principles.

6. The method (20) as claimed in claim 5, comprising updating the trust levels and sharing principles based on behavior of the XR object (51, 52,..., 5n).

7. The method (20) as claimed in any of the preceding claims, wherein the digital representation (2) of the user comprises one of: a machine learning model, a behavioral database or software defined representation.
8. The method (20) as claimed in any of the preceding claims, wherein performing the handshake procedure comprises the digital representation (2) of the user providing an identifier for use in an XR session and one or more parameters related to the information subsets that the XR object (51, 52,..., 5n) requests and on how the XR enabled device (1) prefers the XR object (51, 52,..., 5n) to act.
9. The method (20) as claimed in any of the preceding claims, comprising, prior to the performing of the handshake procedure, the XR enabled device (1) detecting the XR object (51, 52,..., 5n).
10. The method as claimed in any of the preceding claims, wherein the digital representation (2) of the user is located within the XR enabled device (1).
11. The method as claimed in any of claims 1 - 9, wherein the digital representation (2) of the user is located external to the XR enabled device (1), and wherein the result of the determining (26) which requested information to send is received from the digital representation of the user.
12. The method (20) as claimed in any of the preceding claims, comprising obtaining the information related to the user from a storage in the XR enabled device (1).
13. The method (20) as claimed in any of claims 1 - 11, comprising obtaining the information related to the user from a storage external to the XR enabled device (1).
14. The method (20) as claimed in any of the preceding claims, wherein the XR enabled device (1) is one or more of: a near-eye display, such as an XR headset, smart glasses, smart lenses, or user equipment, such as a mobile phone, tablet, laptop, or a heads-up display, such as a smart glass projection screen, windshield or window, or a hologram projector.
15. A computer program (320) for a user device (1) for sharing information with an Extended Reality, XR, object (51, 52,..., 5n), the computer program (320) comprising computer program code which, when executed on at least one processor of the user device (1), causes the user device (1) to perform the method (20) according to any one of claims 1 - 14.
16. A computer program product (330) comprising a computer program (320) as claimed in claim 15 and a computer readable means (340) on which the computer program (320) is stored.
17. An Extended Reality, XR, enabled device (1) for selectively sharing information with an XR object (51, 52,..., 5n), the XR enabled device being configured to:
- perform a handshake procedure with the XR object (51, 52,..., 5n),
- receive, from the XR object (51, 52,..., 5n), a request for information related to the user of the XR enabled device (1),
- determine, based on a digital representation (2) of the user, which requested information to send, and to
- provide, to the XR object (51, 52,..., 5n), information determined to be allowable for the XR object (51, 52,..., 5n).

18. The Extended Reality, XR, enabled device (1) as claimed in claim 17, configured to determine information to be allowable for sharing based on one or more of: information subsets defining the user, rules for information distribution, and level of trust relating to the XR object.

19. The Extended Reality, XR, enabled device (1) as claimed in claim 17, wherein the information subsets defining the user comprise one or more of: health related information, consumption related information, work related information, social information, travel related information, current intent, gender, age, medical conditions, allergies, loyalty program membership, political information, gender-interaction preferences, religion, personal preferences, identity related information and credit rating.

20. The Extended Reality, XR, enabled device (1) as claimed in claim 18 or 19, wherein the rules for information distribution comprise rules relating to one or more of: XR object identity, XR object type, XR object owner, a trusted party having verified correctness of attributes.

21. The Extended Reality, XR, enabled device (1) as claimed in any of claims 17 - 20, wherein each information subset has one or both of: individually specified required trust level and individually specified sharing principles.

22. The Extended Reality, XR, enabled device (1) as claimed in claim 21, configured to update the trust levels and sharing principles based on behavior of the XR object (51, 52,..., 5n).

23. The Extended Reality, XR, enabled device (1) as claimed in any of claims 17 - 22, wherein the digital representation (2) of the user comprises one of: a machine learning model, a behavioral database or software defined representation.

24. The Extended Reality, XR, enabled device (1) as claimed in any of claims 17 - 23, configured to perform the handshake procedure by the digital representation (2) of the user providing an identifier for use in an XR session and one or more parameters related to the information subsets that the XR object (51, 52,..., 5n) requests and on how the XR enabled device (1) prefers the XR object (51, 52,..., 5n) to act.
25. The Extended Reality, XR, enabled device (1) as claimed in any of claims 17 - 24, configured to, prior to the performing of the handshake procedure, detect the XR object (51, 52,..., 5n).
26. The Extended Reality, XR, enabled device (1) as claimed in any of claims 17 - 25, wherein the digital representation (2) of the user is located within the XR enabled device (1).
27. The Extended Reality, XR, enabled device (1) as claimed in any of claims 17 - 25, wherein the digital representation (2) of the user is located external to the XR enabled device (1), and wherein the result of the determining (26) which requested information to send is received from the digital representation of the user.
28. The Extended Reality, XR, enabled device (1) as claimed in any of claims 17 - 27, configured to obtain the information related to the user from a storage in the XR enabled device (1).
29. The Extended Reality, XR, enabled device (1) as claimed in any of claims 17 - 26, configured to obtain the information related to the user from a storage external to the XR enabled device (1).
30. The Extended Reality, XR, enabled device (1) as claimed in any of claims 17 - 29, wherein the XR enabled device (1) is one or more of: a near-eye display, such as an XR headset, smart glasses, smart lenses, or user equipment, such as a mobile phone, tablet, laptop, or a heads-up display, such as a smart glass projection screen, windshield or window, or a hologram projector.