WO2016029232A2 - System, apparatus, and method for interfacing between a user and an it environment - Google Patents
- Publication number
- WO2016029232A2 (PCT/US2015/046617)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- user
- sensor
- media
- head
- display apparatus
- Prior art date
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F21/00—Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
- G06F21/30—Authentication, i.e. establishing the identity or authorisation of security principals
- G06F21/31—User authentication
- G06F21/32—User authentication using biometric data, e.g. fingerprints, iris scans or voiceprints
Definitions
- the invention is a system, apparatus, and method for providing an interface between a user and an IT environment (collectively, the "system").
- the human-IT environment interface is inadequate in many different respects, whether the issue is authenticating that a person is who they say they are, or whether the issue is being able to tailor and customize the IT environment to a particular person.
- the conventional interface between man and machine misses many opportunities to improve the interface between a human being and an IT environment.
- Networks such as the Internet, the World Wide Web, and other public and semi-public networks provide people, businesses, and other organizations an unprecedented ability for interactions. Unfortunately the ability and utility of connecting with others is sometimes deterred by the possibility of fraud—particularly in the context of a user pretending to be someone who they are not.
- the system can be implemented utilizing one or more of the following subsystems: (1) user authentication; (2) eye tracking; and/or (3) user tracking.
- Figure 1a is a block diagram illustrating (i) an example of a system that can be used to generate an authentication ID from a user (i.e. authenticate a user) and (ii) certain component elements that can comprise such a system, such as a sensor, a sensor reading captured from the sensor, a heuristic process for selectively identifying an authenticated ID from the various inputs, a computer, and other types of relevant data.
- Figure 1b is a hierarchy diagram illustrating an example of different kinds of biometric sensor readings and different kinds of behavioral sensor readings.
- Figure 1c is an input/output diagram illustrating an example of an authentication ID and a corresponding confidence metric being identified by a heuristic of the system from inputs which can include one or more sensor readings, a login/password submission, a certificate, and potentially other data.
- Figure 1d is a hierarchy diagram illustrating an example of different types of authentication IDs, including but not limited to a continuous authenticated ID, a periodic authenticated ID, and a discrete authenticated ID.
- Figure 2a is a block diagram illustrating an example of a user authentication system that utilizes sensor readings captured from a sensor located on a head-mounted display apparatus to link to an authenticated ID on a server.
- Figure 2b is an input/output diagram involving a head-mounted display apparatus.
- Figure 2c is a flow chart diagram illustrating an example of how a user with a head-mounted display apparatus can be authenticated.
- Figure 3a is a block diagram illustrating an example of a head-mounted display apparatus as an interface between a user and a media experience.
- Figure 3b is a block diagram illustrating an example of a sensor as an interface between a user and a sensor feedback.
- Figure 3c is a block diagram illustrating an example of a head-mounted display apparatus as an interface between a user and an instruction.
- Figure 4a is a block diagram illustrating an example of components that can be included in a computer.
- Figure 4b is a block diagram illustrating an example of different types of computers.
- Figure 4c is a block diagram illustrating an example of different components of the system that can include computers.
- Figure 4d is a block diagram illustrating an example of a variation of Figure 4c that does not include a remote server.
- Figure 4e is a block diagram illustrating an example of a variation of Figures 4c and 4d that does not include a media player that is separate from the head-mounted display apparatus.
- Figure 4f is a block diagram illustrating an example of different types of data that can be stored on a database used by the system.
- Figure 5a is a block diagram illustrating an example of sensor feedback being captured by a head-mounted display apparatus and then used by one or more heuristics to impact the media experience sent to the user of the head-mounted display apparatus.
- Figure 5b is a block diagram illustrating an example of a variation of Figure 5a in which the applicable profile storage and heuristic processing is performed in a local media player instead of a remote server.
- Figure 5c is a block diagram illustrating an example of a variation of Figures 5a and 5b in which there is no local media player interfacing between the head-mounted display apparatus and the remote server.
- Figure 5d is an input/output diagram for a head-mounted display apparatus indicating that the sensor feedback is an output of an embedded sensor and the media experience is an input for the head-mounted display apparatus.
- Figure 5e is a flow chart diagram illustrating an example of a method for dynamically influencing the delivery of a media experience.
- Figure 5f is a block diagram illustrating an example of different operating modes that can be incorporated into the processing performed by the system.
- Figure 5g is a block diagram illustrating an example of the different types of sensors that can be used to capture data about a user.
- Figure 6 is a block diagram illustrating a system with a head-mounted display apparatus acting as an interface between a user and an IT environment.
- the invention is a system, apparatus, and method for providing an interface between a user and an IT environment (collectively, the "system").
- Table 1 below provides a glossary/index of claim element numbers, claim element names, and claim element descriptions.
- Figure 1a is a block diagram illustrating (i) an example of a system 100 that can be used to generate an authentication ID 590 from a user 80 (i.e. authenticate a user) and (ii) certain component elements that can comprise such a system 100, such as a sensor 300, a sensor reading 310 captured from the sensor 300, a heuristic process 525 for selectively identifying an authenticated ID 590 from the various inputs, a computer 510, and potentially other types of relevant data 60.
- the system 100 can utilize sensor readings 310 to authenticate a user 80.
- User authentication can also allow the system 100 to authenticate desired user activity, in addition to user identity.
- authentication can be based on a wide variety of different sensor readings 310, login/password submissions 580, certificates such as PKI certificates 581, and potentially other data 560 that is cognizable to the system 100.
- Authentication processing can provide opportunities for accelerated processing analogous to the "single-click" purchasing methodology of Amazon or the "PreCheck" program of the Transportation Security Administration. Reliable authentication can provide for accelerated or at least more convenient processing in a wide variety of different contexts.
- User authentication opens new opportunities for users to convey acceptance of business transactions beyond use of a mouse, although mouse use can still play a role in the authentication process.
- the system can provide an opportunity for integrated authentication through user behavior.
- User authentication is not limited to an individual process.
- the system can perform multiple authentication activities or exchange authentication information with other devices and sensors.
- User authentication processes can be based on biometric information (such as a fingerprint, voice recognition, facial recognition, or retina scan), user behavior (how the person moves a mouse, how they walk, etc.), login password, and/or tokens (hardware or software keys embedded in hardware, such as ATM or credit cards).
- Different authentication heuristics can be associated with different confidence intervals. For example, a retina scan can be associated with a greater degree of confidence and accuracy than a login/password process.
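The idea of different authentication factors carrying different confidence levels can be sketched as follows. This is a hypothetical illustration only: the factor names, the numeric weights, and the independence assumption used to combine factors are all assumptions for illustration, not a method specified by this disclosure.

```python
# Illustrative sketch: combining authentication inputs that carry
# different degrees of confidence into a single confidence metric.
# The factor names and weights below are hypothetical examples.
FACTOR_CONFIDENCE = {
    "retina_scan": 0.98,      # biometric readings tend to score high
    "fingerprint_scan": 0.95,
    "walk_scan": 0.80,        # behavioral readings are typically weaker
    "mouse_scan": 0.70,
    "login_password": 0.60,
}

def combined_confidence(matched_factors):
    """Treat each matched factor as independent evidence: the chance
    that every factor is wrong is the product of (1 - confidence)."""
    p_all_wrong = 1.0
    for factor in matched_factors:
        p_all_wrong *= 1.0 - FACTOR_CONFIDENCE[factor]
    return 1.0 - p_all_wrong

# A retina scan alone outranks a password, and combining factors
# raises the overall confidence metric further.
assert combined_confidence(["retina_scan"]) > combined_confidence(["login_password"])
```

Under this sketch, a retina scan plus a password yields higher combined confidence than either input alone, matching the intuition that layered authentication is stronger than any single factor.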
- Some user authentication activities can be continuous, such as authenticating a user 80 based on how they move a mouse or based on how they walk.
- Discrete authentication activities can be repeated multiple times if the triggering event/activity is repeated.
- the apparatus 200 can be configured to automatically re-authenticate the user 80 every so often, or immediately prior to certain events such as the display of certain sensitive information.
- the system 100 can be implemented in the configuration of a single integrated standalone device. However, it will often be more desirable for the system 100 to be implemented in less integrated and more modular configurations.
- contexts in which the ability to authenticate a user 80 is potentially useful can include but are not limited to: initiating business or financial transactions; accessing financial or other proprietary information; tracking user behavior for the purposes of modeling the behavior of a particular user; communicating with others; and other contexts where it is important to have some magnitude of confidence that a particular user 80 is who he or she says that he or she is.
- the system 100 can be configured to accommodate users 80 in different physical locations using different devices for different purposes.
- the system 100 could involve the use of an embedded retina scanner at an ATM machine located within a bank, a head-mounted display apparatus located within the home, and a general purpose desktop computer while at the office.
- the greater the scope of different authentication technologies that can be incorporated into the system 100, the more accurate and more useful the system 100 can become.
- Figure 2b illustrates an example of a head-mounted display apparatus 200 that provides for delivering a media experience 600 to the user 80.
- a user 80 is the subject or target of activity by the system 100 to be authenticated.
- a user is a human being who voluntarily interacts with the system 100 for the purposes of being authenticated by the system 100.
- the system 100 can be implemented in ways that could be useful to other types of animals and potentially even robots.
- the system 100 could also be used to authenticate users 80 without the knowledge of the applicable user 80.
- An authenticated ID 590 is an output of the system 100.
- the authenticated ID 590 is what allows the system 100 to communicate to third parties such as other users 80, businesses, government agencies, and other organizations that the user 80 is who the user 80 says he or she is.
- the system 100 can provide a convenient and efficient way to authenticate the identity of users 80, i.e. link a specific user 80 to a specific authenticated ID 590 that can be relied upon by the system 100 and other systems, applications, and entities that are interested in interacting with the user 80, provided the identity of the user 80 can be verified.
- Figure 1b is a hierarchy diagram illustrating an example of different kinds of sensor readings 310, such as biometric sensor readings 320 and different kinds of behavioral sensor readings 330, that can be used to selectively identify the authenticated ID 590 for a particular user 80.
- Figure 1c is an input/output diagram illustrating an example of an authentication ID 590 and a corresponding confidence metric 563 being identified by a heuristic 525 of the system 100 from inputs which can include one or more sensor readings 310, a login/password submission 580, a certificate 581, and potentially other data 560.
- Figure 1d is a hierarchy diagram illustrating an example of different types of authentication IDs 590, including but not limited to a continuous authenticated ID 591, a periodic authenticated ID 592, and a discrete authenticated ID 593.
- a continuous authenticated ID 591 is an authenticated ID 590 that is re-authenticated on a continuous or at least substantially continuous basis.
- a sensor 300 in the apparatus 200 could be configured to capture and transmit sensor readings 310 on a continuous basis while the apparatus 200 is being used to deliver a media experience to the user 80.
- a periodic authenticated ID 592 is an authenticated ID 590 that is re-authenticated on a predefined periodic basis. So for example, a retina scan 322 could be re-obtained each time the predefined cycle time has lapsed.
- a discrete authenticated ID 593 is an authenticated ID 590 that is captured in relation to a discrete period of time triggered by the initiation of a specific event or activity.
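The three authentication cadences described above can be sketched as follows. The class, method names, period value, and triggering-event labels are assumptions for illustration; the disclosure does not specify an implementation.

```python
import time

# Hypothetical sketch of the continuous (591), periodic (592), and
# discrete (593) authentication cadences described above.
class Authenticator:
    def __init__(self, verify, period=None):
        self.verify = verify      # callable returning True/False
        self.period = period      # seconds between periodic checks
        self.last_check = 0.0

    def continuous(self):
        # continuous authenticated ID 591: re-verify on every call
        return self.verify()

    def periodic(self, now=None):
        # periodic authenticated ID 592: re-verify once the cycle lapses
        now = time.monotonic() if now is None else now
        if now - self.last_check >= self.period:
            self.last_check = now
            return self.verify()
        return True  # within the current cycle, the prior result stands

    def discrete(self, triggering_event):
        # discrete authenticated ID 593: re-verify only for trigger
        # events (event names here are illustrative placeholders)
        if triggering_event in {"display_sensitive_info", "start_transaction"}:
            return self.verify()
        return True
```

A sensor 300 feeding `verify` continuously would correspond to the continuous case, while the re-authentication "every so often, or immediately prior to certain events" described below maps onto the periodic and discrete cases respectively.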
- a sensor 300 is a device used to capture information.
- the system 100 can utilize a wide variety of different sensors that can be useful to the authentication of users 80.
- the potential diversity of sensors 300 that can be utilized by the system 100 is commensurate with the potential diversity of sensor readings 310 that can be utilized by the system 100.
- Figure 1b is a hierarchy diagram illustrating an example of different categories and subcategories of sensor readings 310.
- a biometric sensor reading 320 is a sensor reading 310 that is based on a biological attribute of the user 80, such as a retina scan 322, a fingerprint scan 324, facial recognition, a voice print 326, and other identifiers that directly relate to who a person is.
- a behavioral sensor reading 330 is a sensor reading 310 that is based on a behavioral attribute of the user 80, such as a voice print/scan 332, a walk scan 334, a mouse scan 336, and other techniques for authenticating a user 80 via user 80 behavior.
- a confidence metric 563 is an indicator of the magnitude of confidence associated with a sensor reading 310 (as illustrated in Figure 1b) or an authentication ID 590 (as illustrated in Figure 1c). Some embodiments of the system 100 will not include confidence metrics 563. Confidence metrics 563 can be derived using one or more heuristics 525.
- FIG. 2a is a block diagram illustrating an example of a user authentication system 100 that utilizes biometric scans 561 captured from a sensor 300 located on a head-mounted display apparatus 200 to link to an authenticated ID 563 on a server 512.
- the profiles 562 on the server 512 include the profile 562 and the authenticated ID 563 associated with the biometric scan 561 that was captured as a sensor reading 310 by a sensor 300 included in the head-mounted display apparatus 200.
- a device that in the eyes of the user 80 may be primarily focused on making experiences 600 available to the user 80 can thus be transformed into a powerful tool for user 80 authentication.
- the apparatus 200 can encourage entities 90 to engage in online transactions with users 80 because both parties will have confidence that fraudulent assumptions of identity are far less likely to succeed due to the use of biometric scans 561.
- the head-mounted display apparatus 200 can be implemented in a wide variety of different alternative embodiments using a wide variety of different components and component configurations.
- a head-mounted display apparatus 200 is potentially any device capable of being worn on the head of the user 80 that is capable of delivering media experiences 600 to the user 80.
- the apparatus 200 will typically involve a computer 510 with many of the components of a computer 510 that are discussed below.
- the apparatus 200 used by the system 100 will typically include the capabilities of both video and audio.
- the apparatus 200 can also include a sensor 300 for capturing biometric scans 561.
- Such an embodiment of the apparatus 200 is illustrated in Figure 2b. As a one-person-at-a-time device, the apparatus 200 provides an excellent opportunity for the reliable authentication of users 80 for any number of online transactions and other activities.
- a user 80 of the system 100 is typically a human being, although it is possible to imagine instances where the user 80 could be another species of animal or even a form of robot or artificial intelligence, expert system, or other similar form of automated intelligence.
- the system 100 can be implemented in a wide variety of different configurations using a wide variety of different networks 540.
- Networks 540 provide the infrastructure for sharing data 560 such as media experiences 600, but the system 100 can be implemented in such a fashion as to be agnostic as to which networks 540 are used.
- Servers 512 are computers 510 that can enable a wide variety of client devices to participate in the system 100.
- Servers 512 that host the data 560 and software applications 524 that exist specifically and often exclusively for authentication purposes can be highly desirable.
- a profile 562 of a user 80 on the system 100 can include potentially everything about the user 80 that the system 100 is cognizant of, including all past history 564, all social media 568, and past instructions 566 provided by the user 80.
- a profile 562 includes an association of a biometric scan 561 and an authenticated ID 563.
- a biometric scan 561 is a sensor reading of the eye of the user 80 taken by a sensor 300 in the apparatus 200.
- Examples of potential biometric scans 561 include retina scans and iris scans.
- an authenticated ID 563 is a certification that the outside world and entities 90 unrelated to the user 80 can have confidence in.
- the system 100 can link an authenticated ID 563 to a profile 562 and a biometric scan 561.
- Figure 2c is a flow chart diagram illustrating an example of how a user 80 with a head-mounted display apparatus 200 can be authenticated.
- a sensor 300 on the apparatus 200 is used to capture a biometric scan 561, which is then submitted to a server 512 on which the scan 561 can be matched to an authenticated ID 563.
- the biometric scan 561 is used to identify the corresponding authentication ID 563 by linking to that profile 562.
- the user 80 can interact with outside entities 90 on the network 540 because those outside entities have confidence in the authentication ID 563 issued to the user 80 on the basis of the biometric scan 561.
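The flow just described — capture a biometric scan, match it on the server, and return the linked authenticated ID — can be sketched as follows. This sketch uses an exact hash lookup purely for illustration; real biometric matching is fuzzy and probabilistic, and the storage scheme, function names, and identifiers below are assumptions, not part of this disclosure.

```python
import hashlib

# Hypothetical server-side store: scan fingerprint -> (profile, auth ID).
# Exact hashing is a deliberate simplification of fuzzy biometric matching.
SERVER_PROFILES = {}

def enroll(scan_bytes, profile, authenticated_id):
    """Associate a biometric scan 561 with a profile 562 and an
    authenticated ID 563 on the server 512."""
    key = hashlib.sha256(scan_bytes).hexdigest()
    SERVER_PROFILES[key] = (profile, authenticated_id)

def authenticate(scan_bytes):
    """Match a captured biometric scan 561 to a stored profile 562 and
    return the linked authenticated ID 563, or None if no match."""
    key = hashlib.sha256(scan_bytes).hexdigest()
    entry = SERVER_PROFILES.get(key)
    return entry[1] if entry else None

# Illustrative usage with placeholder values.
enroll(b"retina-sample", {"name": "user80"}, "auth-id-563")
```

After enrollment, a matching scan yields the authenticated ID while an unknown scan yields nothing, which is the property outside entities 90 rely upon.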
- FIG. 3a is a block diagram illustrating an example of a head-mounted display apparatus 200 as an interface between a user 80 and a media experience 600.
- one or more sensors 300 in the apparatus 200 can function as an interface between the user 80 and useful sensor feedback 310 that can assist in the processing performed by the system 100.
- the apparatus 200 can also be used to transmit instructions 566 to the other components of the system 100.
- Figure 4a is a block diagram illustrating an example of components that can be included in virtually any computer 510 positioned within the apparatus 200 or anywhere else within the system 100.
- a computer 510 is an assembly that includes a processor 521.
- a computer 510 can also include one or more of the following components: (a) a memory component 522; (b) a storage component 523; (c) a computer program/software application 524 that provides for implementing one or more heuristics 525; (d) a database 526; and (e) a network adapter 529.
- a processor 521 is a microprocessor or central processing unit that provides for the running of computer programs/software applications 524.
- the memory component 522 is often referred to as RAM, random access memory.
- a storage component 523 is a component that provides for the permanent storage of data 560.
- An application 524 can also be referred to as a computer program 524 or a software application 524.
- An application 524 is a unit of computer code or a set of instructions that are capable of being implemented or run by the processor 521.
- a wide variety of different programming languages can be used to create the application(s) 524 used by the system 100. There are a voluminous number of different types of programming languages including but not limited to: object-oriented languages, assembly languages, fourth-generation languages, visual languages, XML-based languages, and hardware description languages.
- a heuristic 525 is a problem-solving, analytical process, or similar approach that can be implemented by a processor 521 running an application 524.
- Heuristics 525 can include algorithms but are not limited to algorithms.
- the system 100 can incorporate the capabilities of any type of database 526.
- databases 526 include but are not limited to: relational databases, object-oriented databases, navigational databases, and NoSQL databases.
- a user interface 528 is often a GUI (graphical user interface).
- a GUI will typically include virtual menus, buttons, editable fields, drop-down list boxes, etc. that users 80 can use to interact with the computer 510.
- the system 100 is not limited to the use of GUIs or other typical computer-oriented interfaces as user interfaces 528.
- Network adapters 529 can operate wirelessly or be hardwired.
- a network 540 is a telecommunications network that allows computers 510 to communicate with each other and exchange data 560.
- a wide variety of networks 540 can be incorporated into the system 100. Examples of network configurations include but are not limited to bus networks, star networks, ring networks, mesh networks, and tree networks. Different networks 540 can utilize different communication protocols and operate in different geographic scales.
- a local network 542 is a network 540 that is limited in scope to a particular geographic location, such as a home or office.
- Common examples of local networks include personal area networks, local area networks, home area networks, and storage area networks.
- external networks 544 include the Internet, wide area networks, metropolitan area networks, and enterprise networks.
- computers 510 can possess a wide variety of different components. Computers 510 can also be embodied in a variety of different categories of computers 510.
- Figure 4b is a block diagram illustrating an example of different types of computers 510.
- An embedded computer 511 is not typically a general purpose computer 514.
- a server 512 is a computer 510 that primarily exists for the purpose of making functionality available to other computers 510, such as client devices 513.
- a client device 513 is a computer 510 that functions at least in part by accessing at least some data 560 or an application 524 on a server 512.
- client devices 513 are general purpose computers 514.
- client devices 513 are mobile computers 515.
- a general purpose computer 514 is a computer 510 that provides a user 80 with substantial flexibility in configuring the computer 510.
- one user 80 may use their desktop computer primarily to play games online while another user 80 may use that same model of desktop computer to perform legal research.
- Examples of general purpose computers 514 include desktop computers, laptop computers, smart phones, tablet computers, notepad computers, mainframe computers, minicomputers, work stations, and other devices that allow users 80 to decide what is run on the computer 510.
- Some general purpose computers 514 are mobile computers 515.
- a mobile computer 515 is a computer 510 that is easily moved from one location to another, such as a smart phone, a tablet, or even a notepad or laptop computer.
- Figure 4c is a block diagram illustrating an example of different components of the system 100 that can include computers 510.
- Virtually any component of the system 100 can include a computer 510.
- the server 512 is a type of computer 510.
- the media player 500 can include one or more computers 510.
- the apparatus 200 can include a computer 510 and the sensor 300 of the apparatus 200 can include its own separate computer 510.
- Alternative configurations of various computerized components of the system 100 are disclosed in Figures 4d and 4e.
- Figure 4f is a block diagram illustrating an example of different types of data 560 that can be stored on a database 526 used by the system 100.
- Data 560 can represent facts, statistics, and other forms of information.
- Data 560 can include profiles 562, history 564, instructions 566, as well as social media 568.
- a user 80 can interact with the system 100 and other users 80 of the system 100 through a profile 562.
- the profile 562 is associated with one or more users 80.
- a single user 80 can have more than one profile 562.
- Profiles 562 can include virtually any information pertaining to the user 80 including the preferences of the user 80 and the totality of a user's history with the system 100. The more data 560 that exists for a particular user 80 and that can be associated with a particular profile 562, the more it is possible for the heuristics 525 of the system 100 to make assessments on what a user 80 wants.
- the system 100 can also be used to affirmatively trigger instructions 566.
- the system 100 can be used to initiate automated or semi-automated activities.
- Social media 568 are a substantial avenue for creating and sharing data 560.
- the system 100 can be integrated to work with social media 568.
- a media experience is any type of media content that can be accessed or captured using the apparatus 200.
- sensor feedback 310 is any type of data 560 captured from a sensor 300.
- the system 100 can utilize sensors to capture information about a user experiencing media as the user is experiencing the media. Such information can be stored, analyzed, updated, and otherwise used to create a profile for a specific user that embodies the interests, tastes, preferences, and history of the user. Such a profile can be used in a wide variety of different ways, including but not limited to the modification of a media experience being delivered to that user.
- sensors can be part of a media access or playing device such as a wearable near-eye display.
- the system can also utilize physically separate sensors that are in communication with each other and the device through which the user is accessing the media.
- Highly actionable profiles can assist consumers in finding what would be of interest to them. Such profiles can also assist media content providers in driving certain types of media experience opportunities to the subset of users most likely to be interested in that content.
- Profiles can thus use both push and pull methodologies to create more opportunities for mutually beneficial deliveries of media experiences to the consumers who want them, without inundating consumers with unwanted marketing messages and advertisements.
- a media navigation heuristic can include both manual/explicit decisions as well as automated processing assessments as inputs.
- advertisements identified to be of interest to the specific user can be delivered instead of generic advertising.
- the system could virtually ensure that only advertisements of interest to the user would be delivered to the user.
- Two viewers of the same television program could be subjected to radically different advertisements based on the profiles associated with the users. Not only could different users be subject to completely different ads, but the ads for the same product by the same advertiser could have variations tailored to the specific tastes, preferences, and histories of the user.
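The profile-driven advertisement selection described above can be sketched as follows. The field names (`interests`, `topic`), the scoring scheme, and the generic fallback are all hypothetical illustrations; the disclosure does not specify how a profile 562 ranks candidate advertisements.

```python
# Hypothetical sketch: two viewers of the same program receive
# different ads based on interest scores stored in their profiles.
def select_ad(profile, candidate_ads):
    """Pick the candidate whose topic scores highest in the profile's
    interest map; fall back to a generic ad when nothing matches."""
    best = max(candidate_ads,
               key=lambda ad: profile["interests"].get(ad["topic"], 0.0))
    if profile["interests"].get(best["topic"], 0.0) <= 0.0:
        return {"topic": "generic"}  # no signal: serve generic advertising
    return best

# Illustrative usage with placeholder profiles.
ads = [{"topic": "cars"}, {"topic": "travel"}]
viewer_a = {"interests": {"cars": 0.9, "travel": 0.2}}
viewer_b = {"interests": {"travel": 0.8}}
```

Under this sketch, viewer A is shown the car advertisement and viewer B the travel advertisement during the same program, while a viewer with no recorded interests still receives generic advertising rather than nothing.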
- the modification of the media experience delivered to a specific user is based on the profile associated with that specific user.
- the user profile is created, developed, and maintained by the capturing and updating of various sensor readings that pertain to the physiological and emotional responses of the user.
- Other marketing-type data pertaining to the user such as where they live, what magazines they subscribe to, occupation, etc. may also be included in the universe of data that is used to comprise the profile for the user.
- the modification of the media experience delivered to a user in response to the tastes, preferences, and history associated with the user is distinguishable from navigating a 3-D space through the use of a head tracker that modifies a point of view.
- the modifications brought about by the system are user-centric, and not based on factors outside the user such as the architecture or layout of simulated space of virtual reality with a point of view that corresponds to head movements within that space.
- Figure 5a is a block diagram illustrating an example of sensor feedback 310 being captured by a head-mounted apparatus 200 and then used by one or more heuristics 525 to impact the media experience 600 sent to the user 80 of the head-mounted apparatus 200.
- Figure 5a illustrates a processing loop between making a media experience 600 available to a user 80 wearing the head-mounted apparatus 200, and capturing sensor feedback 310 during the time that the user 80 was engaging the media experience 600.
- the sensor feedback 310 is subjected to a processing heuristic 525 that is used to update a profile 562 associated with the user 80.
- the updated profile 562 can then be used to selectively influence the delivery of the media experiences 600 to the user 80, which can in turn result in additional sensor feedback 310.
- the profile 562 of the user 80 can be substantially and comprehensively developed to increase the predictive efficacy of the profile 562 in influencing the media experiences 600 delivered to the user 80. This can ultimately result in a highly tailored user 80 experience in which only advertisements of interest to the user 80 are shown to the user 80 and in which the system 100 can make highly effective recommendations for media experiences 600 such as films, television programs, video games, music, video clips, and other types of media experiences.
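The processing loop described above — deliver a media experience, capture sensor feedback, update the profile via a heuristic, and let the updated profile influence the next delivery — can be sketched as follows. The exponential moving average is an illustrative choice of heuristic 525, not the patent's specified method, and the genre/engagement fields are assumptions.

```python
# Hypothetical sketch of the Figure 5a feedback loop: sensor feedback 310
# updates a profile 562, which then influences the next media experience 600.
def update_profile(profile, genre, engagement, alpha=0.3):
    """Blend new sensor feedback (an engagement score in [0, 1]) into
    the stored preference for a genre via an exponential moving average."""
    old = profile.get(genre, 0.5)  # 0.5 = no prior signal either way
    profile[genre] = (1 - alpha) * old + alpha * engagement
    return profile

def next_recommendation(profile, catalog):
    """Recommend the catalog genre the profile currently favors most."""
    return max(catalog, key=lambda g: profile.get(g, 0.5))

# Illustrative usage: repeated positive feedback on documentaries and
# one negative response to a sitcom steer the next recommendation.
profile = {}
for engagement in (0.9, 0.8, 0.95):
    update_profile(profile, "documentary", engagement)
update_profile(profile, "sitcom", 0.1)
```

Each pass around the loop sharpens the profile, which is the sense in which the sketch's predictive efficacy grows with additional sensor feedback.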
- although the system 100 can be embodied in a stand-alone device such as a head-mounted display apparatus 210 or even merely a head-mounted apparatus 200, many embodiments of the system 100 will incorporate varying degrees of distributed data processing across various devices.
- the media player 500 may be a conventional television set, with the only thing worn by the user 80 being one or more sensors 300.
- FIG. 1 illustrates an example where the head-mounted apparatus 200 connects to a local media player 500 over a local network 542 and the local media player 500 communicates with a remote server 512 over an external network 544.
- the remote server 512 houses the heuristics 525 and profiles 562.
- a user 80 of the system 100 is typically a human being, although it is possible to imagine instances where the user 80 could be another species of animal, or even a robot, artificial intelligence, expert system, or other similar form of automated intelligence.
- the system 100 captures sensor feedback 310 (which can also be referred to as sensor metrics 310) from a specific user 80 in creating, developing, and updating one or more profiles 562 for that specific user 80.
- a media experience 600 is typically media content stored in some format cognizable to the system 100 and in some device or component that is accessible to the system 100. In many if not most instances, the media experience 600 is something originating from outside the user 80 and outside the head-mounted apparatus 200. Examples of media experiences 600 can include but are not limited to: films, television programs, songs, video games, music videos, computer programs, virtual reality applications, books on tape, e-books, e-magazines, and other forms of media interactions.
- the media experiences 600 of the system 100 can be modified by the system 100, tailoring the media experiences 600 to a specific user 80 based on the profile 562 associated with the specific user 80 and the various heuristics 525 used to develop a profile 562 for the user based on sensor feedback 310 and other data that the system 100 is cognizant of.
- the modified media experience 600 that can be made available by the system 100 is not to be confused with a prior art virtual reality technology in which the movement of the head triggers a change in the point of view of the user.
- Such technologies modify the view of the user 80 on the basis of the layout or floor plan of the virtual reality space, and not the emotional or physiological responses of the user 80.
- an actionable user profile 562 is dependent upon the meaningful collection of data pertaining to the user 80.
- data is captured by one or more sensors 300.
- the data captured by a sensor 300 can be referred to as sensor feedback 310, sensor readings 310, or sensor metrics 310.
- As illustrated in Figure 1g, a wide variety of different sensors 300 can be incorporated into the functionality of the system 100.
- a single embodiment of the system 100 can incorporate one or more different sensors 300 of one or more different types.
- Some sensors 300 may be physically embedded in a head-mounted display apparatus 210 (a head-mounted media access device that includes the capability of delivering a video component, unlike a standard headphone) as displayed in Figure 1d, but other sensors may be physically separate from such an apparatus 200.
- Figure 1g provides examples of different subsets of sensors 300 that are not mutually exclusive (for example, an authentication-based sensor 309 will often be an image-based sensor 301) and are not exhaustive (i.e. other types of sensors 300 can be used).
- the apparatus 200, such as a display apparatus 210, used to "play" or otherwise make the media experience 600 accessible to the user 80 provides an excellent opportunity to position certain types of sensors 300, particularly sensors 300 that capture measurements relating to the eye(s) of the user 80.
- One prominent category of feedback from the sensors (i.e. sensor feedback 310) is eyelid timing, retina tracking, and other near-eye sensor metrics.
- The individualized nature of the head-mounted apparatus 200, the possibility of definitively establishing the identity of the user 80 based on biometric sensor readings (such as a retina scan or an iris scan), and the ability to closely monitor the eye activity of a user 80 while the user 80 is actively involved in a media experience 600 together provide a valuable opportunity to model the behavior of the user 80 and to have that model embodied in a profile 562 that is accessible to the system 100.
- Conventional data from surveys filled out by the user 80 and other data relating to the user 80 can be used to further augment the profile 562 of the user 80.
- Sensor feedback 310 can also include biometric information used to definitively identify the user 80 to ensure that the correct profile 562 is being updated.
- Examples of biometric sensor feedback 310 can include but are not limited to iris scans and retina scans.
- One way to delineate between different types of sensors 300 that can be incorporated into the system 100 is to distinguish the sensors 300 based on the phenomenon being sensed.
- An image-based sensor 301, which would include still frame cameras, video cameras, retina scanners, etc., operates through the collection of images.
- the sensors 300 used by the system 100 can include a temperature-based sensor 302, a motion-based sensor 303, a force-based sensor 304, a conductance-based sensor 305, and a sound-based sensor 306.
- Another way to delineate between different types of sensors 300 that can be incorporated into the system 100 is to distinguish the sensor 300 based on what is being assessed.
- Such categories can include but are not limited to an attention-based sensor 307 which captures information about attentiveness and/or the focal point of attention, a physiology-based sensor 308 used to capture physiological data about the user 80, and an authentication-based sensor 309 which captures information that authenticates the identity of the user 80.
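The two overlapping ways of categorizing sensors 300 described above, by phenomenon sensed (301-306) and by what is being assessed (307-309), can be made concrete with a small sketch. The class names and the pairing of a sensor with one entry from each axis are illustrative assumptions.

```python
# A minimal sketch (names assumed, not from the disclosure) of the two
# orthogonal sensor taxonomies: phenomenon sensed vs. what is assessed.
from dataclasses import dataclass
from enum import Enum

class Phenomenon(Enum):
    IMAGE = 301
    TEMPERATURE = 302
    MOTION = 303
    FORCE = 304
    CONDUCTANCE = 305
    SOUND = 306

class Assessment(Enum):
    ATTENTION = 307
    PHYSIOLOGY = 308
    AUTHENTICATION = 309

@dataclass
class Sensor:
    name: str
    phenomenon: Phenomenon
    assessment: Assessment

# The categories are not mutually exclusive: a retina scanner is both
# image-based (by phenomenon) and authentication-based (by assessment).
retina_scanner = Sensor("retina scanner", Phenomenon.IMAGE,
                        Assessment.AUTHENTICATION)
```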
- FIG. 1 is a diagram illustrating an example of a sensor 300 embedded in the apparatus 200.
- the sensor 300 captures sensor feedback 310 for use by the system 100, and the system 100 supplies media experiences 600 to the apparatus 200 that can be selectively influenced by the sensor feedback 310.
- a profile 562 is an aggregate representation of the user 80 within the system 100.
- a user 80 can interact with the system 100 and potentially other users 80 of the system 100 through a profile 562.
- a profile 562 is associated with one or more users 80.
- a single user 80 can have more than one profile 562.
- To identify examples of profiles 562, one need only look at popular websites such as Amazon.com, Facebook.com, eBay.com, Google.com, etc. to understand the wide diversity of profiles 562 that can exist.
- Profiles 562 can include virtually any information pertaining to the user 80 including the preferences of the user 80 and the totality of a user's history with the system 100.
- a heuristic 525 is a problem solving methodology, an analytical process, or similar approach that can be implemented by a processor 521 of a computer 510 that is running an application 524.
- Heuristics 525 can include algorithms but are not limited to algorithms.
- the system 100 can include a wide variety of heuristics 525 pertaining to the tracking of eye movements, the timing of eye movements, other eye-related data, and other data pertaining to the user 80 in evaluating the interests, desires, and preferences of users 80.
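One hypothetical example of such a heuristic 525 is converting gaze-tracking samples into a measure of where the user's attention dwelt. Nothing below comes from the disclosure; the region names and the simple dwell-time fraction are assumptions used only to show the general idea.

```python
# Illustrative dwell-time heuristic: each gaze sample names the screen
# region the user 80 was looking at; the score is the fraction of
# viewing time spent on each region.

def interest_scores(gaze_samples: list[str]) -> dict[str, float]:
    """Convert a stream of per-sample gaze regions into the fraction
    of viewing time spent on each region."""
    total = len(gaze_samples)
    scores: dict[str, float] = {}
    for region in gaze_samples:
        scores[region] = scores.get(region, 0.0) + 1.0 / total
    return scores

# 20 gaze samples taken while a media experience plays.
samples = ["ad_banner"] * 3 + ["main_feature"] * 12 + ["off_screen"] * 5
scores = interest_scores(samples)
# main_feature receives 12/20 = 0.6 of the viewing time
```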
- a head-mounted apparatus 200 is potentially any device, capable of being worn on the head of the user 80, that can deliver media experiences 600 to the user 80.
- the apparatus 200 used by the system 100 will typically include video and/or audio capabilities.
- An apparatus 200 with the ability to convey a visual component can also be referred to as a display apparatus 210 or a head-mounted display apparatus 210.
- the display apparatus 210 can also include a sensor 300 for capturing sensor readings (i.e. sensor feedback 310) that pertain to information about the user 80.
- In some embodiments, the apparatus 200 is itself the media player 500.
- In other embodiments, the media player 500 is a separate device such as a TV, DVD player, computer 510, cable box, or similar device.
- Servers 512 are computers 510 that can enable a wide variety of client devices to participate in the system 100.
- One way to make the system 100 available to multiple users 80 and to provide users 80 with a high degree of portability in terms of their profiles 562 is to house much of the data 560 and computer 510 components necessary for the functionality of the system 100 on remote servers 512 that can be accessed by multiple users 80 simultaneously, regardless of the locations of those users 80 at the time.
- Some embodiments of the system 100 can operate in a variety of different operating modes 600. Different embodiments of the system 100 can include different operating modes 600 and different abilities of users 80 to select or otherwise impact the operating mode 600 that applies to a particular media experience 600.
- the universe of potential operating modes 600 that can be offered by the system 100 can include a "record only" mode 660, a "modify only" mode 680, a "record and modify" mode 670, and an "off" mode 690.
- In this operating mode 660, the system 100 captures information about the user 80, but such information is not used to impact the media experience 600 being delivered to the user 80 at that time.
- a "record only" operating mode 660 the user 80 is exposed to the same media experience 600 as other users 80 even though such other users 80 may possess vastly different profiles 562 in terms of tastes, preferences, histories, etc.
- This operating mode 660 supports the building and updating of a profile 562 associated with the user 80, but the profile 562 is not used to impact the delivery of the media experience 600 currently being delivered to the user 80.
- Some embodiments of the system 100 can distinguish between a "full record” mode 662 where all possible data is collected and stored in contrast to a "partial record” mode 664 where certain data is excluded from the process of storing sensor readings 310 captured by the sensor(s) 300.
- In this operating mode 680, what one user experiences can be totally different from what another user experiences, even if they both select the same media content 600.
- a "modify only” mode 680 no data from the user experience is stored by the system 100. Users 80 may choose a "modify only” mode 680 when trying something new, if they are not feeling well, or to protect their privacy with respect to a particular matter.
- Some embodiments of the system 100 may differentiate between different magnitudes of content modification, including a "full modify” mode 682 as well as any number of “partial modify” modes 684 that can range from 1% of the modification capabilities to 99%.
- Many multi-mode systems 100 will include a mode of "record and modify" 670. As illustrated in Figure 1f, there are several variations of the record & modify mode 670 that can be implemented by the system 100, including a "full record/full modify" mode 672, a "full record/partial modify" mode 676, a "partial record/partial modify" mode 674, and a "partial record/full modify" mode 678.
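The mode variations described above amount to two independent settings, one for recording and one for modification, each of which can be off, partial, or full. The sketch below (an assumed encoding, not part of the disclosure) pairs the two settings and maps them onto the numbered modes named in the text.

```python
# Hypothetical encoding of the operating-mode lattice. The element
# numbers follow the text; the Level enum and pairing are assumptions.
from enum import Enum

class Level(Enum):
    OFF = "off"
    PARTIAL = "partial"
    FULL = "full"

MODE_LABELS = {
    (Level.FULL, Level.OFF): "full record only (662)",
    (Level.PARTIAL, Level.OFF): "partial record only (664)",
    (Level.OFF, Level.FULL): "full modify only (682)",
    (Level.OFF, Level.PARTIAL): "partial modify only (684)",
    (Level.FULL, Level.FULL): "full record/full modify (672)",
    (Level.PARTIAL, Level.PARTIAL): "partial record/partial modify (674)",
    (Level.FULL, Level.PARTIAL): "full record/partial modify (676)",
    (Level.PARTIAL, Level.FULL): "partial record/full modify (678)",
    (Level.OFF, Level.OFF): "off (690)",
}

def mode_label(record: Level, modify: Level) -> str:
    """Look up the named operating mode for a (record, modify) pair."""
    return MODE_LABELS[(record, modify)]
```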
- Analogous concerns may exist on the modify side of the system 100 as well.
- Fully dynamic media experiences 600 such as ads can be highly desirable. Two viewers of the same television program could be subjected to radically different advertisements based on the profiles associated with the users. Not only could different users be subject to completely different ads, but the ads for the same product by the same advertiser could have variations tailored to the specific tastes, preferences, and histories of the user.
- the user 80 may want to limit the customization of their media experience 600, or the content provider may provide fewer options for customization.
- the modification of the media experience delivered to a user 80 in response to a profile 562 or model associated with the user is distinguishable from navigating a 3-D space through the use of a head tracker that modifies a point of view.
- the modifications by the system 100 are based on emotional and physiological responses of the user 80 that are embodied in a profile 562 or model of that user 80, i.e. such modifications are based on the tastes, preferences, and history of the user.
- sensor feedback 310 is captured by the sensor 300 in the apparatus 200.
- one or more profiles 562 corresponding to the user 80 are updated based on the sensor feedback 310 pertaining to that user 80.
- the media experience 600 for that user 80 can be selectively and dynamically modified based on the updated profile 562 corresponding to the user 80.
- Table 1 below provides a chart of element numbers, element names, and element descriptions.
- System An aggregate configuration of IT components 500 and data 600 that includes a head-mounted apparatus 200 for interfacing with a user 80.
- the system 100 can collectively act as an interface between the user 80 wearing a head-mounted apparatus 200 and an IT environment 400.
- the system 100 can include an authentication subsystem 110, an eye tracking subsystem 120, a user tracking subsystem 130, a customized media subsystem 140, and/or potentially other subsystems.
- Authentication Subsystem A subset of the system 100 that authenticates a user 80.
- the authentication subsystem 110 can use one or more sensor readings 310 captured from one or more sensors 300 to authenticate the identity of the user 80. This typically involves linking the user 80 to a profile 620.
- Eye Tracking Subsystem A subset of the system 100 that can track the eye 82 and eye attributes 85 of the user 80 wearing the head-mounted apparatus 200.
- the user tracking subsystem 130 can include the eye tracking subsystem 120.
- User Tracking Subsystem A subset of the system 100 that can track the user 80 wearing the head-mounted apparatus 200.
- Subsystem experience 610 with other users 80 is provided.
- Virtual Space A virtual space that can be navigated by one or more users 80.
- Head-Mounted Apparatus A device capable of being worn on the head 86 of a user 80 that can deliver media experiences 610 to the user 80. The device can also include one or more sensors 300 that capture sensor readings 310.
- Different embodiments of the apparatus 200 can provide for delivering media experiences 610 utilizing the different senses of the user 80, such as visual content 611, sound content 613, haptic content 614, olfactory content 615, and/or taste content 616.
- Head-Mounted Display Apparatus A head-mounted apparatus 200 that includes a display component 220. Conventional headphones could not serve as a head-mounted display apparatus 210.
- the display component 220 can be a display screen, a virtual retina display (see U.S. Patent Numbers 8,982,014 and 9,076,368), or any other display technology known in the prior art.
- Sensor A device embedded in or otherwise in communication with the apparatus 200 or other system 100 components. There are a wide variety of different types and categories of sensors 300 that can be utilized by the system 100. Any sensor 300 used in the prior art can potentially be used or adapted for use in the system 100. Sensors 300 can be integrated into the head-mounted apparatus 200, other devices in the system 100, and/or free standing devices that are merely in communication with other components of the system 100. Sensors 300 of a wide variety of sensor types 302 can be utilized within the system 100.
- Sensor Types Sensors can be categorized in terms of the technology that they use (a camera 320 is based on capturing light; a microphone 326 is based on capturing sound) or the attribute of the user 80 that is captured.
- Cameras 310 can utilize a wide variety of different ranges of spectral light, including but not limited to the visual spectrum, infrared, and ultraviolet, as well as generate images from non-light based sources, such as ultrasound.
- a retina scanner 312 is typically a camera 310, and it is also an example of a biometric sensor 320.
- a retina scanner 312 captures an image of a human retina that can be used for the purposes of identification.
- Fingerprint Scanner A fingerprint scanner 313 is typically a camera 310, and it is also an example of a biometric sensor 320. A fingerprint scanner 313 captures an image of a human fingerprint that can be used for the purposes of identification.
- Video Camera A camera 310 that rapidly captures a sequence of still frame images that, when played rapidly in succession, conveys to the viewer a sense of motion.
- Microphone A sensor 300 that captures sound.
- the sounds captured by a microphone can include information pertaining to the source of the sound (distinguishing one spoken voice from another) as well as containing information explicitly through the sound itself (such as a spoken instruction or password).
- a voice print scanner 318 is an example of a biometric sensor 320, although part of the validation can include a user ID or password component.
- Biometric Sensor A sensor 300 that measures and analyzes biological attributes. Biometric attributes can include but are not limited to DNA, fingerprints, eye retinas, eye irises, voice patterns, facial recognition, and hand measurements.
- Motion Sensor A sensor 300 that captures information pertaining to movement.
- Gait Scanner or Walk Scanner A motion sensor 330 that captures information about the walk (i.e. gait) of an individual.
- Mouse Scanner A motion sensor 330 that captures information about the mouse movements of an individual.
- Physiology Sensor A sensor 300 that measures one or more physiological attributes of a human being, such as weight, heart rate, blood pressure, height, etc.
- Heart Rate Monitor A type of physiology sensor 344 that captures information about the heart rate of the user 80.
- Blood Pressure Monitor A type of physiology sensor 344 that captures information about the blood pressure of the user 80.
- Location Sensor or Position Sensor A type of sensor 300 that captures information about location and/or position. GPS is an example of a location sensor 348.
- Sensor include biometric data such iris scans and retina scans for
- a user 80 may visit a particular website to make a purchase, in which case the IT components 500 and data 600 utilized to communicate between the user 80 and the seller make up the applicable IT environment 400.
- Whether the activity is performing an online banking transaction, making a purchase, communicating with friends, watching movies, playing video games, or working on a collaborative project, the interaction occurs through an applicable IT environment 400.
- Laptop computer 521
- Processor A microprocessor or central processing unit that provides for the running of computer programs/software applications 524.
- Memory A component accessible to the processor 521 where data 560 is stored. The memory component 512 is often referred to as RAM, random access memory.
- Application An application 524 can also be referred to as a computer program. An application 524 is a unit of computer code or a set of instructions that are capable of being implemented or run by the processor 521.
- a wide variety of different programming languages can be used to create the application(s) 524 used by the system 100. There are a voluminous number of different types of programming languages including but not limited to: object-oriented languages, assembly languages, fourth-generation languages, visual languages, XML-based languages, and hardware description languages.
- Heuristic A problem solving methodology, an analytical process, or similar approach that can be implemented by a processor 521 running an application 524. Heuristics 525 can include algorithms but are not limited to algorithms.
- Database A comprehensive collection of data 560 that is organized for convenient access.
- the system 100 can incorporate the capabilities of any type of database 526.
- databases 526 include but are not limited to: relational databases, object-oriented databases, navigational databases, and NoSQL databases.
- GUI graphical user interface
- a GUI will typically include virtual menus, buttons, editable fields, drop down list boxes, etc. that users 80 can use to interact with the computer 510.
- the system 100 is not limited to the use of GUIs or other typical computer-oriented interfaces as user interfaces 528.
- Network adapters 529 can operate wirelessly or be hardwired.
- A wide variety of different networks 540 can be incorporated into the system 100. Examples of network configurations include but are not limited to bus networks, star networks, ring networks, mesh networks, and tree networks. Different networks 540 can utilize different communication protocols and operate at different geographic scales.
- Local Network A network 540 that is limited in scope to a particular geographic location, such as a home or office.
- Common examples of local networks include personal area networks, local area networks, home area networks, and storage area networks.
- Media Player A device, such as a disc player or cable box, that provides for the playing of media content.
- Data 560 can represent facts, statistics, and other forms of information.
- Data 560 can include profiles 562, history 564, instructions 566, as well as social media 568.
- Media Experience or Media Content A media experience 610 can engage one or more senses of the user 80, such as through visual content 611.
- Visual Content An aspect of the media experience 610 that is experienced through the sense of sight.
- Video Content An image component 611 that is in the form of video.
- a common example of a haptic component 614 is vibration.
- Profiles 562 An association between an identity of a user 80 and a set of data 600.
- a user 80 can interact with the system 100 and other users 80 of the system 100 through a profile 620.
- the profile 620 is associated with one or more users 80.
- a single user 80 can have more than one profile 620.
- To identify examples of profiles 562, one need only look at popular websites such as Amazon.com, Facebook.com, eBay.com, Google.com, etc. to understand the wide diversity of profiles 562 that can exist.
- Profiles 562 can include virtually any information pertaining to the user 80 including the preferences of the user 80 and the totality of a user's history with the system 100.
- User History A subset of data 600 that pertains to the history of a user 80 with an IT environment 400.
- the historical data 622 of a user 80 can be valuable information for the processing of the system 100. So can the historical data 622 for the aggregated community of users 80.
- User ID A unique identifier that corresponds to a specific user 80.
- a specific user 80 may have more than one ID 624, but an ID 624 pertains to only a specific user 80.
- Group ID A unique identifier that corresponds to a specific group of users 80.
- the system 100 can also be used to affirmatively trigger instructions 566.
- the system 100 can be used to initiate automated or semi-automated activities.
- Social media 568 are a substantial avenue for creating and sharing data 560.
- the system 100 can be integrated to work with social media 568.
- Method The method 900 involves linking a user 80 to a profile 620.
- a record only mode 660 can be a full record mode 662 where every possible sensor reading 310 is captured and stored, or a partial record mode 664 where certain readings 310 are not captured and stored.
- the record and modify mode 670 can include a full record/full modify mode 672, a partial record/partial modify mode 674, a full record/partial modify mode 676, and a partial record, full modify mode 678.
- the modify only mode 680 can be a full modify mode 682 or a partial modify mode 684.
Landscapes
- Engineering & Computer Science (AREA)
- Computer Security & Cryptography (AREA)
- Theoretical Computer Science (AREA)
- Computer Hardware Design (AREA)
- Software Systems (AREA)
- Physics & Mathematics (AREA)
- General Engineering & Computer Science (AREA)
- General Physics & Mathematics (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
A system (100), apparatus (200), and method (900) that provide for acting as an interface between a user (80) and an IT environment (400). The system (100) can identify an authenticated ID (690) pertaining to the user (80) that is associated with a confidence metric (663). The eye-attributes (85) of the user (80) can be tracked to assess the preferences of the user (80) when engaging in a media experience (610) and other activities. A profile (620) developed to model the likes and dislikes of the user (80) can be used in a variety of useful contexts.
Description
SYSTEM, APPARATUS, AND METHOD FOR INTERFACING BETWEEN A USER AND AN IT ENVIRONMENT
RELATED APPLICATIONS
[0001] This PCT application claims priority to the following U.S. provisional patent applications, which are hereby incorporated by reference in their entirety: (A) "APPARATUS, SYSTEM, AND METHOD FOR CAPTURING A USER'S RESPONSE TO MEDIA" (Serial Number 62/041,019) filed on August 22, 2014; and (B) "APPARATUS, SYSTEM, AND METHOD FOR USER AUTHENTICATION" (Serial Number 62/041,008) filed on August 22, 2014.
BACKGROUND OF THE INVENTION
[0002] The invention is a system, apparatus, and method for providing an interface between a user and an IT environment (collectively, the "system").
[0003] The human-IT environment interface is inadequate in many different respects, whether the issue is authenticating that a person is who they say they are, or whether the issue is being able to tailor and customize the IT environment to a particular person. The conventional interface between man and machine misses many opportunities to improve the interface between a human being and an IT environment.
[0004] I. USER AUTHENTICATION
[0005] Networks such as the Internet, the World Wide Web, and other public and semi-public networks provide people, businesses, and other organizations an unprecedented ability for interactions. Unfortunately, the ability and utility of connecting with others is sometimes undermined by the possibility of fraud, particularly in the context of a user pretending to be someone who they are not.
[0006] Common techniques of user authentication are often inadequate because such techniques rely on information that is submitted by a user or because there is little incentive for the user to engage in the authentication process.
[0007] It would be desirable for users to be able to authenticate themselves more conveniently and for such authentication to be more accurate.
[0008] II. USER SATISFACTION/REACTION
[0009] Information pertaining to the consumer's emotional, physiological, and other forms of response (collectively, the "response") to the media experience can be captured, stored, accessed, and analyzed for a variety of different purposes and benefits. Such information can potentially even be used to dynamically impact the types and ways that future media experiences are delivered to users in general, or to the specific user in particular.
[0010] The consumption of media is exploding on a world-wide basis. Movies, television programs, video games, music, video clips, and other types of media content are accessed by billions of people around the world. Consumers of media can access a universe of ever increasing content through both traditional as well as new delivery channels. For example, movies are viewed in movie theaters, but they are also viewed through online streaming services, pay-per-view cable and satellite services, and optical discs such as DVDs, BLU-RAY® discs, and other similar media.
[0011] Despite the explosion in the size of the media industry and in the technological advancements used to deliver media to consumers, media playing technologies do a remarkably poor job at assisting users to interact with the content they like while avoiding the content that they do not enjoy. Prior art technologies do not anticipate the desires of users, and prior art media players fail to factor in the preferences, tastes, and desires of individual users in presenting media to those users.
[0012] Whether watching movies on cable TV, a sporting event through online streaming services, or a music video from an online video site, consumers of media are routinely forced to view commercials and other types of advertisements that are of no interest to the individual in question. Neither viewers nor advertisers benefit from the subjecting of viewers to advertisements in which they have no interest. A similar dynamic is in play with respect to the media selection process for consumers. There is a lot of media content that users would eagerly consume, but the prior art does little to selectively identify and deliver media that is tailored to the specific user.
[0013] It would be desirable if media delivery systems could assist in capturing actionable information about users.
SUMMARY OF THE INVENTION
[0014] The invention is a system, apparatus, and method for providing an interface between a user and an IT environment (collectively, the "system").
[0015] The system can be implemented utilizing one or more of the following subsystems: (1) user authentication; (2) eye tracking; and/or (3) user tracking.
BRIEF DESCRIPTION OF THE DRAWINGS
[0016] Many features and inventive aspects of the system are illustrated in the various drawings. No patent application can expressly disclose, in words or in drawings, all of the potential embodiments of an invention. In accordance with the provisions of the patent statutes, the system, apparatus, and method are explained and illustrated in certain preferred embodiments. However, it must be understood that the system, apparatus, and method may be practiced otherwise than is specifically explained and illustrated without departing from its spirit or scope.
[0017] The description of the system, apparatus, and method provided above and below should be understood to include all novel and non-obvious alternative combinations of the elements described herein, and claims may be presented in this or a later application to any novel non-obvious combination of these elements. Moreover, the foregoing embodiments are illustrative, and no single feature or element is essential to all possible combinations that may be claimed in this or a later application.
[0018] Figure 1a is a block diagram illustrating (i) an example of a system that can be used to generate an authentication ID from a user (i.e. authenticate a user) and (ii) certain component elements that can comprise such a system, such as a sensor, a sensor reading captured from the sensor, a heuristic process for selectively identifying an authenticated ID from the various inputs, a computer, and other types of relevant data.
[0019] Figure 1b is a hierarchy diagram illustrating an example of different kinds of biometric sensor readings and different kinds of behavioral sensor readings.
[0020] Figure 1c is an input/output diagram illustrating an example of an authentication ID and a corresponding confidence metric being identified by a heuristic of the system from the inputs, which can include one or more sensor readings, a login password submission, a certificate, and potentially other data.
[0021] Figure 1d is a hierarchy diagram illustrating an example of different types of authentication IDs, including but not limited to a continuous authenticated ID, a periodic authenticated ID, and a discrete authenticated ID.
[0022] Figure 2a is a block diagram illustrating an example of a user authentication system that utilizes sensor readings captured from a sensor located on a head-mounted display apparatus to link to an authenticated ID on a server.
[0023] Figure 2b is an input/output diagram involving a head-mounted display apparatus.
[0024] Figure 2c is a flow chart diagram illustrating an example of how a user with a head-mounted display apparatus can be authenticated.
[0025] Figure 3a is a block diagram illustrating an example of a head-mounted display apparatus as an interface between a user and a media experience.
[0026] Figure 3b is a block diagram illustrating an example of a sensor as an interface between a user and a sensor feedback.
[0027] Figure 3c is a block diagram illustrating an example of a head-mounted display apparatus as an interface between a user and an instruction.
[0028] Figure 4a is a block diagram illustrating an example of components that can be included in a computer.
[0029] Figure 4b is a block diagram illustrating an example of different types of computers.
[0030] Figure 4c is a block diagram illustrating an example of different components of the system that can include computers.
[0031] Figure 4d is a block diagram illustrating an example of a variation of Figure 4c that does not include a remote server.
[0032] Figure 4e is a block diagram illustrating an example of a variation of Figures 4c and 4d that does not include a media player that is separate from the head-mounted display apparatus.
[0033] Figure 4f is a block diagram illustrating an example of different types of data that can be stored on a database used by the system.
[0034] Figure 5a is a block diagram illustrating an example of sensor feedback being captured by a head-mounted display apparatus and then used by one or more heuristics to impact the media experience sent to the user of the head-mounted display apparatus.
[0035] Figure 5b is a block diagram illustrating an example of a variation of Figure 5a in which the applicable profile storage and heuristic processing is performed in a local media player instead of a remote server.
[0036] Figure 5c is a block diagram illustrating an example of a variation of Figures 5a and 5b in which there is no local media player interfacing between the head-mounted display apparatus and the remote server.
[0037] Figure 5d is an input/output diagram for a head-mounted display apparatus indicating that the sensor feedback is an output of an embedded sensor and the media experience is an input for the head-mounted display apparatus.
[0038] Figure 5e is a flow chart diagram illustrating an example of a method for dynamically influencing the delivery of a media experience.
[0039] Figure 5f is a block diagram illustrating an example of different operating modes that can be incorporated into the processing performed by the system.
[0040] Figure 5g is a block diagram illustrating an example of the different types of sensors that can be used to capture data about a user.
[0041] Figure 6 is a block diagram illustrating a system with a head-mounted display apparatus acting as an interface between a user and an IT environment.
DETAILED DESCRIPTION
[0042] The invention is a system, apparatus, and method for providing an interface between a user and an IT environment (collectively, the "system").
[0043] Table 1 below provides a glossary/index of claim element numbers, claim element names, and claim element descriptions.
[0044] I. OVERVIEW
[0045] Figure 1a is a block diagram illustrating (i) an example of a system 100 that can be used to generate an authentication ID 590 from a user 80 (i.e. authenticate a user) and (ii) certain component elements that can comprise such a system 100, such as a sensor 300, a sensor reading 310 captured from the sensor 300, a heuristic process 525 for selectively identifying an authenticated ID 590 from the various inputs, a computer 510, and potentially other types of relevant data 560.
[0046] As disclosed in Figure 1b, the system 100 can utilize sensor readings 310 to authenticate a user 80. User authentication can also allow the system 100 to authenticate desired user activity, in addition to user identity. As illustrated in Figure 1c, authentication can be based on a wide variety of different sensor readings 310, login/password submissions 580, certificates such as PKI certificates 581, and potentially other data 560 that is cognizable to the system 100. Authentication processing can provide opportunities for accelerated processing analogous to the "single-click" purchasing methodology of Amazon or the "PreCheck" program of the Transportation Security Administration. Reliable authentication can provide for accelerated or at least more convenient processing in a wide variety of different contexts.
[0047] User authentication opens new opportunities for users to convey acceptance of business transactions beyond use of a mouse, although mouse use can still play a role in the authentication process. The system can provide an opportunity for integrated authentication through user behavior.
[0048] User authentication is not limited to an individual process. The system can perform multiple authentication activities or exchange authentication information with other devices and sensors. User authentication processes can be based on biometric information (such as a fingerprint, voice recognition, facial recognition, or retina scan), user behavior (how the person moves a mouse, how they walk, etc.), a login/password submission, and/or tokens (hardware or software keys embedded in hardware, such as ATM or credit cards). Different authentication heuristics can be associated with different confidence intervals. For example, a retina scan can be associated with a greater degree of confidence and accuracy than a login/password process. Some user authentication activities can be continuous, such as authenticating a user 80 based on how they move a mouse or based on how they walk. Other authentication activities are discrete, such as a retina scan or a login/password process. Discrete authentication activities can be repeated multiple times if the triggering event/activity is repeated. For example, in the context of a head-mounted display apparatus 200 illustrated in Figure 2a, the apparatus 200 can be configured to automatically re-authenticate the user 80 every so often, or immediately prior to certain events such as the display of certain sensitive information.
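The association of different authentication heuristics with different confidence levels can be sketched in a few lines of Python. This is an illustrative sketch only; the factor names, confidence values, and acceptance threshold below are assumptions chosen for the example and are not taken from the specification.

```python
# Hypothetical confidence levels for individual authentication factors.
# Biometric factors (e.g. a retina scan) carry more confidence than a
# bare login/password process, mirroring the discussion above.
FACTOR_CONFIDENCE = {
    "retina_scan": 0.99,
    "fingerprint": 0.95,
    "gait_analysis": 0.80,    # behavioral, continuous but weaker
    "mouse_movement": 0.70,   # behavioral, continuous but weaker
    "login_password": 0.60,   # weakest single signal
}

def combined_confidence(matched_factors):
    """Probability that at least one matched factor is genuine,
    assuming the factors fail independently."""
    p_all_fail = 1.0
    for factor in matched_factors:
        p_all_fail *= 1.0 - FACTOR_CONFIDENCE[factor]
    return 1.0 - p_all_fail

def authenticate(matched_factors, threshold=0.90):
    """Issue an authentication decision only if the combined
    confidence clears the (assumed) threshold."""
    confidence = combined_confidence(matched_factors)
    status = "authenticated" if confidence >= threshold else "rejected"
    return status, confidence
```

Under these assumed values, a login/password alone is rejected, while combining it with one or more behavioral factors can clear the threshold — illustrating how weak continuous signals can reinforce a discrete authentication event.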
[0049] A. System
[0050] The system 100 can be implemented in the configuration of a single integrated standalone device. However, it will often be more desirable for the system 100 to be implemented in less integrated and more modular configurations. There are many contexts in which the ability to authenticate a user 80 is potentially useful. Such contexts can include but are not limited to: initiating business or financial transactions; accessing financial or other proprietary information; tracking user behavior for the purposes of modeling the behavior of a particular user;
communicating with others; and other contexts where it is important to have some magnitude of confidence that a particular user 80 is who he or she says that he or she is.
[0051] It is anticipated that the system 100 can be configured to accommodate users 80 in different physical locations using different devices for different purposes. For example, the system 100 could involve the use of an embedded retina scanner at an ATM located within a bank, a head-mounted display apparatus located within the home, and a general purpose desktop computer while at the office. The greater the scope of different authentication technologies that can be incorporated into the system 100, the more accurate and more useful the system 100 can become.
[0052] The original inspiration for the conception of the system 100 is illustrated in Figure 2b, which illustrates an example of a head-mounted display apparatus 200 that provides for delivering a media experience 600 to the user 80.
[0053] B. User
[0054] A user 80 is the subject or target of activity by the system 100 to be authenticated. Typically a user is a human being who voluntarily interacts with the system 100 for the purposes of being authenticated by the system 100. However, the system 100 can be implemented in ways that could be useful to other types of animals and potentially even robots. The system 100 could also be used to authenticate users 80 without the knowledge of the applicable user 80.
[0055] C. Authenticated ID
[0056] An authenticated ID 590 is an output of the system 100. The authenticated ID 590 is what allows the system 100 to communicate to third parties such as other users 80, businesses, government agencies, and other organizations that the user 80 is who the user 80 says he or she is.
[0057] The system 100 can provide a convenient and efficient way to authenticate the identity of users 80, i.e. link a specific user 80 to a specific authenticated ID 590 that can be relied upon by the system 100 and by other systems, applications, and entities that are interested in interacting with the user 80, provided that the identity of the user 80 can be verified.
[0058] Figure 1b is a hierarchy diagram illustrating an example of different kinds of sensor readings 310, such as biometric sensor readings 320 and different kinds of behavioral sensor readings 330, that can be used to selectively identify the authenticated ID 590 for a particular user 80.
[0059] Figure 1c is an input/output diagram illustrating an example of an authentication ID 590 and a corresponding confidence metric 563 being identified by a heuristic 525 of the system 100 from inputs which can include one or more sensor readings 310, a login/password submission 580, a certificate 581, and potentially other data 560.
[0060] Figure 1d is a hierarchy diagram illustrating an example of different types of authentication IDs 590, including but not limited to a continuous authenticated ID 591, a periodic authenticated ID 592, and a discrete authenticated ID 593.
[0061] 1. Continuous Authenticated ID
[0062] A continuous authenticated ID 591 is an authenticated ID 590 that is re-authenticated on a continuous or at least substantially continuous basis. For example, in the context of a head-mounted display apparatus 200, a sensor 300 in the apparatus 200 could be configured to capture and transmit sensor readings 310 on a continuous basis while the apparatus 200 is being used to deliver a media experience 600 to the user 80.
[0063] 2. Periodic Authenticated ID
[0064] Returning to Figure 1d, a periodic authenticated ID 592 is an authenticated ID 590 that is re-authenticated on a predefined periodic basis. So for example, a retina scan 322 could be re-obtained each time the predefined cycle time has lapsed.
[0065] 3. Discrete Authenticated ID
[0066] A discrete authenticated ID 593 is an authenticated ID 590 that is captured in relation to a discrete period of time triggered by the initiation of a specific event or activity.
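The three lifetimes of an authenticated ID 590 described above (continuous 591, periodic 592, and discrete 593) can be sketched as a simple re-authentication policy. The class name, the triggering event names, and the use of a monotonic timer are illustrative assumptions, not part of the specification:

```python
import time

class AuthenticatedID:
    """Toy model of when an authenticated ID must be re-verified."""

    def __init__(self, mode, period_seconds=None):
        self.mode = mode              # "continuous", "periodic", or "discrete"
        self.period = period_seconds  # only meaningful for "periodic"
        self.last_check = time.monotonic()

    def needs_reauthentication(self, event=None):
        if self.mode == "continuous":
            # Re-verify on every sensor reading.
            return True
        if self.mode == "periodic":
            # Re-verify once the predefined cycle time has lapsed.
            return time.monotonic() - self.last_check >= self.period
        # Discrete: re-verify only when a specific sensitive event recurs
        # (hypothetical event names for illustration).
        return event in {"display_sensitive_info", "financial_transaction"}
```

A discrete ID is thus re-obtained as often as the triggering activity is repeated, while a continuous ID is effectively re-obtained with every reading.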
[0067] D. Sensors and Sensor Readings
[0068] A sensor 300 is a device used to capture information. The system 100 can utilize a wide variety of different sensors 300 that can be useful to the authentication of users 80. The potential diversity of sensors 300 that can be utilized by the system 100 is commensurate with the potential diversity of sensor readings 310 that can be utilized by the system 100. Figure 1b is a hierarchy diagram illustrating an example of different categories and subcategories of sensor readings 310.
[0069] 1. Biometric Sensor Reading
[0070] A biometric sensor reading 320 is a sensor reading 310 that is based on a biological attribute of the user 80, such as a retina scan 322, a fingerprint scan 324, facial recognition, a voice print 326, and other identifiers that directly relate to who a person is.
[0071] 2. Behavioral Sensor Reading
[0072] A behavioral sensor reading 330 is a sensor reading 310 that is based on a behavioral attribute of the user 80, such as a voice print/scan 332, a walk scan 334, a mouse scan 336, and other techniques for authenticating a user 80 via user 80 behavior.
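The hierarchy of sensor readings 310 in Figure 1b lends itself to a small data model. The reading names below mirror the categories above, but the structure itself is an illustrative assumption rather than anything defined by the specification:

```python
from dataclasses import dataclass

@dataclass
class SensorReading:
    kind: str     # e.g. "retina_scan", "walk_scan"
    value: bytes  # raw captured data

# Biometric readings identify who a person is; behavioral readings
# identify how a person acts (names assumed for illustration).
BIOMETRIC_KINDS = {"retina_scan", "fingerprint_scan",
                   "facial_recognition", "voice_print"}
BEHAVIORAL_KINDS = {"voice_scan", "walk_scan", "mouse_scan"}

def classify(reading: SensorReading) -> str:
    """Place a reading into the Figure 1b hierarchy."""
    if reading.kind in BIOMETRIC_KINDS:
        return "biometric"
    if reading.kind in BEHAVIORAL_KINDS:
        return "behavioral"
    return "unknown"
```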
[0073] E. Confidence Metric
[0074] A confidence metric 563 is an indicator of the magnitude of confidence associated with a sensor reading 310 (as illustrated in Figure 1b) or an authentication ID 590 (as illustrated in Figure 1c). Some embodiments of the system 100 will not include confidence metrics 563. Confidence metrics 563 can be derived using the same heuristics 525 that selectively identify the authenticated ID 590.
[0075] Figure 2a is a block diagram illustrating an example of a user authentication system 100 that utilizes biometric scans 561 captured from a sensor 300 located on a head-mounted display apparatus 200 to link to an authenticated ID 563 on a server 512. Profiles 562 on the server 512 include the profile 562 and authenticated ID 563 associated with the biometric scan 561 that was captured as a sensor reading 310 by a sensor 300 included in the head-mounted display apparatus 200. A device that in the eyes of the user 80 may be primarily focused on making media experiences 600 available to the user 80 can thus be transformed into a powerful tool for user 80 authentication. The apparatus 200 can encourage entities 90 to engage in online transactions with users 80 because both parties will have confidence that fraudulent assumptions of identity are far less likely to succeed due to the use of biometric scans 561.
[0076] A. Head-Mounted Display Apparatus
[0077] The head-mounted display apparatus 200 can be implemented in a wide variety of different alternative embodiments using a wide variety of different components and component configurations. A head-mounted display apparatus 200 is potentially any device capable of being worn on the head of the user 80 that is capable of delivering media experiences 600 to the user 80. The apparatus 200 will typically involve a computer 510 with many of the components of a computer 510 that are discussed below. The apparatus 200 used by the system 100 will typically include both video and audio capabilities. In addition to having some type of screen, direct projection of images on the retinas of the user, or other similar display technology coupled with speakers or some other form of audio technology, the apparatus 200 can also include a sensor 300 for capturing biometric scans 561. Such an embodiment of the apparatus 200 is illustrated in Figure 2b. As a one-person-at-a-time device, the apparatus 200 provides an excellent opportunity for the reliable authentication of users 80 for any number of online transactions and other activities.
[0078] B. Users
[0079] Returning to Figure la, a user 80 of the system 100 is typically a human being, although it is possible to imagine instances where the user 80 could be another species of animal or even a form of robot or artificial intelligence, expert system, or other similar form of automated intelligence.
[0080] C. Networks
[0081] The system 100 can be implemented in a wide variety of different configurations using a wide variety of different networks 540. Networks 540 provide the infrastructure for sharing data 560 such as media experiences 600, but the system 100 can be implemented in such a fashion as to be agnostic as to which networks 540 are used.
[0082] D. Server
[0083] Many embodiments of the system 100 and apparatus 200 will involve one or more servers 512. Servers 512 are computers 510 that can enable a wide variety of client devices to participate in the system 100. Servers 512 that host the data 560 and software applications 524 that exist specifically and often exclusively for authentication purposes can be highly desirable.
[0084] E. Profiles
[0085] A profile 562 of a user 80 on the system 100 can include potentially everything about the user 80 that the system 100 is cognizant of, including all past history 564, all social media 568, and past instructions 566 provided by the user 80. For the purposes of authentication, a profile 562 includes an association of a biometric scan 561 and an authenticated ID 563.
[0086] F. Biometric Scans
[0087] A biometric scan 561 is a sensor reading of the eye of the user 80 taken by a sensor 300 in the apparatus 200. Examples of potential biometric scans 561 include retina scans and iris scans.
[0088] G. Authenticated ID
[0089] A certification that the outside world and entities 90 unrelated to the user 80 can have confidence in. The system 100 can link an authenticated ID 563 to a profile 562 and a biometric scan 561.
[0090] II. PROCESS-FLOW VIEW
[0091] Figure 2c is a flow chart diagram illustrating an example of how a user 80 with a head-mounted display apparatus 200 can be authenticated.
[0092] At 700, a sensor 300 on the apparatus 200 is used to capture a biometric scan 561, which is then submitted to a server 512 on which the scan 561 can be matched to an authenticated ID 563.
[0093] At 702, the biometric scan 561 is used to identify the corresponding authentication ID 563 by linking to that profile 562.
[0094] At 704, the user 80 can interact with outside entities 90 on the network 540 because those outside entities 90 have confidence in the authentication ID 563 issued to the user 80 on the basis of the biometric scan 561.
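The three steps above (700, 702, 704) can be sketched end-to-end in Python. The in-memory profile table and the matching of a scan by simple equality are deliberate simplifications for illustration; a real biometric match would compare templates with a tolerance rather than test byte equality:

```python
# Hypothetical server-side profile store: scan template -> (profile, ID).
PROFILES = {
    b"retina-template-alice": ("alice", "auth-id-001"),
}

def capture_scan():
    # Step 700: stand-in for the sensor 300 on the head-mounted apparatus.
    return b"retina-template-alice"

def match_scan(scan):
    # Step 702: link the biometric scan to a profile and its authenticated ID.
    entry = PROFILES.get(scan)
    return entry[1] if entry else None

def transact_with_entity(auth_id):
    # Step 704: an outside entity accepts the interaction only when
    # presented with a valid authenticated ID.
    return auth_id is not None

auth_id = match_scan(capture_scan())
```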
[0095] III. ADDITIONAL ELEMENTS
[0096] The functionality described above occurs in the context of an apparatus 200 that can provide users 80 with a wide variety of potential media experiences 600. Figure 3a is a block diagram illustrating an example of a head-mounted display apparatus 200 as an interface between a user 80 and a media experience 600. As illustrated in Figure 3b, one or more sensors 300 in the apparatus 200 can function as an interface between the user 80 and useful sensor feedback 310 that can assist in the processing performed by the system 100. As illustrated in Figure 3c, the apparatus 200 can also be used to transmit instructions 566 to the other components of the system 100.
[0097] A. Computers
[0098] Figure 4a is a block diagram illustrating an example of components that can be included in virtually any computer 510 positioned within the apparatus 200 or anywhere else within the system 100. A computer 510 is an assembly that includes a processor 521. A computer 510 can also include one or more of the following components: (a) a memory component 522; (b) a storage component 523; (c) a computer program/software application 524 that provides for implementing one or more heuristics 525; (d) a database 526; (e) a user interface 528; and (f) a network adapter 529.
[0099] 1. Processor
[00100] A microprocessor or central processing unit that provides for the running of computer programs/software applications 524.
[00101] 2. Memory Component
[00102] A component accessible to the processor 521 where data 560 is directly accessible by the processor 521 but not permanently stored. The memory component 522 is often referred to as RAM, or random access memory.
[00103] 3. Storage Component
[00104] A component that provides for the permanent storage of data 560.
[00105] 4. Application
[00106] An application 524 can also be referred to as a computer program 524 or a software application 524. An application 524 is a unit of computer code or a set of instructions that are capable of being implemented or run by the processor 521. A wide variety of different programming languages can be used to create the application(s) 524 used by the system 100. There are a voluminous number of different types of programming languages including but not limited to: object-oriented languages, assembly languages, fourth-generation languages, visual languages, XML-based languages, and hardware description languages.
[00107] 5. Heuristic
[00108] A problem solving, analytical process, or similar approach that can be implemented by a processor 521 running an application 524. Heuristics 525 can include algorithms but are not limited to algorithms.
[00109] 6. Database
[00110] A comprehensive collection of data 560 that is organized for convenient access. The system 100 can incorporate the capabilities of any type of database 526. Examples of databases 526 include but are not limited to: relational databases, object-oriented databases, navigational databases, and NoSQL databases.
[00111] 7. User Interface
[00112] An interface between the user 80 and the system 100. The most prominent example of a user interface 528 is a graphical user interface (GUI) that functions as a virtual layer closest to the physical aspects of the client device 513 or other type of computer 510. For example, a GUI will typically include virtual menus, buttons, editable fields, drop-down list boxes, etc. that users 80 can use to interact with the computer 510. The system 100 is not limited to the use of GUIs or other typical computer-oriented interfaces as user interfaces 528.
[00113] 8. Network Adapter
[00114] Any device by which a computer 510 communicates to a network 540. Network adapters 529 can operate wirelessly or be hardwired.
[00115] a. Network
[00116] A telecommunications network that allows computers 510 to communicate with each other and exchange data 560. A wide variety of networks 540 can be incorporated into the system 100. Examples of network configurations include but are not limited to bus networks, star networks, ring networks, mesh networks, and tree networks. Different networks 540 can utilize different communication protocols and operate in different geographic scales.
[00117] i. Local Network
[00118] A network 540 that is limited in scope to a particular geographic location such as a home or office. Common examples of local networks 542 include personal area networks, local area networks, home area networks, and storage area networks.
[00119] ii. External Network
[00120] A network 540 that is not a local network 542. Common examples of external networks 544 include the Internet, wide area networks, metropolitan area networks, and enterprise networks.
[00121] B. Examples of Different Types of Computers
[00122] As discussed above, computers 510 can possess a wide variety of different components. Computers 510 can also be embodied in a variety of different categories of computers 510. Figure 4b is a block diagram illustrating an example of different types of computers 510.
[00123] 1. Embedded Computer
[00124] A computer 510 that is embedded into a device, apparatus, or assembly. An embedded computer 511 is not typically a general purpose computer 514.
[00125] 2. Server
[00126] A computer 510 that primarily exists for the purpose of making functionality available to other computers 510, such as client devices 513.
[00127] 3. Client Device
[00128] A computer 510 that functions at least in part by accessing at least some data 560 or an application 524 on a server 512. Many examples of client devices 513 are general purpose computers 514. Some examples of client devices 513 are mobile computers 515.
[00129] 4. General Purpose Computer
[00130] A computer 510 that provides a user 80 with substantial flexibility in configuring the computer 510. For example, one user 80 may use their desktop computer primarily to play games online while another user 80 may use that same model of desktop computer to perform legal research. Examples of general purpose computers 514 include desktop computers, laptop computers, smart phones, tablet computers, notepad computers, mainframe computers, minicomputers, work stations, and other devices that allow users 80 to decide what is run on the computer 510. Some general purpose computers 514 are mobile computers 515.
[00131] 5. Mobile Computer
[00132] A computer 510 that is easily moved from one location to another, such as a smart phone, a tablet, or even a notepad or laptop computer.
[00133] C. Examples of System Components that can include Computers
[00134] Figure 4c is a block diagram illustrating an example of different components of the system 100 that can include computers 510. Virtually any component of the system 100 can include a computer 510. The server 512 is a type of computer 510. The media player 500 can include one or more computers 510. The apparatus 200 can include a computer 510, and the sensor 300 of the apparatus 200 can include its own separate computer 510. Alternative configurations of various computerized components of the system 100 are disclosed in Figures 4d and 4e.
[00135] D. Data
[00136] Figure 4f is a block diagram illustrating an example of different types of data 560 that can be stored on a database 526 used by the system 100.
[00137] Any form of information that is capable of being accessed and/or stored by a computer 510. Data 560 can represent facts, statistics, and other forms of information. Data 560 can include profiles 562, history 564, instructions 566, as well as social media 568.
[00138] 1. Profiles
[00139] A user 80 can interact with the system 100 and other users 80 of the system 100 through a profile 562. The profile 562 is associated with one or more users 80. A single user 80 can have more than one profile 562. To identify examples of profiles 562 one only has to look at popular websites such as Amazon.com, Facebook.com, eBay.com, Google.com, etc. to understand the wide diversity of profiles 562 that can exist. Profiles 562 can include virtually any information pertaining to the user 80, including the preferences of the user 80 and the totality of a user's history with the system 100. The more data 560 that exists for a particular user 80 and that can be associated with a particular profile 562, the more it is possible for the heuristics 525 of the system 100 to make assessments of what a user 80 wants.
[00140] 2. History
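A profile 562 that aggregates the other data types named above (history 564, instructions 566, and social media 568) can be sketched as a small data structure. The field and method names are illustrative assumptions, not taken from the specification:

```python
from dataclasses import dataclass, field

@dataclass
class Profile:
    user_id: str
    history: list = field(default_factory=list)       # past media experiences
    instructions: list = field(default_factory=list)  # user-triggered actions
    social_media: dict = field(default_factory=dict)  # linked accounts

    def record_experience(self, media_id: str, response: dict):
        """Append one experience to the history; the more history that
        accumulates, the better the heuristics can assess what the
        user wants."""
        self.history.append({"media": media_id, "response": response})

profile = Profile("user-80")
profile.record_experience("film-123", {"heart_rate": 72, "gaze": "screen-left"})
```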
[00141] 3. Instructions
[00142] In addition to collecting and analyzing data 560, the system 100 can also be used to affirmatively trigger instructions 566. For example, the system 100 can be used to initiate automated or semi-automated activities.
[00143] 4. Social Media
[00144] A collection of websites and other online means of communication and interaction. Social media 568 are a substantial avenue for creating and sharing data 560. The system 100 can be integrated to work with social media 568.
[00145] 5. Media Experiences
[00146] A media experience 600 is any type of media content that can be accessed or captured using the apparatus 200.
[00147] 6. Sensor Feedback
[00148] A sensor feedback 310 is any type of data 560 captured from a sensor 300.
[00149] IV. OVERVIEW OF EYE TRACKING AND USER TRACKING
[00150] The system 100 can utilize sensors to capture information about a user experiencing media as the user is experiencing the media. Such information can be stored, analyzed, updated, and otherwise used to create a profile for a specific user that embodies the interests, tastes, preferences, and history of the user. Such a profile can be used in a wide variety of different ways, including but not limited to the modification of a media experience being delivered to that user.
[00151] A. A wide range of sensor metrics can be captured and stored
[00152] A wide variety of different sensors, sensor configurations, and sensor measurements can potentially be incorporated into the processing of the system. Such sensors can be part of a media access or playing device such as a wearable near-eye display. However, the system can also utilize physically separate sensors that are in communication with each other and the device through which the user is accessing the media.
[00153] What the user is looking at (i.e. the focus of the eyes), whether or not the user is nodding his or her head to the beat of the music, the heart rate of the user, blood pressure, skin temperature, skin conductance, and other indicia of emotional and physiological responses (collectively, "sensor metrics" or "sensor feedback") to the media experience can be captured and stored for the purposes of building a profile on the user. The ability to build a highly specific and nuanced profile or model of a user can provide significant benefits for users, suppliers of media experiences, and advertisers.
[00154] B. Profile-enabled functionality
[00155] Over time, the accuracy and utility of a profile will increase. Well developed and frequently updated profiles can be sources of actionable data that impacts the delivery of media experiences to users.
[00156] Consumers of media experiences, organizations in the business of selling or distributing media experiences, and advertisers alike benefit when the customer gets more of what they want, and when the customer has to spend less time looking for what they want.
[00157] 1. Media navigation
[00158] The universe of available media experiences is very large, and growing every minute. Many consumers of media content are simply unable to find what they are looking for, or worse yet, do not realize that there are available media experiences out there that they should be looking for.
[00159] Highly actionable profiles can assist consumers in finding what would be of interest to them. Such profiles can also assist media content providers in driving certain types of media experience opportunities to the subset of users most likely to be interested in that content.
[00160] Profiles can thus use both push and pull methodologies to create more opportunities for mutually beneficial deliveries of media experiences to the consumers who want them, without inundating consumers with unwanted marketing messages and advertisements.
[00161] A media navigation heuristic can include both manual/explicit decisions as well as automated processing assessments as inputs.
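A media navigation heuristic that accepts both manual/explicit decisions and automated processing assessments as inputs, as described above, can be sketched as a weighted ranking. The weighting scheme, the normalization of both inputs to [0, 1], and the function names are illustrative assumptions:

```python
def navigation_score(explicit_rating, inferred_score, explicit_weight=0.7):
    """Blend a manual/explicit decision (e.g. a star rating) with an
    automated assessment (e.g. a score inferred from sensor feedback).
    An explicit choice by the user outweighs the automated assessment."""
    return explicit_weight * explicit_rating + (1 - explicit_weight) * inferred_score

def recommend(candidates, top_n=2):
    """candidates: {media_id: (explicit_rating, inferred_score)}.
    Return the top-ranked media experiences."""
    ranked = sorted(candidates,
                    key=lambda m: navigation_score(*candidates[m]),
                    reverse=True)
    return ranked[:top_n]
```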
[00162] 2. Media modification
[00163] Data-rich profiles with actionable user data can be potentially used to do far more than merely direct media experience opportunities to their likely consumers. Such profiles can also be used to statically or even dynamically modify the media experiences themselves.
[00164] For example, advertisements identified to be of interest to the specific user can be delivered instead of generic advertising. With sufficient information about a user, the system could virtually ensure that only advertisements found interesting to the user would be delivered to the user. Two viewers of the same television program could be subjected to radically different advertisements based on the profiles associated with the users. Not only could different users be
subject to completely different ads, but the ads for the same product by the same advertiser could have variations tailored to the specific tastes, preferences, and histories of the user.
[00165] Similarly, only movie previews for movies that (1) have not yet been seen by the user but (2) are highly likely to be of interest to the user can be delivered to the user in lieu of predefined movie previews. In this operating mode, what one user experiences can be totally different than what another user experiences, even if they both select the same media content. By way of example, advertising is an area that is ripe for fully dynamic content that targets the specific user.
[00166] The modification of the media experience delivered to a specific user is based on the profile associated with that specific user. The user profile is created, developed, and maintained by the capturing and updating of various sensor readings that pertain to the physiological and emotional responses of the user. Other marketing-type data pertaining to the user, such as where they live, what magazines they subscribe to, occupation, etc. may also be included in the universe of data that is used to comprise the profile for the user.
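Profile-driven advertisement substitution, in which two viewers of the same program receive different ads, can be sketched as a tag-overlap selection with a generic fallback. The interest tags and ad inventory below are invented for the example:

```python
# Hypothetical ad inventory mapping each ad to its interest tags.
AD_INVENTORY = {
    "sports_car_ad": {"cars", "racing"},
    "cooking_show_ad": {"food", "cooking"},
    "generic_ad": set(),  # fallback when nothing matches
}

def select_ad(profile_interests):
    """Pick the ad whose tags overlap most with the viewer's profile
    interests, falling back to generic advertising."""
    best, best_overlap = "generic_ad", 0
    for ad, tags in AD_INVENTORY.items():
        overlap = len(tags & profile_interests)
        if overlap > best_overlap:
            best, best_overlap = ad, overlap
    return best
```

A viewer whose profile shows an interest in cars thus receives the sports car ad, while a viewer with no matching interests receives the generic ad — the two viewers of the same program see different advertisements.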
[00167] The modification of the media experience delivered to a user in response to the tastes, preferences, and history associated with the user is distinguishable from navigating a 3-D space through the use of a head tracker that modifies a point of view. In other words, the modifications brought about by the system are user-centric, and not based on factors outside the user such as the architecture or layout of simulated space of virtual reality with a point of view that corresponds to head movements within that space.
[00168] V. Introduction of Elements - for use in Eye/User Tracking
[00169] Figure 5 a is a block diagram illustrating an example of a sensor feedback (a sensor feedback attribute 310) being captured by a head-mounted apparatus 200 and then used by one or more heuristics 525 to impact the media experience 600 sent to the user 80 of the head-mounted apparatus 200. Figure la illustrates a processing loop between making a media experience 600 available to a user 80 wearing the head-mounted apparatus 200, and capturing sensor feedback during the time that the user 80 was engaging the media experience 600. The sensor feedback 310 is subjected to a processing heuristic 525 that is used to update a profile 562 associated with the user 80. The updated profile 562 can then be used to selectively influence the delivery of the media experiences 600 to the user 80, which can in turn result in additional sensor feedback 310. As the cycle progresses and reiteratively repeats, the profile 562 of the user 80 can be substantially and comprehensively developed to increase the predictive efficacy of the profile 562 in influencing
the media experiences 600 delivered to the user 80. This can ultimately result in a highly tailored experience for the user 80 in which only advertisements of interest to the user 80 are shown to the user 80 and in which the system 100 can make highly effective recommendations for media experiences 600 such as films, television programs, video games, music, video clips, and other types of media experiences.
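The reiterative loop described above can be sketched in a few lines. This is a minimal illustration, not an implementation specified anywhere in the document: the `Profile` class, the 0.8/0.2 smoothing weights, and the 0.5 recommendation threshold are all invented for the example.

```python
from dataclasses import dataclass, field

@dataclass
class Profile:
    # Engagement score per genre, refined a little more on every cycle
    genre_affinity: dict[str, float] = field(default_factory=dict)

def refine_profile(profile: Profile, genre: str, engagement: float) -> None:
    """Heuristic step: blend new sensor-derived engagement into the stored score."""
    prior = profile.genre_affinity.get(genre, 0.5)
    profile.genre_affinity[genre] = 0.8 * prior + 0.2 * engagement

def recommend(profile: Profile, catalog: list[tuple[str, str]]) -> list[str]:
    """Offer only titles whose genre the profile scores at or above a threshold."""
    return [title for title, genre in catalog
            if profile.genre_affinity.get(genre, 0.5) >= 0.5]

profile = Profile()
# Three cycles of sensor feedback: the user engages with sci-fi, ignores ads
for engagement in (0.9, 0.8, 0.95):
    refine_profile(profile, "sci-fi", engagement)
refine_profile(profile, "advertisement", 0.1)

catalog = [("Star Voyage", "sci-fi"), ("Buy Now!", "advertisement")]
print(recommend(profile, catalog))  # the ad falls below the threshold
```

Each pass through the loop nudges the stored score toward the latest observed engagement, which is one simple way the "predictive efficacy" of a profile could grow as the cycle repeats.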
[00170] A. IT Configurations/Architectures
[00171] Although the system 100 can be embodied in a stand-alone device such as a head-mounted display apparatus 210 or even in merely a head-mounted apparatus 200, many embodiments of the system 100 will incorporate varying degrees of distributed data processing across various devices. In still other embodiments of the system 100, the media player 500 may be a conventional television set, with the only thing worn by the user 80 being one or more sensors 300.
[00172] The system 100 can be implemented in a wide variety of different information technology configurations. For example, Figure 5a illustrates an example where the head-mounted apparatus 200 connects to a local media player 500 over a local network 542 and the local media player 500 communicates with a remote server 512 over an external network 544. In the illustrated configuration, the remote server 512 houses the heuristics 525 and profiles 562.
[00173] In the example illustrated in Figure 5b, there is no remote server 512 or external network 544. All processing is performed over a local network 542, and the modification of media content occurs through heuristics 525 and profiles 562 stored in a local media player 500, such as a television set, DVD player, or general purpose computer.
[00174] In the example illustrated in Figure 1c, there is no local media player 500 and no local network 542. As in Figure 5a, the heuristics 525 and profiles 562 are housed in a remote server 512.
[00175] An alternative configuration to the examples described above would be a head-mounted apparatus 200 or some other form of near-head display device functioning as a stand-alone device, in which case the apparatus 200 would house the profiles 562 and heuristics 525 of the system 100.
[00176] B. Users
[00177] A user 80 of the system 100 is typically a human being, although it is possible to imagine instances where the user 80 could be another species of animal or even a form of robot or artificial
intelligence, expert system, or other similar form of automated intelligence. The system 100 captures sensor feedback 310 (which can also be referred to as sensor metrics 310) from specific users 80 in creating, developing, and updating one or more profiles 562 for that specific user 80.
[00178] C. Media Experience
[00179] A media experience 600 is typically media content stored in some format cognizable to the system 100 and in some device or component that is accessible to the system 100. In many if not most instances, the media experience 600 is something originating from outside the user 80 and outside the head-mounted apparatus 200. Examples of media experiences 600 can include but are not limited to: films, television programs, songs, video games, music videos, computer programs, virtual reality applications, books on tape, e-books, e-magazines, and other forms of media interactions.
[00180] The media experiences 600 of the system 100 can be modified by the system 100, tailoring the media experiences 600 to a specific user 80 based on the profile 562 associated with the specific user 80 and the various heuristics 525 used to develop a profile 562 for the user based on sensor feedback 310 and other data that the system 100 is cognizant of.
[00181] It is envisioned that the ability of the system 100 to modify the media experience 600 delivered to individual users 80 on the basis of user-specific profiles 562 will impact the way that media providers create and distribute content. For example, a movie could be written to include a library of varying characters, storylines, and endings, with the user-specific profile 562 being used to modify core elements and characters of the story. A film rated R in its general theatrical release could be automatically modified to a PG-13 rating so that the movie would be suitable for viewing by a young teenager.
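One hypothetical way such a scene library could be assembled per user is sketched below. The scene names, the age cutoff, and the rating rule are all invented for the illustration; the document only describes the general idea of selecting variants from a library based on the profile.

```python
# A film shipped with alternate variants per scene; the profile selects which
# variants play so the assembled cut matches the viewer.
SCENES = {
    "opening": {"R": "opening_graphic", "PG-13": "opening_toned_down"},
    "finale":  {"R": "finale_r",        "PG-13": "finale_pg13"},
}

def assemble_cut(profile: dict) -> list[str]:
    """Pick a per-scene variant so the whole cut matches the viewer's rating."""
    # Assumed rule: viewers under 17 always get the PG-13 variants
    rating = "PG-13" if profile.get("age", 18) < 17 else profile.get("preferred_rating", "R")
    return [variants[rating] for variants in SCENES.values()]

print(assemble_cut({"age": 14}))                           # PG-13 variants
print(assemble_cut({"age": 35, "preferred_rating": "R"}))  # theatrical cut
```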
[00182] The modified media experience 600 that can be made available by the system 100 is not to be confused with prior art virtual reality technology in which the movement of the head triggers a change in the point of view of the user. Such technologies modify the view of the user 80 on the basis of the layout or floor plan of the virtual reality space, and not the emotional or physiological responses of the user 80.
[00183] D. Sensors and Captured Data
[00184] The benefits of an actionable user profile 562 are dependent upon the meaningful collection of data pertaining to the user 80. As illustrated in Figure 1d, such data is captured by one or more sensors 300. The data captured by a sensor 300 can be referred to as sensor feedback 310, sensor readings 310, or sensor metrics 310. As illustrated in Figure 1g, a wide variety of different sensors 300 can be incorporated into the functionality of the system 100. A single embodiment of the system 100 can incorporate one or more different sensors 300 of one or more different types. Some sensors 300 may be physically embedded in a head-mounted display apparatus 210 (a head-mounted media access device that includes the capability of delivering a video component, unlike a standard headphone) as displayed in Figure 1d, but other sensors may be physically separate from such an apparatus 200. Figure 1g provides examples of different subsets of sensors 300 that are not mutually exclusive (for example, an authentication-based sensor 309 will often be an image-based sensor 301) and are not exhaustive (i.e. other types of sensors 300 can be used).
[00185] The apparatus 200, such as a display apparatus 210, used to "play" or otherwise make the media experience 600 accessible to the user 80 provides an excellent opportunity to position certain types of sensors 300, particularly sensors 300 that capture measurements relating to the eye(s) of the user 80. One prominent category of feedback from the sensors (i.e. sensor feedback 310) is eyelid timing, retina tracking, and other near-eye sensor metrics. The individualized nature of the head-mounted apparatus 200, the possibility of definitively establishing the identity of the user 80 based on biometric sensor readings (such as a retina scan or an iris scan), and the ability to closely monitor the eye activity of a user 80 while the user 80 is actively involved in a media experience 600 provide a valuable opportunity to model the behavior of the user 80 and to have that model embodied in a profile 562 that is accessible to the system 100. Conventional data from surveys filled out by the user 80 and other data relating to the user 80 can be used to further augment the profile 562 of the user 80.
[00186] Sensor feedback 310 can also include biometric information used to definitively identify the user 80 to ensure that the correct profile 562 is being updated. Examples of biometric sensor feedback 310 can include but are not limited to iris scans and retina scans.
[00187] 1. Sensor types based on phenomenon being sensed
[00188] One way to delineate between different types of sensors 300 that can be incorporated into the system 100 is to distinguish the sensors 300 based on the phenomenon being sensed. For example, an image-based sensor 301, which would include still frame cameras, video cameras, retina scanners, etc., operates through the collection of images. The sensors 300 used by the system 100 can include a temperature-based sensor 302, a motion-based sensor 303, a force-based sensor 304, a conductance-based sensor 305, and a sound-based sensor 306.
[00189] 2. Sensor types based on assessment generated
[00190] Another way to delineate between different types of sensors 300 that can be incorporated into the system 100 is to distinguish the sensor 300 based on what is being assessed. Such categories can include but are not limited to an attention-based sensor 307, which captures information about attentiveness and/or the focal point of attention; a physiological-based sensor 308, which is used to capture physiological data about the user 80; and an authentication-based sensor 309, which captures information that authenticates the identity of the user 80.
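The two taxonomies above (phenomenon sensed, elements 301-306; assessment generated, elements 307-309) can be modeled as orthogonal sets of tags. The enum names below are assumptions made for the sketch; the point it illustrates, taken from the text, is that the categories are not mutually exclusive, so a single sensor can carry tags from both axes.

```python
from enum import Flag, auto

class Phenomenon(Flag):
    IMAGE = auto()        # 301: still cameras, video cameras, retina scanners
    TEMPERATURE = auto()  # 302
    MOTION = auto()       # 303
    FORCE = auto()        # 304
    CONDUCTANCE = auto()  # 305
    SOUND = auto()        # 306

class Assessment(Flag):
    ATTENTION = auto()       # 307: attentiveness / focal point of attention
    PHYSIOLOGICAL = auto()   # 308: physiological data about the user
    AUTHENTICATION = auto()  # 309: authenticates the identity of the user

# A retina scanner is an image-based sensor that also authenticates:
retina_scanner_tags = Assessment.AUTHENTICATION | Assessment.PHYSIOLOGICAL
print(bool(retina_scanner_tags & Assessment.AUTHENTICATION))  # True
print(bool(retina_scanner_tags & Assessment.ATTENTION))       # False
```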
[00191] It is believed that information pertaining to the timing of eye movements, to the focus of the attention of the user 80, and other metrics relating to the consumption of the media experience 600 by the user 80 can be particularly effective in building, developing, and updating a profile 562 that can be used to proactively address the wants and needs of the user 80. Figure 1d is a diagram illustrating an example of a sensor 300 embedded in the apparatus 200. The sensor 300 captures sensor feedback 310 for use by the system 100, and the system 100 supplies media experiences 600 to the apparatus 200 that can be selectively influenced by the sensor feedback 310.
[00192] E. Profile
[00193] Returning to Figures 1a-1c, a profile 562 is an aggregate representation of the user 80 within the system 100. A user 80 can interact with the system 100 and potentially other users 80 of the system 100 through a profile 562. A profile 562 is associated with one or more users 80. A single user 80 can have more than one profile 562. To identify examples of profiles 562, one only has to look at popular websites such as Amazon.com, Facebook.com, eBay.com, Google.com, etc. to understand the wide diversity of profiles 562 that can exist. Profiles 562 can include virtually any information pertaining to the user 80, including the preferences of the user 80 and the totality of a user's history with the system 100. The more data 560 that exists for a particular user 80 and that can be associated with a particular profile 562, the more it is possible for the heuristics 525 of the system 100 to make assessments on what a user 80 wants to have happen without having to ask the user 80.
[00194] The ability of the system 100 to track and monitor in a detailed and comprehensive way what the user 80 is looking at while undergoing a media experience 600 provides the system 100 with unique detail in constructing, developing, and maintaining profiles 562 for users 80.
[00195] F. Heuristics
[00196] A heuristic 525 is a problem solving methodology, an analytical process, or similar approach that can be implemented by a processor 521 of a computer 510 that is running an application 524. Heuristics 525 can include algorithms but are not limited to algorithms. The system 100 can include a wide variety of heuristics 525 pertaining to the tracking of eye movements, the timing of eye movements, other eye-related data, and other data pertaining to the user 80 in evaluating the interests, desires, and preferences of users 80.
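A heuristic of the kind described above, operating on eye-timing data, might look like the following minimal sketch. The thresholds (a six-second blink interval reading as alert, a two-second fixation reading as engaged) and the equal weighting are invented assumptions, not values taken from the document.

```python
def attention_score(blink_intervals: list[float], dwell_times: list[float]) -> float:
    """Turn raw eye-timing feedback (seconds) into a single score in [0, 1].

    Long gaze dwells and infrequent blinks are read as attentive.
    """
    if not blink_intervals or not dwell_times:
        return 0.0
    avg_blink = sum(blink_intervals) / len(blink_intervals)
    avg_dwell = sum(dwell_times) / len(dwell_times)
    blink_component = min(avg_blink / 6.0, 1.0)  # ~6 s between blinks = alert
    dwell_component = min(avg_dwell / 2.0, 1.0)  # ~2 s fixations = engaged
    return round(0.5 * blink_component + 0.5 * dwell_component, 3)

print(attention_score([5.0, 7.0], [1.5, 2.5]))  # engaged viewer: high score
print(attention_score([1.0, 1.2], [0.2, 0.3]))  # restless viewer: low score
```

The output of a heuristic like this could then be folded into the profile 562, for instance as an engagement reading for whatever media experience 600 was playing at the time.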
[00197] G. Head-Mounted Display Apparatus
[00198] A head-mounted apparatus 200 is potentially any device capable of being worn on the head of the user 80 that is capable of delivering media experiences 600 to the user 80. The apparatus 200 used by the system 100 will typically include video and/or audio capabilities. An apparatus 200 with the ability to convey a visual component can also be referred to as a display apparatus 210 or a head-mounted display apparatus 210. In addition to having some type of screen, direct projection of images onto the retinas of the user, or other similar display technology, coupled with speakers or some other form of audio technology, the display apparatus 210 can also include a sensor 300 for capturing sensor readings (i.e. sensor feedback 310) that pertain to information about the user 80.
[00199] H. Media Player
[00200] In some embodiments of the system 100, the apparatus 200 is the media player 500. In other embodiments, the media player 500 is a separate device such as a TV, DVD player, computer 510, cable box, or similar device.
[00201] I. Server
[00202] Many embodiments of the system 100 and apparatus 200 will involve one or more servers 512. Servers 512 are computers 510 that can enable a wide variety of client devices to participate in the system 100. One way to make the system 100 available to multiple users 80 and to provide users 80 with a high degree of portability in terms of their profiles 562 is to house much of the data 560 and computer 510 components necessary for the functionality of the system 100 on remote servers 512 that can be accessed by multiple users 80 simultaneously, regardless of the locations of those users 80 at the time.
[00203] J. Operating Mode
[00204] Some embodiments of the system 100 can operate in a variety of different operating modes 650. Different embodiments of the system 100 can include different operating modes 650 and different abilities of users 80 to select or otherwise impact the operating mode 650 that applies to a particular media experience 600. The universe of potential operating modes 650 that can be offered by the system 100 can include a "record only" mode 660, a "modify only" mode 680, a "record and modify" mode 670, and an "off" mode 690.
[00205] 1. Record only
[00206] In this operating mode 660, the system 100 captures information about the user 80, but such information is not used to impact the media experience 600 being delivered to the user 80 at that time. In a "record only" operating mode 660, the user 80 is exposed to the same media experience 600 as other users 80, even though such other users 80 may possess vastly different profiles 562 in terms of tastes, preferences, histories, etc. This operating mode 660 supports the building and updating of a profile 562 associated with the user 80, but the profile 562 is not used to impact the delivery of the media experience 600 currently being delivered to the user 80.
[00207] Some embodiments of the system 100 can distinguish between a "full record" mode 662, where all possible data is collected and stored, and a "partial record" mode 664, where certain data is excluded from the process of storing sensor readings 310 captured by the sensor(s) 300.
[00208] 2. Modify only
[00209] In this operating mode 680, what one user experiences can be totally different from what another user experiences, even if they both select the same media content 600. In a "modify only" mode 680, no data from the user experience is stored by the system 100. Users 80 may choose a "modify only" mode 680 when trying something new, if they are not feeling well, or to protect their privacy with respect to a particular matter.
[00210] Some embodiments of the system 100 may differentiate between different magnitudes of content modification, including a "full modify" mode 682 as well as any number of "partial modify" modes 684 that can range from 1% of the modification capabilities to 99%.
[00211 ] 3. Record and Modify
[00212] Many multi-mode systems 100 will include a "record and modify" mode 670. As illustrated in Figure 1f, there are several variations of the record & modify mode 670 that can be implemented by the system 100, including a "full record/full modify" mode 672, a "full record/partial modify" mode 676, a "partial record/partial modify" mode 674, and a "partial record/full modify" mode 678.
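The operating modes described above amount to two independent axes: whether sensor readings 310 are stored (record), and whether the profile 562 influences the media experience 600 (modify). A minimal sketch, with class names, the `sensitive` flag, and the dispatch logic all invented for the illustration:

```python
from enum import Enum

class Record(Enum):
    OFF = 0       # nothing stored
    PARTIAL = 664 # certain readings excluded from storage
    FULL = 662    # all readings stored

class Modify(Enum):
    OFF = 0       # playback unchanged
    PARTIAL = 684 # some personalization applied
    FULL = 682    # full personalization applied

def handle_feedback(record: Record, modify: Modify, reading: dict) -> tuple[bool, bool]:
    """Return (stored, used_to_modify) for one sensor reading."""
    sensitive = reading.get("sensitive", False)
    stored = record is Record.FULL or (record is Record.PARTIAL and not sensitive)
    used = modify is not Modify.OFF
    return stored, used

# "Record only" mode 660: readings are stored, playback is unchanged.
print(handle_feedback(Record.FULL, Modify.OFF, {"metric": "dwell"}))      # (True, False)
# "Modify only" mode 680: playback is personalized, nothing is stored.
print(handle_feedback(Record.OFF, Modify.FULL, {"metric": "dwell"}))      # (False, True)
# "Partial record" 664 excludes readings flagged sensitive.
print(handle_feedback(Record.PARTIAL, Modify.FULL, {"sensitive": True}))  # (False, True)
```

Combining the two axes reproduces the modes of Figure 1f: full record/full modify 672, partial record/partial modify 674, and so on.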
[00213] The greater the ability of the system 100 to capture and store sensor readings 310, the greater the capacity of the system 100 to create and update actionable profiles 562 that represent accurate models of the tastes, preferences, and history of users 80. However, there can be circumstances when additional data may contribute more noise than signal. Moreover, there can be situations where the privacy concerns of the user 80 may trump the efficacy goals of the system 100.
[00214] Analogous concerns may exist on the modify side of the system 100 as well. Fully dynamic media experiences 600 such as ads can be highly desirable. Two viewers of the same television program could be subjected to radically different advertisements based on the profiles 562 associated with those users. Not only could different users be subject to completely different ads, but the ads for the same product by the same advertiser could have variations tailored to the specific tastes, preferences, and histories of the user. In other contexts, the user 80 may want to limit the customization of their media experience 600, or the content provider may provide fewer options for customization.
[00215] The modification of the media experience delivered to a user 80 in response to a profile 562 or model associated with the user is distinguishable from navigating a 3-D space through the use of a head tracker that modifies a point of view. The modifications by the system 100 are based on emotional and physiological responses of the user 80 that are embodied in a profile 562 or model of that user 80, i.e. such modifications are based on the tastes, preferences, and history of the user. The processing performed in the navigation of three-dimensional space is instead driven by the geometry of the simulated space and the position of the viewer within it.
[00216] VI. PROCESS-FLOW VIEW
[00217] Use of the system 100 and of the head-mounted apparatus 200 can be described in terms of a method comprising process steps.
[00218] At 700, sensor feedback 310 is captured by the sensor 300 in the apparatus 200.
[00219] At 702, one or more profiles 562 corresponding to the user 80 are updated based on the sensor feedback 310 pertaining to that user 80.
[00220] At 704, the media experience 600 for that user 80 can be selectively and dynamically modified based on the updated profile 562 corresponding to the user 80.
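The three steps 700/702/704 can be wired together as one loop. Only the step ordering comes from the text; the function bodies, the `gaze_dwell` metric, and the dwell threshold are placeholder assumptions for the sketch.

```python
def capture_sensor_feedback(sensor_state: dict) -> dict:
    """Step 700: capture sensor feedback 310 from the sensor 300 in the apparatus 200."""
    return {"gaze_dwell": sensor_state.get("gaze_dwell", 0.0)}

def update_profile(profile: dict, feedback: dict) -> dict:
    """Step 702: update the profile 562 corresponding to the user 80."""
    profile["total_dwell"] = profile.get("total_dwell", 0.0) + feedback["gaze_dwell"]
    return profile

def modify_media_experience(profile: dict) -> str:
    """Step 704: selectively and dynamically modify the media experience 600."""
    return "extended-cut" if profile["total_dwell"] > 3.0 else "standard-cut"

profile: dict = {}
for state in ({"gaze_dwell": 1.5}, {"gaze_dwell": 2.0}):
    feedback = capture_sensor_feedback(state)      # 700
    profile = update_profile(profile, feedback)    # 702
    experience = modify_media_experience(profile)  # 704
print(experience)  # accumulated dwell 3.5 > 3.0, so "extended-cut"
```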
[00222] VII. ALTERNATIVE EMBODIMENTS
[00223] Many features and inventive aspects of the system 100 and apparatus 200 are illustrated in the various drawings and/or discussed above or below. No patent application can expressly disclose in words or in drawings, all of the potential embodiments of an invention. In accordance with the provisions of the patent statutes, the system, apparatus, and method are explained and illustrated in certain preferred embodiments. However, it must be understood that the system 100, apparatus 200, and method may be practiced otherwise than is specifically explained and illustrated without departing from its spirit or scope.
[00224] The description of the system 100, apparatus 200, and method provided above and below should be understood to include all novel and non-obvious alternative combinations of the elements described herein, and claims may be presented in this or a later application to any novel non-obvious combination of these elements. Moreover, the foregoing embodiments are illustrative, and no single feature or element is essential to all possible combinations that may be claimed in this or a later application.
[00225] VIII. INDEX
[00226] Table 1 below provides a chart of element numbers, element names, and element descriptions.
100 System: An aggregate configuration of IT components 500 and data 600 that includes a head-mounted apparatus 200 for interfacing with a user 80. The system 100 can collectively act as an interface between the user 80 wearing a head-mounted apparatus 200 and an IT environment 400. The system 100 can include an authentication subsystem 110, an eye tracking subsystem 120, a user tracking subsystem 130, a customized media subsystem 140, and/or potentially other subsystems.
110 Authentication Subsystem: A subset of the system 100 that can authenticate the identity of a user 80. The authentication subsystem 110 can use one or more sensor readings 310 captured from one or more sensors 300 to authenticate the identity of the user 80. This typically involves linking the user 80 to a profile 620.

120 Eye Tracking Subsystem: A subset of the system 100 that can track the eye 82 and eye attributes 85 of the user 80 wearing the head-mounted apparatus 200.

130 User Tracking Subsystem: A subset of the system 100 that can capture information about the user 80. The user tracking subsystem 130 can include the eye tracking subsystem 120.

140 Customized Media Subsystem or Media Delivery Subsystem: A subset of the system 100 that can dynamically select and/or modify the media experience 610 provided to the user 80 through the head-mounted apparatus 200.

150 POV Sharing Subsystem: A subset of the system 100 that can provide for sharing a media experience 610 with other users 80.

160 Navigable Space Subsystem: A subset of the system 100 that can provide for creating a virtual space that can be navigated by one or more users 80.

200 Apparatus or Head-Mounted Apparatus: A device capable of being worn on the head 86 of a user 80 that can deliver some type of media experience 610 to the user 80. In many embodiments of the apparatus 200, the device also includes one or more sensors 300 that capture sensor readings 310 for the system 100. Different embodiments of the apparatus 200 can provide for delivering media experiences 610 utilizing the different senses of the user 80, such as visual content 611, sound content 613, haptic content 614, olfactory content 615, and/or taste content 616.

210 Display Apparatus or Head-Mounted Display Apparatus: A head-mounted apparatus 200 that includes a display component 220. Conventional headphones could not serve as a head-mounted display apparatus 210.

220 Display Component: A component of a head-mounted display apparatus 210 that provides for delivery of the visual content 611. The display component 220 can be a display screen, a virtual retina display (see U.S. Patent Numbers 8,982,014 and 9,076,368), or any other display technology known in the prior art.

222 Speaker Component: A component of a head-mounted apparatus 200 that provides for the delivery of sound content 613.
300 Sensor: A device embedded in or otherwise in communication with the apparatus 200 or other system 100 components. There are a wide variety of different types and categories of sensors 300 that can be utilized by the system 100. Any sensor 300 used in the prior art can potentially be used or adapted for use in the system 100. Sensors 300 can be integrated into the head-mounted apparatus 200, other devices in the system 100, and/or free standing devices that are merely in communication with other components of the system 100. Sensors 300 of a wide variety of sensor types 302 can be utilized within the system 100.

302 Sensor Types: Sensors can be categorized in terms of the technology that they use (a camera 310 is based on capturing light; a microphone 316 is based on capturing sound) or the attribute of the user 80 that is captured.

310 Camera: A sensor 300 that captures an image. Cameras 310 can utilize a wide variety of different ranges of spectral light, including but not limited to the visual spectrum, infrared, and ultraviolet, and can also generate images from non-light based sources, such as an ultrasound.

312 Retina Scanner: A retina scanner 312 is typically a camera 310, and it is also an example of a biometric sensor 320. A retina scanner 312 captures an image of a human retina that can be used for the purposes of identification.
313 Fingerprint Scanner: A fingerprint scanner 313 is typically a camera 310, and it is also an example of a biometric sensor 320. A fingerprint scanner 313 captures an image of a human fingerprint that can be used for the purposes of identification.
314 Video Camera: A camera 310 that rapidly captures a sequence of still frame images that, when played rapidly in succession, convey to the viewer a sense of motion.

316 Microphone: A sensor 300 that captures sound. The sounds captured by a microphone 316 can include information pertaining to the source of the sound (distinguishing one spoken voice from another) as well as information conveyed explicitly through the sound itself (such as a spoken instruction or password).
318 Voice Print Scanner: A microphone 316 used to authenticate the identity of the speaker. A voice print scanner 318 is an example of a biometric sensor 320, although part of the validation can include a user ID or password component.

320 Biometric Sensor: A sensor 300 that measures and analyzes biological attributes of a human being that can be used to authenticate the identity of the human being. Biometric attributes can include but are not limited to DNA, fingerprints, eye retinas, eye irises, voice patterns, facial recognition, and hand measurements.
330 Motion Sensor: A sensor 300 that captures information pertaining to movement.

332 Gait Scanner or Walk Scanner: A motion sensor 330 that captures information about the walk (i.e. gait) of an individual.
334 Mouse Scanner: A motion sensor 330 that captures information about the mouse movements of an individual.

340 Behavior Sensor: A sensor 300 that captures information about the behavior of a human being. In many authentication contexts, sensors 300 will either fall into the category of behavior sensors 340 or biometric sensors 320.
342 Temperature Sensor: A sensor 300 that measures temperature.

344 Physiology Sensor: A sensor 300 that measures one or more physiological attributes of a human being, such as weight, heart rate, blood pressure, height, etc.

345 Heart Rate Sensor: A type of physiology sensor 344 that captures information about the frequency at which the heart beats.

346 Blood Pressure Sensor: A type of physiology sensor 344 that captures information about the blood pressure of the user 80.

348 Location Sensor or Position Sensor: A type of sensor 300 that captures information about location and/or position. GPS is an example of a location sensor 348.

350 Sensor Feedback or Sensor Reading or Sensor Metric: A sensor reading or metric captured by the sensor 300. Examples of potentially important types of sensor feedback 350 include biometric data such as iris scans and retina scans for identification purposes, as well as a variety of metrics relating to eye movement, timing, and media content.
400 IT Environment: A configuration of IT components 500 and data 600 that the user 80 interacts with in the context of a particular action by the user 80. By way of example, a user 80 may visit a particular website to make a purchase, in which case the IT components 500 and data 600 utilized to communicate between the user 80 and the seller make up the applicable IT environment 400. Whether the activity is performing an online banking transaction, making a purchase, communicating with friends, watching movies, playing video games, or working on a collaborative project, the applicable IT environment 400 is the configuration of IT components 500 and data 600 involved in that activity.
521 Processor: A microprocessor or central processing unit that provides for the running of computer programs/software applications 524.

522 Memory Component: A component accessible to the processor 521 where data 560 is directly accessible by the processor 521 but not permanently stored. The memory component 522 is often referred to as RAM, random access memory.
523 Storage Component: A component that provides for the permanent storage of data 560.

524 Application: An application 524 can also be referred to as a computer program 524 or a software application 524. An application 524 is a unit of computer code or a set of instructions that are capable of being implemented or run by the processor 521. A wide variety of different programming languages can be used to create the application(s) 524 used by the system 100. There are a voluminous number of different types of programming languages, including but not limited to: object-oriented languages, assembly languages, fourth-generation languages, visual languages, XML-based languages, and hardware description languages.
525 Heuristic: A problem solving methodology, an analytical process, or similar approach that can be implemented by a processor 521 running an application 524. Heuristics 525 can include algorithms but are not limited to algorithms.

526 Database: A comprehensive collection of data 560 that is organized for convenient access. The system 100 can incorporate the capabilities of any type of database 526. Examples of databases 526 include but are not limited to: relational databases, object-oriented databases, navigational databases, and NoSQL databases.

528 User Interface: An interface between the user 80 and the system 100. The most prominent example of a user interface 528 is a graphical user interface (GUI) that functions as a virtual layer closest to the physical aspects of the client device 513 or other type of computer 510. For example, a GUI will typically include virtual menus, buttons, editable fields, drop down list boxes, etc. that users 80 can use to interact with the computer 510. The system 100 is not limited to the use of GUIs or other typical computer-oriented interfaces as user interfaces 528.
529 Network Adapter: Any device by which a computer 510 communicates with a network 540. Network adapters 529 can operate wirelessly or be hardwired.
540 Network: A telecommunications network that allows computers 510 to communicate with each other and exchange data 560. A wide variety of networks 540 can be incorporated into the system 100. Examples of network configurations include but are not limited to bus networks, star networks, ring networks, mesh networks, and tree networks. Different networks 540 can utilize different communication protocols and operate at different geographic scales.

542 Local Network: A network 540 that is limited in scope to a particular geographic location such as a home or office. Common examples of local networks 542 include personal area networks, local area networks, home area networks, and storage area networks.

550 Media Player: A device such as a disc player or cable box that provides for "playing" media.
600 Data: Any form of information that is capable of being accessed and/or stored by a computer 510. Data 560 can represent facts, statistics, and other forms of information. Data 560 can include profiles 562, history 564, instructions 566, as well as social media 568.
610 Media Experience or Media Content: Content that the user 80 interacts with at least partially through the head-mounted apparatus 200. A media experience 610 can engage one or more senses of the user 80, such as visual content 611 (including but not limited to video content 612), sound content 613, haptic content 614, olfactory content 615, and taste content 616.

611 Visual Content: An aspect of the media experience 610 that is experienced through the sense of sight.

612 Video Content: Visual content 611 that is in the form of video.

613 Sound Content: An aspect of the media experience 610 that is conveyed through the sense of sound.

614 Haptic Content: An aspect of the media experience 610 that is conveyed through the sense of touch. A common example of haptic content 614 is vibration.

615 Olfactory Content: An aspect of the media experience 610 that is conveyed through the sense of smell.

616 Taste Content: An aspect of the media experience 610 that is conveyed through the sense of taste.
620 Profile: An association between an identity of a user 80 and a set of data 600. A user 80 can interact with the system 100 and other users 80 of the system 100 through a profile 620. The profile 620 is associated with one or more users 80. A single user 80 can have more than one profile 620. To identify examples of profiles 620, one only has to look at popular websites such as Amazon.com, Facebook.com, eBay.com, Google.com, etc. to understand the wide diversity of profiles 620 that can exist. Profiles 620 can include virtually any information pertaining to the user 80, including the preferences of the user 80 and the totality of a user's history with the system 100. The more data 600 that exists for a particular user 80 and that can be associated with a particular profile 620, the more it is possible for the heuristics 525 of the system 100 to make assessments on what a user 80 wants to have happen without having to ask the user 80.

622 User History: A subset of data 600 that pertains to the history of a user 80 with an IT environment 400. The historical data 622 of a user 80 can be valuable information for the processing of the system 100. So can the historical data 622 for the aggregated community of users 80.

624 User ID: A unique identifier that corresponds to a specific user 80. A specific user 80 may have more than one ID 624, but an ID 624 pertains to only a specific user 80.
626 Group ID A unique identifier that corresponds to a specific group of users 80.
664 History A subset of data 600 that relates to previous interactions with the system 100.
666 Instructions In addition to collecting and analyzing data 600, the system 100 can also be used to affirmatively trigger instructions 666. For example, the system 100 can be used to initiate automated or semi-automated activities.
668 Social Media A collection of websites and other online means of communication and interaction. Social media 668 are a substantial avenue for creating and sharing data 600. The system 100 can be integrated to work with social media 668.
690 Authenticated ID An identification of a user 80 made with respect to a profile 620 with the applicable user-related data 600.
900 Method A process for interfacing between an IT environment 400 and a user 80.
910 Authentication Method A process for authenticating a user 80. This process typically involves linking a user 80 to a profile 620.
920 Eye Tracking/Monitoring Method A process for tracking the eye of a user 80.
930 User Tracking/Monitoring Method A process for tracking and monitoring a user 80.
940 Dynamic Media Playing Method A process for dynamically playing a media experience 610 to a user 80.
950 Operating Mode A status of operations of the system 100 that pertains to the degree to which the system 100 is capturing and/or storing sensor readings 310, and the degree to which the system 100 is modifying media experiences 610 on the basis of a profile 620 created and updated through the capture of such sensor readings 310.
960 Record-Only Mode An operating mode 950 that does not involve modifying the media experience 610 being delivered to the user 80. A record-only mode 960 can be a full record mode 662, where every possible sensor reading 310 is captured and stored, or a partial record mode 664, where certain sensor readings 310 are not captured and stored.
970 Record & Modify Mode An operating mode 950 that involves both the capturing of sensor readings 310 and the modification of the media experience 610 delivered to the user 80 on the basis of the profile 620 developed using past and/or present sensor readings 310. The record & modify mode 970 can include a full record/full modify mode 672, a partial record/partial modify mode 674, a full record/partial modify mode 676, and a partial record/full modify mode 678.
980 Modify-Only Mode An operating mode 950 that does not involve capturing sensor readings 310 from the user 80 while the user 80 engages in the media experience 610. The modify-only mode 980 can be a full modify mode 682 or a partial modify mode 684.
Table 1. Index of Elements.
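The three operating modes indexed above (record-only 960, record & modify 970, modify-only 980) can be sketched as a small data structure that gates whether sensor readings 310 are stored and whether the media experience 610 is adapted from the profile 620. This is an illustrative Python sketch only; it is not part of the disclosure, and all identifiers are hypothetical.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class OperatingMode:
    record: bool                   # capture/store sensor readings (310)
    modify: bool                   # adapt the media experience (610) from the profile (620)
    partial_record: bool = False   # partial record variant: skip certain readings
    partial_modify: bool = False   # partial modify variant: limit which aspects change

# The three top-level modes from the index (960, 970, 980).
RECORD_ONLY = OperatingMode(record=True, modify=False)
RECORD_AND_MODIFY = OperatingMode(record=True, modify=True)
MODIFY_ONLY = OperatingMode(record=False, modify=True)

def handle_reading(mode: OperatingMode, reading: dict, profile: dict) -> dict:
    """Apply one sensor reading under the given operating mode.

    Media modification is stubbed out as a flag on the profile; a real
    system would select or alter content here.
    """
    if mode.record and not mode.partial_record:
        profile.setdefault("history", []).append(reading)
    if mode.modify:
        profile["media_adapted"] = True
    return profile
```

A record-only pass accumulates history without touching the media experience, while a modify-only pass does the reverse, matching the mode definitions above.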
Claims
1. A system (100) that acts as an interface between a user (80) and an IT environment (400), said system (100) comprising:
a head-mounted display apparatus (220) worn by the user (80) to communicate a media experience (610) to said user (80);
a sensor (300) that provides for capturing a sensor reading (350) pertaining to the user (80) wearing said head-mounted display apparatus (220);
a server (512) in communication with said head-mounted display apparatus (220) and said sensor (300), said server (512) including:
a processor (521);
an application (524) that provides for running on said processor (521);
a database (526) storing a plurality of data (600), said plurality of data (600) including a profile (620) associated with the user (80);
wherein said application (524) selectively modifies said profile (620) associated with the user (80) wearing said head-mounted display apparatus (220) using said sensor reading (350).
2. The system (100) of claim 1, wherein said application (524) provides for identifying an authenticated identity (690) of the user (80) as an output by referencing said profile (620) and said sensor reading (350) as a plurality of inputs.
3. The system (100) of claim 2, wherein said authenticated identity (690) is associated with a confidence metric (663).
4. The system (100) of claim 3, said system (100) including a plurality of sensors (300) capturing a plurality of sensor readings (350), wherein said plurality of sensors (300) include a retina scanner (312) and a gait scanner (332).
5. The system (100) of claim 2, wherein said authenticated identity (690) is a continuous authenticated identity (691).
6. The system (100) of claim 2, wherein said authenticated identity (690) is a periodically authenticated identity (692).
7. The system (100) of claim 1, wherein said sensor reading (350) selectively and dynamically modifies said media experience (610) communicated to said head-mounted display apparatus (210).
8. The system (100) of claim 1, where said sensor reading (350) relates to a reaction of the user (80) to said media experience (610).
9. The system (100) of claim 8, wherein said profile (620) includes a subset of said data (600) that pertains to said reaction of said user (80) to a plurality of media experiences (610) experienced by said user (80).
10. The system (100) of claim 1, wherein said profile (620) associated with the user (80) selectively influences a selection of at least one said media experience (610) from a plurality of said media experiences (610).
11. The system (100) of claim 1, wherein said media experience (610) is stored on said server (512).
12. The system (100) of claim 1, wherein said sensor reading (350) is a point-of-interest within an image making up said media experience (610) being viewed by the user (80) through said head-mounted display apparatus (210).
13. The system (100) of claim 1, wherein said profile (620) is created using a plurality of sensors (300), said plurality of sensors (300) including a biometric sensor (320), a location sensor (348), and a behavior sensor (340).
14. A method (900) for interacting with an IT environment (400), said method (900) comprising:
creating a profile (620) pertaining to the user (80) wearing a head-mounted display apparatus (210) that is worn by the user (80) to engage in a media experience (610);
identifying an authenticated ID (690) for the user (80) through a plurality of sensor readings (350) used to capture user attributes (621);
assessing a plurality of preferences associated with the user (80) by tracking a plurality of eye attributes (85) pertaining to the user (80), wherein said plurality of preferences are used to modify said profile (620) associated with the user (80); and
selectively and dynamically modifying a plurality of media experiences (610) communicated to the user (80) through said head-mounted display apparatus (210).
15. The method (900) of claim 14, wherein a confidence metric (663) is associated with the authenticated ID (690) and said confidence metric (663) is communicated to said IT environment (400).
16. The method (900) of claim 14, wherein said authenticated ID (690) is a continuously authenticated ID (691) with a continuously re-calculated confidence metric (663).
17. A system (100) that acts as an interface between a user (80) wearing a head-mounted display apparatus (210) and an IT environment (400), said system (100) comprising:
an authentication subsystem (110) that provides for identifying an authenticated ID (690) using a sensor reading (350) captured by said head-mounted display apparatus (210);
an eye tracking subsystem (120) that provides for tracking the eye attributes (85) of the user (80) wearing said head-mounted display apparatus (210);
a user tracking subsystem (130) that provides for assessing information about the user (80) through the correlation between eye attributes (85) and a media experience (610) of the user (80); and
a customized media delivery subsystem (140) that provides for dynamically selecting and modifying said media experience (610) communicated to the user (80) through said head-mounted display apparatus (210).
18. The system (100) of claim 17, wherein said IT environment (400) provides for communicating a plurality of media experiences (610) to the user (80).
19. The system (100) of claim 17, wherein said authenticated ID (690) is associated with a confidence metric (663).
20. The system (100) of claim 17, wherein a plurality of profiles (620) pertaining to a plurality of users (80) are stored on said head-mounted display apparatus (210).
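The flow recited in claims 1-3 — a server-side application that updates a stored profile from sensor readings and derives an authenticated ID with an associated confidence metric — could be sketched as follows. This is an assumed illustration only: the cosine-similarity comparison, the threshold, and the blending factor are implementation choices not recited in the claims, and all function and field names are hypothetical.

```python
import math

def cosine_similarity(a, b):
    """Similarity between a stored biometric template and a fresh reading."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb) if na and nb else 0.0

def authenticate(profile, reading, threshold=0.9):
    """Derive an authenticated ID (claim 2) with a confidence metric (claim 3).

    Returns (user_id_or_None, confidence).
    """
    confidence = cosine_similarity(profile["template"], reading)
    user_id = profile["user_id"] if confidence >= threshold else None
    return user_id, confidence

def update_profile(profile, reading, alpha=0.1):
    """Selectively modify the profile using a sensor reading (claim 1)
    by blending the new reading into the stored template."""
    profile["template"] = [
        (1 - alpha) * t + alpha * r
        for t, r in zip(profile["template"], reading)
    ]
    return profile
```

A continuously authenticated ID (claims 5 and 16) would amount to calling `authenticate` on each new reading as it arrives, re-calculating the confidence metric every time rather than once per session.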
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201462041019P | 2014-08-22 | 2014-08-22 | |
US201462041008P | 2014-08-22 | 2014-08-22 | |
US62/041,019 | 2014-08-22 | ||
US62/041,008 | 2014-08-22 |
Publications (2)
Publication Number | Publication Date |
---|---|
WO2016029232A2 true WO2016029232A2 (en) | 2016-02-25 |
WO2016029232A3 WO2016029232A3 (en) | 2016-04-14 |
Family
ID=55351390
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/US2015/046617 WO2016029232A2 (en) | 2014-08-22 | 2015-08-24 | System, apparatus, and method for interfacing between a user and an it environment |
Country Status (1)
Country | Link |
---|---|
WO (1) | WO2016029232A2 (en) |
Family Cites Families (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8839358B2 (en) * | 2011-08-31 | 2014-09-16 | Microsoft Corporation | Progressive authentication |
US9092600B2 (en) * | 2012-11-05 | 2015-07-28 | Microsoft Technology Licensing, Llc | User authentication on augmented reality display device |
2015-08-24: WO PCT/US2015/046617 patent/WO2016029232A2/en, active Application Filing
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10241576B2 (en) | 2017-05-08 | 2019-03-26 | International Business Machines Corporation | Authenticating users and improving virtual reality experiences via ocular scans and pupillometry |
US10386923B2 (en) | 2017-05-08 | 2019-08-20 | International Business Machines Corporation | Authenticating users and improving virtual reality experiences via ocular scans and pupillometry |
US11042622B2 (en) | 2017-05-08 | 2021-06-22 | International Business Machines Corporation | Authenticating users and improving virtual reality experiences via ocular scans and pupillometry |
Also Published As
Publication number | Publication date |
---|---|
WO2016029232A3 (en) | 2016-04-14 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20230300420A1 (en) | Superimposing a viewer-chosen private ad on a tv celebrity triggering an automatic payment to the celebrity and the viewer | |
US9363546B2 (en) | Selection of advertisements via viewer feedback | |
US20160350801A1 (en) | Method for analysing comprehensive state of a subject | |
CN105339969B (en) | Linked advertisements | |
US11416918B2 (en) | Systems/methods for identifying products within audio-visual content and enabling seamless purchasing of such identified products by viewers/users of the audio-visual content | |
US8725567B2 (en) | Targeted advertising in brick-and-mortar establishments | |
EP3433707B1 (en) | Head mounted display system configured to exchange biometric information | |
KR102273750B1 (en) | Apparatus and method for processing a multimedia commerce service | |
US20140130076A1 (en) | System and Method of Media Content Selection Using Adaptive Recommendation Engine | |
US20080004951A1 (en) | Web-based targeted advertising in a brick-and-mortar retail establishment using online customer information | |
US20070276721A1 (en) | Computer implemented shopping system | |
KR20140045412A (en) | Video highlight identification based on environmental sensing | |
US20130126599A1 (en) | Systems and methods for capturing codes and delivering increasingly intelligent content in response thereto | |
US20140325540A1 (en) | Media synchronized advertising overlay | |
US8522310B1 (en) | Psychometric keycard for online applications | |
US11762900B2 (en) | Customized selection of video thumbnails to present on social media webpages | |
US20120173580A1 (en) | Event Feedback Networking System | |
US20210176519A1 (en) | System and method for in-video product placement and in-video purchasing capability using augmented reality with automatic continuous user authentication | |
EP4113413A1 (en) | Automatic purchase of digital wish lists content based on user set thresholds | |
Khan et al. | Smart TV-based lifelogging systems: current trends, challenges, and the road ahead | |
WO2016029232A2 (en) | System, apparatus, and method for interfacing between a user and an it environment | |
Koch | What Are You Looking at? Emergency Privacy Concerns with Eye Tracking in Virtual Reality | |
Lavi | Fordham Intellectual Property, Media and Entertainment Law Journal |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 15833161 Country of ref document: EP Kind code of ref document: A2 |
NENP | Non-entry into the national phase in: |
Ref country code: DE |
32PN | Ep: public notification in the ep bulletin as address of the addressee cannot be established |
Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC ( EPO FORM 1205A DATED 16/06/2017 ) |
122 | Ep: pct application non-entry in european phase |
Ref document number: 15833161 Country of ref document: EP Kind code of ref document: A2 |