IL266333A - System and method for socially relevant user engagement indicator in reality devices - Google Patents
System and method for socially relevant user engagement indicator in reality devices
- Publication number
- IL266333A
- Authority
- IL
- Israel
- Prior art keywords
- user
- indicative
- activity
- data
- indicator
- Prior art date
Links
- 238000000034 method Methods 0.000 title claims description 52
- 230000000694 effects Effects 0.000 claims description 87
- 238000012544 monitoring process Methods 0.000 claims description 38
- 238000012545 processing Methods 0.000 claims description 25
- 238000007654 immersion Methods 0.000 claims description 17
- 230000003190 augmentative effect Effects 0.000 claims description 14
- 230000003213 activating effect Effects 0.000 claims description 8
- 230000000007 visual effect Effects 0.000 claims description 7
- 230000002123 temporal effect Effects 0.000 claims description 3
- 238000004891 communication Methods 0.000 description 5
- 238000004590 computer program Methods 0.000 description 5
- 230000004913 activation Effects 0.000 description 4
- 230000006870 function Effects 0.000 description 4
- 239000011521 glass Substances 0.000 description 4
- 238000010586 diagram Methods 0.000 description 3
- 210000003128 head Anatomy 0.000 description 2
- 230000003287 optical effect Effects 0.000 description 2
- 230000008447 perception Effects 0.000 description 2
- 239000004984 smart glass Substances 0.000 description 2
- 230000004397 blinking Effects 0.000 description 1
- 239000003086 colorant Substances 0.000 description 1
- 238000004883 computer application Methods 0.000 description 1
- 210000005069 ears Anatomy 0.000 description 1
- 230000004424 eye movement Effects 0.000 description 1
- 230000003993 interaction Effects 0.000 description 1
- 238000013507 mapping Methods 0.000 description 1
- 239000004065 semiconductor Substances 0.000 description 1
- 230000005236 sound signal Effects 0.000 description 1
- 230000002747 voluntary effect Effects 0.000 description 1
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/013—Eye tracking input arrangements
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/017—Head mounted
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/017—Head mounted
- G02B27/0172—Head mounted characterised by optical features
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/016—Input arrangements with force or tactile feedback as computer generated output to the user
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
- G06T19/006—Mixed reality
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/20—Scenes; Scene-specific elements in augmented reality scenes
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0101—Head-up displays characterised by optical features
- G02B2027/014—Head-up displays characterised by optical features comprising information/image processing systems
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/017—Head mounted
- G02B2027/0178—Eyeglass type
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2219/00—Indexing scheme for manipulating 3D models or images for computer graphics
- G06T2219/024—Multi-user, collaborative environment
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Human Computer Interaction (AREA)
- Multimedia (AREA)
- Computer Graphics (AREA)
- Computer Hardware Design (AREA)
- Software Systems (AREA)
- Optics & Photonics (AREA)
- Signal Processing (AREA)
- User Interface Of Digital Computer (AREA)
Description
SYSTEM AND METHOD FOR SOCIALLY RELEVANT USER ENGAGEMENT INDICATOR IN REALITY DEVICES

TECHNOLOGICAL FIELD

The present invention is in the field of wearable/head-mounted image projection systems adapted for projecting images to a user's eye(s) to provide an experience of augmented or virtual reality. More specifically, it relates to a technique for providing a user engagement indicator for augmented reality (AR).
BACKGROUND ART

A reference considered to be relevant as background to the presently disclosed subject matter is listed below:
- US Patent 8,531,590

Acknowledgement of the above reference herein is not to be inferred as meaning that it is in any way relevant to the patentability of the presently disclosed subject matter.
BACKGROUND

Head mounted or otherwise wearable image projection systems for projecting virtual and/or augmented virtual reality to a user's eye(s) are becoming increasingly popular. Such systems are in many cases configured as glasses mountable onto a user's head and operable for projecting images to the user's eyes to provide virtual reality image/video projection to the user. To this end, certain of the known systems are aimed at providing pure virtual reality image projection, in which light from the external scenery is blocked from reaching the eye(s). Other systems are directed at providing augmented virtual reality perception, in which light from the external scenery is allowed to pass to the eyes while also being augmented/superposed by images/video frames projected to the eyes by the image projection system.
Augmented Reality (AR) provides instant, ubiquitous and discreet access to digital content, as well as easy and intuitive collaboration on it. Wearable AR devices are therefore becoming more widely adopted. However, a person (or several people) interacting with virtual content that is invisible to others in a public or social setting can create socially uncomfortable situations. This contrasts with currently common situations, in which the visibility of the objects being interacted with (e.g. a book, laptop, smartphone or headphones) and the clearly visible focus of the user on them (gaze pointed at a screen, ears covered by headphones, etc.) give onlookers a clear indication of what the user is interacting with.
GENERAL DESCRIPTION

There is a need in the art for a technique capable of avoiding such uncomfortable situations. To this end, the present invention provides a new technique for an AR user engagement indicator. This can be implemented by using external indicator(s) configured and operable to show (i.e. clearly signal) to the onlooker(s) socially relevant information about the user's activity in AR. Although, throughout the following description, the term "onlooker" is used in the singular form, the number of persons in the user's environment is not a part of the invention and the invention is not limited to any number of onlookers. The engagement indicator device is thus operable to indicate to the surroundings whether the user's engagement/attention is given to real content or to virtual content provided to him via the AR system.
Therefore, according to a broad aspect of the present invention, there is provided an indicator device of user activity in Augmented Reality (AR). The indicator device comprises an indicator module configured and operable to generate at least one signal indicative of at least one social relevant parameter related to the user activity to at least one of the user's onlookers, and a monitoring module connectable to the indicator module and configured and operable to receive and process data related to user activity, determine whether the user is in a certain condition corresponding to a certain social relevant parameter, and activate the indicator module upon identification of the certain user condition. Data related to user activity may include data indicative of current user activity and/or reference data patterns indicative of at least one user activity and/or data indicative of virtual content presented to the user and/or data related to the current real-world environment of the user and/or data indicative of the gaze direction of the user over time. The indicator device may be a stand-alone device or may be configured to be integrated in any wearable Augmented/Virtual Reality system. For example, the indicator device may be a complete and separate subsystem aimed at providing information to onlookers. A head mounted or otherwise wearable image projection system, projecting virtual and/or augmented virtual reality to the user's eye(s), may be used with the novel indicator device of the present invention. Such systems may be configured as glasses mountable onto a user's head. The invention is not limited to the use of pure virtual reality image projection to the user's eyes, in which light from external scenery is blocked from reaching the eye(s). The invention is also not limited to the use of augmented virtual reality perception, in which light from the external scenery is allowed to pass to the eyes while also being augmented/superposed by images/video frames projected to the eyes by the image projection system.
The social relevant parameter may be a recording activity and/or an immersion condition and/or a focusing condition and/or a space activity. The indicator module may be a visual indicator module and/or an auditory indicator module.
In some embodiments, the indicator module is operable in at least one of on/off, gradual operating mode and operating mode being indicative of a certain social parameter. The gradual operating mode may be indicative of a level of user condition.
In some embodiments, the monitoring module is configured and operable to process the data related to user activity to determine a certain user condition.
In some embodiments, the monitoring module is configured and operable to process data indicative of current user activity to determine the recording activity of the user.
In some embodiments, the monitoring module is configured and operable to receive data being indicative of temporal patterns of the gaze direction of the user, process the data to identify a certain pattern, compare the identified pattern with one or more reference patterns indicative of at least one user activity to determine a matching pattern, and upon determining the matching pattern, activate the indicator module accordingly.
In some embodiments, the monitoring module is configured and operable to identify a correspondence between 3D position of the virtual content and the gaze direction of the user.
In some embodiments, the indicator module is configured and operable to generate a plurality of signals, each being indicative of different types of user activities.

According to another broad aspect of the present invention, there is provided a method for indicating AR user activity. The method comprises receiving and processing data related to user activity; identifying whether the user is in a certain user condition corresponding to a certain social relevant parameter; and generating at least one signal being indicative of at least one social relevant parameter related to a user activity, to an onlooker. The procedure of generating the signal may comprise activating at least one of the following operating modes: on/off, gradual and operating mode being indicative of a specific social parameter. This may be implemented by generating visual or auditory signal(s).
In some embodiments, the method further comprises activating an indicator device upon identification of a certain user condition.
In some embodiments, the method further comprises determining the type of user activity.
In some embodiments, the method further comprises generating at least one signal being indicative of the type of user activity.
In some embodiments, processing data indicative of virtual content presented to the user comprises determining an amount of user's field of view obstructed by virtual content.
In some embodiments, the processing procedure comprises identifying in the data a certain pattern, comparing the identified pattern with one or more reference patterns indicative of at least one user activity to determine a matching pattern, and, upon determining the matching pattern, generating at least one signal corresponding to a certain social parameter.
In some embodiments, the processing procedure comprises anticipating the extent of user movement, based on the data indicative of virtual content presented to the user and on reference data patterns indicative of at least one user activity, to determine a space condition related to an activity that may cause the user to suddenly move without awareness of their surroundings, referred to hereinafter as a "space activity".
In some embodiments, the processing procedure comprises identifying a correspondence between 3D position of the virtual content and the gaze direction of the user.
BRIEF DESCRIPTION OF THE DRAWINGS

In order to better understand the subject matter that is disclosed herein and to exemplify how it may be carried out in practice, embodiments will now be described, by way of non-limiting example only, with reference to the accompanying drawings, in which:

Fig. 1 is a schematic block diagram illustrating the main functional elements of the device of the present invention;

Fig. 2 is a schematic flow chart illustrating the main procedures of the method of the present invention; and

Fig. 3 is a possible illustration of a specific configuration of the device according to some embodiments of the present invention.
DETAILED DESCRIPTION OF EMBODIMENTS

The present invention is aimed at preventing socially embarrassing situations related to AR activity. The following are examples of use cases where an unclear AR activity leads to a social faux pas (in all of these, Alice has her AR glasses on; Bob does not):

● Alice and Bob attend a party; Alice is looking at Bob while simultaneously chatting online with her friend. Bob is concerned because he thinks that Alice might be taking pictures of him and posting them online, but is too polite to ask.
● Bob greets Alice, but she does not react, despite looking straight at Bob. Bob is confused; in fact, Alice does not see or hear him at all, because she is watching a concert in AR, and it completely occludes her vision.
● On a train, Alice stares intently at Bob’s chest; Bob is confused and slightly unnerved. In fact, Alice is checking her emails, and the email widget happens to lie in her line of sight, in the direction of Bob’s chest.
● Alice and Bob are talking; Alice glances every now and again at the door; every time she does that, Bob automatically looks there as well, expecting someone to come in; in fact, Alice is reading the messages she receives from her sister.
The comfortable form factor of AR devices makes it possible to wear them even when they are not being actively used (in contrast with, e.g., headphones nowadays). Therefore, the social message conveyed by wearing headphones ("I visibly have headphones on; therefore, I cannot hear you") cannot be translated into "I have AR glasses on; therefore, I cannot see you". In addition, the social implications of a minimal AR heads-up display (e.g. a timepiece at the edge of someone’s vision) are very different from those of a fully immersive surround experience that essentially leaves the user blind to the outside world. The present invention addresses both of these needs. In order to avoid the unpleasant situations described in the above examples, the present invention allows the AR device to clearly signal socially relevant information about the user’s AR activity to onlookers, without compromising the user’s privacy.
Reference is made to Fig. 1, illustrating, by way of a block diagram, an indicator device of user activity in Augmented/Virtual Reality (AR). The indicator device is configured for reflecting a user’s activity in the user’s AR device. The "AR device" generally refers to a device capable of displaying visual content that is visible to its user only, e.g. smart glasses. The indicator device 100 comprises one or more indicator modules 102 being configured and operable to generate at least one signal (e.g. a visual or auditory signal) being indicative of at least one social relevant parameter related to the user activity to at least one user's onlooker, and a monitoring module 104 connectable to the indicator module 102 and being configured and operable to receive and process data related to user activity, determine whether the user is in a certain condition corresponding to a certain social relevant parameter, and activate the indicator module upon identification of the certain user condition. Data related to user activity may comprise data indicative of current user activity and/or reference data patterns indicative of at least one user activity and/or data indicative of virtual content presented to the user and/or data related to the current real-world environment of the user and/or data indicative of the gaze direction of the user over time. The condition of the user refers to a situation in which the user interacts with AR content (e.g. identification of the presence, type and 3D position of AR content) and cannot interact with the environment, and/or a situation in which the user performs a certain activity, and/or a situation in which the user interacts with AR content at a certain level of engagement. The certain activity includes a recording activity and engagement in a "space activity". The certain level of engagement includes a certain level threshold when interacting with any AR content (e.g. time duration, volume or field of view).

In this connection, it should be understood that monitoring module 104 receives certain data and processes this data in different ways in order to identify different problematic social relevant parameters. The social relevant parameters refer to different types of information communicated to the onlookers via different signals. They may be indicative of a recording activity and/or an immersion condition and/or a focusing condition and/or a space activity. Indicator module 102 generates at least one notification signal being indicative of a social relevant parameter.
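Purely by way of non-limiting illustration (the invention does not prescribe any particular software structure), the relationship between the monitoring module, the indicator module and the social relevant parameters could be sketched as follows; all class, field and function names here are hypothetical assumptions.

```python
from dataclasses import dataclass, field
from enum import Enum, auto
from typing import Callable, Dict, List, Optional


class SocialParameter(Enum):
    """Social relevant parameters signalled to onlookers."""
    RECORDING = auto()   # recording audio/video/images for personal use
    IMMERSION = auto()   # user's senses are taken up by digital content
    FOCUSING = auto()    # user is focused on / interacting with virtual content
    SPACE = auto()       # user may suddenly move without awareness of surroundings


@dataclass
class ActivityData:
    """Data related to user activity, as received by the monitoring module."""
    running_apps: List[dict] = field(default_factory=list)     # current applications and permissions
    virtual_content: List[dict] = field(default_factory=list)  # AR content items and their 3D positions
    gaze_samples: List[tuple] = field(default_factory=list)    # gaze direction over time
    real_world_map: Optional[dict] = None                      # current real-world environment


class IndicatorModule:
    """Generates signals perceivable by onlookers (visual and/or auditory)."""

    def activate(self, parameter: SocialParameter, level: float = 1.0) -> None:
        # A real device would drive LEDs or a speaker; printing stands in here.
        print(f"indicator on: {parameter.name} (level {level:.2f})")

    def deactivate(self, parameter: SocialParameter) -> None:
        print(f"indicator off: {parameter.name}")


class MonitoringModule:
    """Processes activity data and activates the indicator on a detected condition."""

    def __init__(self, indicator: IndicatorModule,
                 detectors: Dict[SocialParameter, Callable[[ActivityData], float]]):
        self.indicator = indicator
        self.detectors = detectors  # each detector returns a level in [0, 1]

    def update(self, data: ActivityData) -> None:
        for parameter, detect in self.detectors.items():
            level = detect(data)
            if level > 0.0:
                self.indicator.activate(parameter, level)
            else:
                self.indicator.deactivate(parameter)
```

In such a sketch, a concrete system would register one detector per social relevant parameter and call `update()` whenever fresh activity data arrives.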
More specifically, the recording activity informs the environment that the user is recording audio, video or images for personal use, disregarding the continuous ongoing recording done for the sake of normal device functioning (listening for voice commands, visual positioning, etc.). In this way, upon activation of the recording signal, the onlooker is aware that, if he is within the user's field of view, the user is recording data in which he appears. If desired, the onlooker can move out of the user's field of view so as not to appear in the frame, or communicate with the user. Conversely, if the recording signal is off (i.e. not on), the onlooker can reasonably assume that he is not being recorded. Each of monitoring module 104 and indicator module 102 may comprise a communication utility to send and receive the notification data, respectively. More specifically, monitoring module 104 processes data related to currently running applications available to the operating system (e.g. their permissions) and determines whether the recording condition is identified (i.e. whether a ‘recording for personal use’ is going on). In some embodiments, the recording signal is the strongest signal, visible even from the periphery, because it communicates a voluntary privacy intrusion on the part of the user. For example, a blinking red light that is highly visible and attention-grabbing may be used.
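As a minimal sketch of such a check, assuming the operating system exposes the currently running applications and their permissions as plain dictionaries (the field names below are hypothetical):

```python
def detect_personal_recording(running_apps):
    """Return True if any running application is recording audio, video or
    images for personal use; recording done only for normal device
    functioning (voice-command listening, visual positioning) is ignored.

    `running_apps` is assumed to be a list of dicts such as
    {"name": "camera", "records_av": True, "system_service": False}.
    """
    return any(
        app.get("records_av", False) and not app.get("system_service", False)
        for app in running_apps
    )


# Example: the system voice assistant does not trigger the signal,
# but a camera application recording for personal use does.
apps = [
    {"name": "voice_assistant", "records_av": True, "system_service": True},
    {"name": "camera", "records_av": True, "system_service": False},
]
print(detect_personal_recording(apps))  # -> True
```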
The immersion condition informs the environment that the user’s senses are taken up by digital content. The immersion signal may come next after the recording signal in order of importance. It needs to be visible to onlooker(s) desiring to establish communication with the user. For example, for activating the immersion condition, the monitoring module may proceed as follows: receiving and processing AR content data and user location data to determine how much of the user’s field of view is obscured by AR content; and, upon identification of a certain threshold (e.g. 50%) of the obscured field of view, activating the indicator module. In some embodiments, the immersion condition may be displayed in a scaled manner, indicative of how much of the user’s senses (vision and hearing separately) are taken up by digital content, i.e. how immersed the user is in AR or, conversely, how aware the user is of their real-world surroundings. In this way, upon activation of the immersion condition signal, the onlooker is aware that the user is only partially aware, or not aware at all, of the real world surrounding him. For example, the gradual scale makes it possible to distinguish between ‘no AR’, ‘a bit of AR’, ‘almost full immersion’ and ‘full VR’. A plurality of light modules may be successively activated, corresponding to different amounts of the obscured field of view.
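One way to turn the obscured field-of-view fraction into the gradual scale described above is sketched here; the 50% figure is the example threshold from the text, while the other break points are illustrative assumptions only.

```python
def immersion_level(obscured_fraction):
    """Map the fraction of the user's field of view obscured by AR content
    (0.0-1.0) to a discrete level for a gradual immersion indicator:
    0 - no AR, 1 - a bit of AR, 2 - more than half obscured, 3 - full VR.
    """
    if obscured_fraction <= 0.0:
        return 0
    if obscured_fraction < 0.5:
        return 1
    if obscured_fraction < 0.95:
        return 2
    return 3


# Example: successively activating one light module per level.
for fraction in (0.0, 0.2, 0.6, 1.0):
    print(f"{fraction:.0%} obscured -> {immersion_level(fraction)} unit(s) lit")
```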
The focusing condition informs the environment that the user is currently focused on, or interacting with, virtual content. In this way, upon activation of the focusing condition signal, the onlooker is aware that the user is occupied and would not be attentive to any interaction with the environment, even if his field of view is not obstructed by the virtual content. The focusing condition signal may be more subtle than the recording and immersion signals. The focusing condition may be used when the user and the onlooker are already in communication, or to prevent miscommunication between them.
For example, to activate the focusing condition signal, the monitoring module may proceed as follows: receiving and processing eye tracking data to determine the gaze direction and 3D gaze point; receiving and processing data from world mapping in conjunction with AR content data to determine what the user is looking at; if the user is looking at AR content, performing an eye-movement pattern analysis to infer the user's type of activity and context; determining whether the user is currently actively engaged with the AR content he is looking at; and, upon identification of the user's current active engagement with the AR content, activating the indicator module.
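A simplified sketch of the gaze-versus-content correspondence check follows, assuming gaze data is available as an origin point and a direction vector and each AR content item carries a 3D position; the geometry, tolerance and names are assumptions for illustration, not a prescribed implementation.

```python
import math


def gazed_content(gaze_origin, gaze_direction, content_items, tolerance=0.15):
    """Return the first AR content item whose 3D position lies within
    `tolerance` metres of the gaze ray, or None if the user is looking
    at the real world.  Each item is assumed to be a dict with a
    "position" key holding (x, y, z) coordinates in the user's frame.
    """
    ox, oy, oz = gaze_origin
    dx, dy, dz = gaze_direction
    norm = math.sqrt(dx * dx + dy * dy + dz * dz)
    dx, dy, dz = dx / norm, dy / norm, dz / norm
    for item in content_items:
        px, py, pz = item["position"]
        # Distance along the gaze ray to the point closest to the content.
        t = (px - ox) * dx + (py - oy) * dy + (pz - oz) * dz
        if t <= 0:
            continue  # content lies behind the user
        closest = (ox + t * dx, oy + t * dy, oz + t * dz)
        if math.dist((px, py, pz), closest) <= tolerance:
            return item
    return None


# Example: an email widget 1.2 m straight ahead is hit by a forward gaze.
widget = {"name": "email_widget", "position": (0.0, 0.0, 1.2)}
print(gazed_content((0, 0, 0), (0, 0, 1), [widget]))  # -> the widget dict
```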
The space activity informs the onlookers that the user is engaged in an activity that may cause the user to suddenly move without awareness of their surroundings. In this way, upon activation of the space activity signal, the onlooker is warned that the user may suddenly move. In some embodiments, the space signal is among the strongest signals, perceivable even from the periphery, because it communicates possible physical harm from the user to the onlooker.
Indicator module 102 generates at least one signal perceivable (e.g. visible or audible) by person(s) other than the user, without need for special equipment.
More specifically, indicator module 102 may comprise a plurality of different light modules generating different signals, or one light module (e.g. small LED(s)) generating different light signals. The signals may be of different colors and/or may be displayed by units having different designs (e.g. shapes) and/or may be binary (on/off) and/or may be of different intensity (e.g. brightness). If the signals are binary, they may have different frequencies (i.e. different on/off periods). Additionally or alternatively, indicator module 102 may comprise a speaker generating an audible sound signal. The different elements of the device communicate with each other and need not be placed in the same environment. Each signal is indicative of a different social parameter identifiable by the onlookers. Additionally, the signals may be indicative of different types of activity (e.g. listening to music, watching video, talking to someone). For example, a specific color may be associated with a specific type of activity. The different signals may be gradual according to the level of the user condition. To this end, monitoring module 104 processes the received data to determine a proportion of the field of view occupied by the virtual activity. For example, in the immersion or focusing condition, the level of immersion or focusing may be identified and noticed by the onlookers.
In general, monitoring module 104 may be a processor, a controller, a microcontroller or any kind of integrated circuit. Monitoring module 104 is configured generally as a computing/electronic utility including, inter alia, such utilities as data input and output modules/utilities 104A and 104D, memory 104C (i.e. a non-volatile computer readable medium), and an analyzer/data processing utility 104B. The utilities of the monitoring module 104 may thus be implemented by suitable circuitry and/or by software and/or hardware components including computer readable code configured for implementing the identification of the user condition. The features of the present invention may comprise a general-purpose or special-purpose computer system including various computer hardware components. Features within the scope of the present invention also include computer-readable media for carrying or having computer-executable instructions, computer-readable instructions, or data structures stored thereon.
Such computer-readable media may be any available media which are accessible by a general-purpose or special-purpose computer system. By way of example, without limitation, such computer-readable media can comprise physical storage media such as RAM, ROM, EPROM, flash disk, CD-ROM or other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other media which can be used to carry or store desired program code means in the form of computer-executable instructions, computer-readable instructions, or data structures and which may be accessed by a general-purpose or special-purpose computer system. Computer-readable media may include a computer program or computer application downloadable to the computer system over a network, such as a wide area network (WAN), e.g. the Internet. In this description and in the following claims, a "monitoring module" is defined as one or more software modules, one or more hardware modules, or combinations thereof, which work together to perform operations on electronic data. For example, the definition of a processing utility includes the hardware components of a personal computer, as well as software modules, such as the operating system of a personal computer. The physical layout of the modules is not relevant. A computer system may include one or more computers coupled via a computer network. Likewise, a computer system may include a single physical device where internal modules (such as a memory and processor) work together to perform operations on electronic data. Monitoring module 104 may comprise a processor embedded therein, or attached thereto, running a computer program. The computer program product may be embodied in one or more computer readable medium(s) having computer readable program code embodied thereon. The computer readable medium may be a computer readable signal medium or a computer readable storage medium. Computer program code for carrying out operations for aspects of the present invention may be written in any combination of one or more programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the latter scenario, the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet Service Provider). These computer program instructions may be provided to the processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks. The specified functions of the processor can be implemented by special purpose hardware-based systems that perform the specified functions or acts, or by combinations of special purpose hardware and computer instructions.
The correspondence between the different signals and the different social relevant parameters may be predetermined according to the configuration of the AR system to be used with the indicator device of the present invention. Some examples of such correspondence are detailed below, for example as illustrated in Fig. 3. For example, a table of correspondence between the different signals and the different social relevant parameters may be stored in a database. Such a table may be stored in memory 104C. Alternatively, storage may be separate from the server(s) (e.g. SAN storage). If separate, the storage may be in one physical location, or in multiple locations connected through any type of wired or wireless communication infrastructure. The database may rely on any kind of methodology or platform for storing digital data. The database may include, for example, traditional SQL databases such as Oracle and MS SQL Server, file systems, Big Data, NoSQL, in-memory database appliances, parallel computing (e.g. Hadoop clusters), etc. If memory 104C is configured as the storage medium of the database, it may include any standard or proprietary storage medium, such as magnetic disks or tape, optical storage, semiconductor storage, etc.
Monitoring module 104 may be placed in the environment (e.g. integrated in any desired device) or, alternatively, a server not located in the proximity of the environment may receive the data and/or identify a certain condition in the data and send a notification. The server is in data communication with the different parts of the device.
For example, to identify the recording activity, monitoring module 104 may receive and process recorded data of the user's surroundings to determine whether the user is recording the environment. Monitoring module 104 is configured to check whether a permission for recording this data exists and, upon identification of such permission, monitoring module 104 generates a recording signal to the onlookers and controls access to and storage of this recorded data. Monitoring module 104 is thus configured to differentiate between data for the operating system's use and data for personal use, and to grant an application permission to record personal data together with an indication to the surroundings that the user is recording for his personal use. For example, to identify the immersion condition of the user, monitoring module 104 may receive and process data indicative of virtual content presented to the user and determine (e.g. calculate) the amount of the user's field of view obstructed by virtual content. If the amount of the user's field of view obstructed by virtual content is above a certain threshold, monitoring module 104 determines that the user is in an immersion condition.
For example, to identify the focusing condition, in addition to the gaze data, monitoring module 104 may receive and process data on virtual content currently presented to the user (e.g. what is shown and where) and on the current real-world environment of the user (e.g. what is there and where it is located relative to the user). Monitoring module 104 processes the received data to determine what the user is looking at (e.g. a real person, or a virtual window in the same gaze direction).
For example, to identify the space activity condition, monitoring module 104 may receive and process data on current virtual activity to anticipate user movement.
More specifically, monitoring module 104 may process data indicative of temporal patterns of the gaze direction of the user to identify a certain pattern, and compare the identified pattern with one or more reference patterns indicative of various respective user activities (e.g. reading, staring, gaming or others) to determine a matching pattern. Upon determining a matching pattern, the monitoring module determines whether the identified pattern corresponds to the user's engagement with virtual content presented to him, and activates the indicator module accordingly.
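A toy sketch of such pattern matching is given below, reducing a gaze trace to two summary statistics and picking the nearest reference pattern; the reference values, feature choice and activity labels are assumptions made purely for illustration.

```python
import math


def gaze_features(gaze_angles):
    """Summarise a temporal gaze trace (horizontal angles, in degrees) by the
    mean and standard deviation of successive angle changes."""
    steps = [b - a for a, b in zip(gaze_angles, gaze_angles[1:])]
    mean = sum(steps) / len(steps)
    std = math.sqrt(sum((s - mean) ** 2 for s in steps) / len(steps))
    return mean, std


# Hypothetical reference patterns: (mean step, step spread) per activity.
REFERENCE_PATTERNS = {
    "reading": (0.5, 1.5),  # small rightward steps with periodic return sweeps
    "staring": (0.0, 0.1),  # almost no movement
    "gaming":  (0.0, 3.0),  # large erratic movements around a centre point
}


def match_activity(gaze_angles):
    """Return the reference activity whose pattern best matches the trace."""
    observed = gaze_features(gaze_angles)
    return min(
        REFERENCE_PATTERNS,
        key=lambda name: math.dist(observed, REFERENCE_PATTERNS[name]),
    )


# Example: a reading-like trace (steady steps with one return sweep).
print(match_activity([0.0, 0.9, 1.7, 2.6, 3.4, 0.1, 1.0, 1.9]))  # -> "reading"
```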
Certain processing steps are described and a particular order of processing steps is disclosed; however, the sequence of steps is not limited to that set forth herein and may be changed as is known in the art, with the exception of steps or acts necessarily occurring in a certain order. Reference is made to Fig. 2, showing a flow chart illustrating a method for indicating AR user activity. The flow chart, or any process or method described herein in other manners, may represent a module, segment, or portion of code that comprises one or more executable instructions to implement the specified logic function(s) or steps of the process. Although the flow chart shows a specific order of execution, it is understood that the order of execution may differ from that which is depicted. For example, the order of execution of two or more boxes may be scrambled relative to the order shown. Method 200 comprises receiving and processing data related to user activity in 202; identifying whether the user is in a certain user condition corresponding to a certain social relevant parameter in 204; and generating at least one signal being indicative of at least one social relevant parameter related to a user activity to an onlooker in 206. The signal may be transmitted to an indicator device being activated upon receipt of this signal in 208. Receiving and processing data related to user activity in 202 may comprise determining the type of user activity in 210 (e.g. listening to music, watching video, talking to someone).
For example, to identify the immersion condition of the user, receiving and processing data related to user activity in 202 may also comprise processing data indicative of virtual content presented to the user to determine an amount of user's field of view obstructed by virtual content in 212 and/or a correspondence between a 3D position of the virtual content and the gaze direction of the user in 216.
For example, to identify the space condition of the user, receiving and processing data related to user activity in 202 may also comprise identifying the type of activity in 210 (e.g. gaming) and anticipating the extent of user movement in 214, based on the data indicative of virtual content presented to the user and on reference data patterns indicative of at least one user activity, to determine a space condition.
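A sketch of such anticipation is shown below, combining a per-activity movement estimate with the horizontal spread of the presented virtual content; the activity classes, radii and threshold are illustrative assumptions only.

```python
import math

# Hypothetical expected movement radius (metres) per recognised activity type.
MOVEMENT_BY_ACTIVITY = {
    "reading": 0.1,
    "watching_video": 0.1,
    "gaming": 1.5,  # a room-scale game: the user may lunge or step suddenly
}


def is_space_activity(activity_type, content_positions, threshold_m=0.75):
    """Return True when the anticipated extent of user movement exceeds
    `threshold_m`, i.e. the user may suddenly move without awareness of
    their surroundings.  `content_positions` holds (x, y, z) positions of
    virtual content relative to the user; their horizontal spread is used
    as a second estimate of how far the user may reach or step.
    """
    base = MOVEMENT_BY_ACTIVITY.get(activity_type, 0.0)
    spread = max((math.hypot(x, z) for x, _, z in content_positions), default=0.0)
    return max(base, spread) > threshold_m


# Example: a game with content 2 m away warns onlookers; reading does not.
print(is_space_activity("gaming", [(1.0, 0.0, 2.0)]))   # -> True
print(is_space_activity("reading", [(0.2, 0.0, 0.5)]))  # -> False
```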
Reference is made to Fig. 3, illustrating a possible configuration 300 of the indicator device of the present invention. The three inset figures show different possible operating modes of indicator module 302. However, the invention is not limited to such operating modes. Although in the figure monitoring module 304 is illustrated as being integrated in the AR wearable system (smart glasses), the invention is not limited to such a configuration, and monitoring module 304 may be placed in an environment distant from indicator module 302. Indicator module 302 and monitoring module 304 communicate via wires or wirelessly. In this specific and non-limiting example, indicator module 302 comprises a plurality of units, each having a different number of symbols of a different shape: one circle, three rectangles and four curved lines. Each symbol may generate a different color. As illustrated in the figure, for example, for indicating a recording activity, the round red light may blink. For indicating an immersion condition, the blue rectangles may show how immersed the user is (e.g. none - no visible AR content; one rectangle - some; two - more than half of the user’s field of view obscured by AR content; three - user is in full VR mode). The same blue rectangles may also show whether the user is interacting with AR by temporarily becoming brighter (those that are on). For indicating a focusing condition such as headphone activity, the green curvy lights may show the headphone activity, with brighter light meaning louder sound.
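The example of Fig. 3 can be summarised as a small configuration table; only the red circle / blue rectangles / green curved lines assignments follow the figure description, while the field names and the rendering below are assumptions.

```python
# Illustrative configuration corresponding to the Fig. 3 example.
FIG3_INDICATORS = {
    "recording": {"symbol": "circle",       "colour": "red",   "mode": "blink"},
    "immersion": {"symbol": "rectangles",   "colour": "blue",  "mode": "count"},       # 0-3 lit units
    "focusing":  {"symbol": "curved lines", "colour": "green", "mode": "brightness"},  # louder -> brighter
}


def render_immersion(level, units=3):
    """Render the 0-3 immersion level as lit/unlit rectangles, as in Fig. 3."""
    return "".join("#" if i < level else "." for i in range(units))


for level in range(4):
    print(f"immersion level {level}: [{render_immersion(level)}]")
```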
Claims (26)
1. An indicator device of user activity in Augmented Reality (AR) comprising: an indicator module being configured and operable to generate at least one signal being indicative of at least one social relevant parameter related to the user activity to at least one user's onlooker; and a monitoring module connectable to said indicator module and being configured and operable to receive and process data related to user activity; determine whether the user is in a certain condition corresponding to a certain social relevant parameter; and activate said indicator module upon identification of the certain user condition.
2. The indicator device of claim 1, wherein data related to user activity comprises at least one of: data indicative of current user activity, reference data patterns indicative of at least one user activity, data being indicative of virtual content presented to the user, data related to current real-world environment of the user or data indicative of the gaze direction of the user over time.
3. The indicator device of claim 1 or 2, wherein the at least one social relevant parameter comprises at least one of a recording activity, an immersion condition, a focusing condition and a space activity.
4. The indicator device of any one of claims 1 to 3, wherein said indicator module is operable in at least one of on/off, gradual operating mode and operating mode being indicative of a certain social parameter.
5. The indicator device of claim 4, wherein said gradual operating mode is indicative of a level of user condition.
6. The indicator device of any one of claims 1 to 5, wherein said indicator module is a visual indicator module.
7. The indicator device of any one of claims 1 to 6, wherein said indicator module is an auditory indicator module.
8. The indicator device of any one of claims 1 to 7, wherein said monitoring module is configured and operable to process data related to user activity to determine a certain user condition.
9. The indicator device of claim 8, wherein said monitoring module is configured and operable to process data indicative of current user activity to determine the recording activity of the user.
10. The indicator device of claim 8 or 9, wherein said monitoring module is configured and operable to process data being indicative of virtual content presented to the user and determine amount of user's field of view obstructed by virtual content.
11. The indicator device of any one of claims 1 to 10, wherein said monitoring module is configured and operable to receive data being indicative of temporal patterns of the gaze direction of the user, process said data to identify a certain pattern, compare the identified pattern with one or more reference patterns indicative of at least one user activity to determine a matching pattern, and, upon determining the matching pattern, activate the indicator module accordingly.
12. The indicator device of any one of claims 1 to 11, wherein said monitoring module is configured and operable to identify a correspondence between 3D position of the virtual content and the gaze direction of the user.
13. The indicator device of any one of claims 1 to 12, wherein said indicator module is configured and operable to generate a plurality of signals, each being indicative of different types of user activities.
14. The indicator device of any one of claims 1 to 13, configured to be integrated in a wearable Augmented Reality system.
15. A method for indicating AR user activity comprising: receiving and processing data related to user activity; identifying whether the user is in a certain user condition corresponding to a certain social relevant parameter; and generating at least one signal being indicative of at least one social relevant parameter related to a user activity to an onlooker.
16. The method of claim 15, further comprising activating an indicator device upon identification of a certain user condition.
17. The method of claim 15 or 16, wherein at least one social relevant parameter comprises at least one of a recording activity, an immersion condition, a focusing condition and a space activity.
18. The method of claim 15 or 16, further comprising determining the type of user activity.
19. The method of claim 18, further comprising generating at least one signal being indicative of the type of user activity.
20. The method of any one of claims 15 to 19, wherein said generating of the at least one signal comprises activating at least one of the following operating modes: on/off, gradual and operating mode being indicative of a certain social parameter.
21. The method of claim 20, wherein said gradual operating mode is indicative of a level of user condition.
22. The method of any one of claims 15 to 21, wherein said generating of the at least one signal comprises generating at least one visual or auditory signal.
23. The method of any one of claims 15 to 22, wherein said processing data related to user activity comprises processing data indicative of virtual content presented to the user and determining an amount of user's field of view obstructed by virtual content.
24. The method of claim 23, wherein said processing comprises identifying, in said data, a certain pattern, comparing the identified pattern with one or more reference patterns indicative of at least one user activity to determine a matching pattern, and, upon determining the matching pattern, generating at least one signal corresponding to a certain social parameter.
25. The method of claim 23 or 24, wherein said processing comprises anticipating extent of user movement based on said data being indicative of virtual content presented to the user to determine a space condition and on reference data patterns indicative of at least one user activity.
26. The method of any one of claims 23 to 25, wherein said processing data comprises identifying a correspondence between 3D position of the virtual content and the gaze direction of the user.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
IL266333A IL266333A (en) | 2020-05-10 | 2020-05-10 | System and method for socially relevant user engagement indicator in reality devices |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
IL266333A IL266333A (en) | 2020-05-10 | 2020-05-10 | System and method for socially relevant user engagement indicator in reality devices |
Publications (1)
Publication Number | Publication Date |
---|---|
IL266333A true IL266333A (en) | 2020-10-28 |
Family
ID=82324311
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
IL266333A IL266333A (en) | 2020-05-10 | 2020-05-10 | System and method for socially relevant user engagement indicator in reality devices |
Country Status (1)
Country | Link |
---|---|
IL (1) | IL266333A (en) |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20230333377A1 (en) | Display System | |
US10613330B2 (en) | Information processing device, notification state control method, and program | |
US20160054565A1 (en) | Information processing device, presentation state control method, and program | |
US11354805B2 (en) | Utilization of luminance changes to determine user characteristics | |
US20210081047A1 (en) | Head-Mounted Display With Haptic Output | |
CN113661477B (en) | Managing devices with additive displays | |
US20210326594A1 (en) | Computer-generated supplemental content for video | |
US20210326094A1 (en) | Multi-device continuity for use with extended reality systems | |
CN108140045A (en) | Enhancing and supporting to perceive and dialog process amount in alternative communication system | |
US12013977B2 (en) | System and method for socially relevant user engagement indicator in augmented reality devices | |
US20240000312A1 (en) | System and Method for Monitoring and Responding to Surrounding Context | |
IL266333A (en) | System and method for socially relevant user engagement indicator in reality devices | |
US12118646B1 (en) | Transitional effects in real-time rendering applications | |
US20240194049A1 (en) | User suggestions based on engagement | |
EP4312102A1 (en) | Wear detection | |
US11743215B1 (en) | Artificial reality messaging with destination selection | |
US20240296005A1 (en) | Cross-platform sharing of displayed content for electronic devices | |
US20230334765A1 (en) | Techniques for resizing virtual objects | |
US20230224348A1 (en) | Server and control method thereof | |
WO2023244515A1 (en) | Head-mountable device with guidance features | |
WO2023205096A1 (en) | Head-mountable device for eye monitoring |