WO2017012662A1 - A system for providing recommendation information for user device - Google Patents


Info

Publication number
WO2017012662A1
Authority
WO
WIPO (PCT)
Prior art keywords
user
data
module
dependent
virtual
Application number
PCT/EP2015/066780
Other languages
French (fr)
Inventor
Pan Hui
Yaofeng ZHANG
Dimitros CHATZOPOULOS
Weikai Li
Christoph Peylo
Original Assignee
Deutsche Telekom Ag
Application filed by Deutsche Telekom Ag filed Critical Deutsche Telekom Ag
Priority to PCT/EP2015/066780 (WO2017012662A1)
Priority to EP15750939.9A (EP3207468A1)
Publication of WO2017012662A1

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 16/00: Information retrieval; Database structures therefor; File system structures therefor
    • G06F 16/90: Details of database functions independent of the retrieved data types
    • G06F 16/95: Retrieval from the web
    • G06F 16/953: Querying, e.g. by the use of web search engines
    • G06F 16/9535: Search customisation based on user profiles and personalisation
    • G06F 16/9537: Spatial or temporal dependent retrieval, e.g. spatiotemporal queries

Abstract

Information recommendation system comprises cloud and user device. Cloud stores first virtual data of objects. Each object is associated with geographical information and user-independent/user-dependent attribute. Cloud selects second virtual data of objects based on device location, and transmits it to user device. User device comprises four modules. Data preparation module stores received second virtual data in local storage. Data preparation module detects device location change, informs cloud, and updates local data in response to new second virtual data. User-independent data module retrieves user-independent data from local storage. User-dependent data module retrieves user-dependent data from local storage, and filters/weights/ranks retrieved data. Weighting is performed based on user's preference. Ranking is performed based on weighting when there are multiple items found simultaneously. Display module receives data from data modules. Display module applies predefined policies to display virtual data objects, and ranks data when there are multiple items at the same time.

Description

A SYSTEM FOR PROVIDING RECOMMENDATION INFORMATION
FOR USER DEVICE
Technical Field
The present invention relates to an information recommendation system comprising a cloud and a user device for providing recommendation information to the user device, and in particular to a system for providing recommendation results by adopting user-independent and user-dependent data modules to improve user experience. The present invention also relates to a user device used in an information recommendation system. The present invention is applicable to a mobile augmented reality system.
Background
A recommendation system (or recommender system) filters unwanted or redundant information out of an information source by predicting the preferences of a user. Recommendation systems have been used extensively in recent years and have been applied to a large number of applications. For example, if a customer has bought a book from an online bookstore, the bookstore may use this information to predict the preferences of this customer, so books by the same author or in similar categories may be shown to attract the customer's interest. The same principle applies to movie, music, and news recommendations.
Augmented Reality (AR) most often refers to a field of computer research that focuses on combining real-world and computer-generated information. Elements in a real-time direct or indirect view of the physical, real-world environment are augmented by computer-generated information, which may include virtual objects such as video, sound and graphics. As a result, one's perception of reality is enhanced.
Mobile Augmented Reality (MAR) deals with a subset of problems in AR by limiting the scope of application to mobile devices, which provide additional technologies such as optical sensors (portable digital cameras etc.), inertial sensors (accelerometers, compasses, gyroscopes etc.), GPS and wireless network sensors. These additional technologies provide extra features to MAR application developers. Typical devices that support MAR may include smart glasses (e.g. Google Glass) and smart watches (e.g. Apple Watch). A smart glass is a wearable computing device in the form of eyeglasses, which typically includes an optical head-mounted display. A smart watch is a computerized wristwatch with functionality beyond timekeeping. Such devices may include features such as a camera, a screen, GPS navigation, etc. which enable them to run MAR applications. Mobile recommendation systems have been developed for mobile devices such as smart mobile phones. After years of development, smart phones such as the iPhone™ and the Samsung GALAXY series offer relatively high computing capability that enables such devices to run sophisticated algorithms.
However, a recommendation system specialized for MAR applications has not been studied yet. MAR applications differ from websites and ordinary mobile applications in that they provide richer sensory input, such as the user's eyeball focus and mood, and in that they imply a system whose recommendations should be specialized to reflect and improve the user's real-world experience. Conventional MAR systems employ goggles such as Google Glass, which enable a user to see the real world as well as computer-generated images on top of the real-world view. Consider the scenario of browsing a shopping center while wearing a smart glass. The number of virtual objects (shops, clothes, etc.) is enormous, while the screen space available for presenting them is limited. Such situations create an urgent demand for a recommendation system for MAR applications.
Accordingly, a new system for providing recommendations that can be used in MAR applications is needed.
Summary
According to one aspect of the present disclosure, an information recommendation system including a cloud and a user device (client device) is provided. The cloud includes at least a server and storage means. The cloud can be any number of remote servers that are networked to allow centralized data storage. The user device is a mobile (portable) device equipped with at least a display screen. The user device includes a data preparation module, a user-independent data module, a user-dependent data module and a display module. The data preparation module is configured to connect to the cloud, receive virtual object information selected based on a geographical location of the user device and store that information in a local storage of the user device. The user-independent data module is configured to search for information on virtual objects that are independent of the user and may be prioritized for display. The user-dependent data module is configured to search for and rank virtual object information that reflects user preference. The display module is configured to present, according to different display policies, recommendation information corresponding to the virtual objects.
More specifically, the cloud comprises: a) a remote storage for storing first virtual data of objects, each of said objects being associated with its geographical information and also being associated with an attribute indicating whether said object is user-independent or user-dependent; b) a selector for selecting, from said first virtual data, second virtual data of object or objects on the basis of a geographical location of the user device; and c) a transmitter for transmitting said second virtual data to the user device. The user device comprises: a) a local storage; b) a data preparation module for receiving said second virtual data from the cloud such that it is stored in the local storage as local virtual data, wherein the data preparation module is configured to detect a change of the geographical location of the user device, transmit a request including information on a new geographical location of the user device to the cloud, and update the local virtual data in response to newly received second virtual data from the cloud; c) a user-independent data module for retrieving user-independent data from the local storage, by referring to the attributes of objects of which local virtual data are stored; d) a user-dependent data module for i) retrieving user-dependent data from the local storage, by referring to the attributes of objects of which local virtual data are stored, and ii) filtering, weighting, and ranking said retrieved user-dependent data, said weighting being performed based on user's preference, said ranking being performed based on said weighting when there are multiple items, as user-dependent data, found simultaneously; and e) a display module for displaying virtual data objects. The display module is adapted to receive said user-independent data retrieved from the user-independent data module and said filtered, weighted, and ranked user-dependent data from the user-dependent data module, to apply predefined policies to display virtual data objects in terms of position and/or style, and to rank data when there are multiple items, as user-independent data and/or user-dependent data, available for presentation at the same time.
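As a rough illustration of this division of labour, the following Python sketch mirrors the cloud-side storage/selector/transmitter and the four client-side modules. All class names, method names and the dictionary-based object representation are assumptions introduced here for readability; they are not taken from the patent.

```python
# Illustrative sketch only; names and data layout are assumptions.
import math
from typing import Any, Dict, List, Tuple

Location = Tuple[float, float]        # (latitude, longitude) in degrees
VirtualObject = Dict[str, Any]        # schema discussed under "Classification" below

def distance_km(a: Location, b: Location) -> float:
    # Equirectangular approximation; sufficient for city-scale ranges.
    mean_lat = math.radians((a[0] + b[0]) / 2)
    dx = math.radians(b[1] - a[1]) * math.cos(mean_lat)
    dy = math.radians(b[0] - a[0])
    return 6371.0 * math.hypot(dx, dy)

class Cloud:
    """Remote storage + selector + transmitter, per the summary above."""
    def __init__(self, first_virtual_data: List[VirtualObject]):
        self.remote_storage = first_virtual_data

    def select_second_virtual_data(self, device_location: Location,
                                   radius_km: float) -> List[VirtualObject]:
        # Selector: keep only objects whose stored location lies within the range.
        return [obj for obj in self.remote_storage
                if distance_km(obj["location"], device_location) <= radius_km]

class UserDevice:
    """Hosts the four modules; each method stands in for one module."""
    def __init__(self, cloud: Cloud):
        self.cloud = cloud
        self.local_storage: List[VirtualObject] = []

    def data_preparation(self, location: Location, radius_km: float = 1.0) -> None:
        # Module 1: refresh the local virtual data when the location changes.
        self.local_storage = self.cloud.select_second_virtual_data(location, radius_km)

    def user_independent_data(self) -> List[VirtualObject]:
        # Module 2: objects whose significance does not depend on the user profile.
        return [o for o in self.local_storage if not o["user_dependent"]]

    def user_dependent_data(self) -> List[VirtualObject]:
        # Module 3: filtering/weighting/ranking would be applied to this subset.
        return [o for o in self.local_storage if o["user_dependent"]]

    def display(self, items: List[VirtualObject]) -> None:
        # Module 4: display policies would be applied here; we simply print.
        for obj in items:
            print(obj["name"])

cloud = Cloud([{"name": "Bus 42", "label": "bus schedule",
                "location": (50.111, 8.681), "user_dependent": False}])
device = UserDevice(cloud)
device.data_preparation((50.110, 8.680))
device.display(device.user_independent_data())   # prints "Bus 42"
```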
According to another aspect of the present disclosure, a user device used in such an information recommendation system is provided. Other aspects of the present disclosure can be understood by those skilled in the art in light of the description, the claims, and the drawings of the present disclosure.
It should be noted that the terms "virtual object information" and "virtual data of object" used in the present disclosure are interchangeable.
It should also be noted that although the present invention is preferably applied to MAR applications, the present invention works if the recommendation information and real-world information are not displayed in a superimposed manner. The above-mentioned problem to be solved by the present invention is not necessarily related to the basic feature of AR that real-world information is displayed at the same time.
Brief Description of the Drawings
FIG. 1 is a schematic view illustrating an information recommendation system according to an embodiment of the present invention;
FIG. 2 is a block diagram illustrating a cloud shown in FIG. 1;
FIG. 3 illustrates a table stored in a remote storage shown in FIG. 2, which defines a relationship between virtual objects and attributes;
FIG. 4 illustrates an exemplary classification method of virtual objects stored in the remote storage;
FIG. 5 illustrates exemplary preparatory steps for the information recommendation system;
FIG. 6 illustrates exemplary usage steps for the recommendation system in terms of how user-dependent data is processed;
FIG. 7 is a block diagram illustrating a user device shown in FIG. 1 ;
FIG. 8 is a block diagram illustrating a data preparation module shown in FIG. 7;
FIG. 9 is a block diagram illustrating a user-independent data module shown in FIG. 7;
FIG. 10 is a block diagram illustrating a user-dependent data module shown in FIG. 7;
FIG. 11 is a block diagram illustrating a display module shown in FIG. 7;
FIG. 12 is a flowchart illustrating an exemplary data preparation operation performed by the data preparation module shown in FIG. 8;
FIG. 13 is a flowchart illustrating an exemplary operation performed by the user-independent data module shown in FIG. 9;
FIG. 14 is a flowchart illustrating an exemplary operation performed by the user-dependent data module shown in FIG. 10;
FIG. 15 is a flowchart illustrating an exemplary display operation performed by the display module shown in FIG. 11;
FIG. 16 is a flowchart illustrating a display operation performed by an exemplary augmented reality system into which the information system according to an embodiment of the present invention has been integrated; and
FIG. 17 illustrates an example of how different pieces of virtual object information are displayed in the augmented reality system.
Detailed Description of the Preferred Embodiments
The innovation is now described with reference to the drawings, wherein like reference numerals are used to refer to like elements throughout. Hereinafter, preferred embodiments according to the present invention are described with reference to the drawings. For explanation purposes, various specific examples and details are set forth in order to provide a thorough understanding of the subject innovation. However, the examples presented are not exhaustive and various modifications can be implemented without departing from the scope defined in the claims.
INFORMATION RECOMMENDATION SYSTEM
Referring to FIG. 1, an information recommendation system 1 according to an embodiment of the present invention includes a cloud 2 and one or more user devices 3 (such as mobile phones) equipped with a screen. The information recommendation system 1 provides recommendation information to the user device 3, while the information is being updated depending on a geographical location of the user device 3.
CLOUD
As shown in FIG. 2, the cloud 2 comprises a remote storage 4, a processor 5, and a communication unit 6. The cloud 2 may be public, private, or hybrid. The remote storage 4 is used to store virtual object information ("first virtual data") on virtual objects in advance, which can be provided to the user device 3 for later processing thereof. Each virtual object is associated with its geographical information. As will be explained below in detail, by detecting the user device location, user-location-dependent virtual object information ("second virtual data") is provided to the user device 3.
The processor 5 is used to conduct computational tasks. The communication unit 6 is used to transfer data between the cloud 2 and the user device 3. More specifically, the processor 5 functions as a selector for selecting, from the original virtual object information ("first virtual data"), partial data thereof ("second virtual data") on the basis of a geographical location of the user device 3. The communication unit 6 functions as a transmitter for transmitting that user-location-dependent data to the user device 3.
CLASSIFICATION
It is to be noted that any piece of information that is presentable on the screen of the user device 3 (which will help to augment the user's reality in AR applications) can be under the category of "virtual object" (or "virtual data object" as defined in the claims). Examples of "virtual object" information are coffee shops, weather, bus routes, restaurants, hotels, etc. Each virtual object comes with a name, and preferably a label (in other words, a type or category). For example, Hilton (which represents a name) belongs to the label "hotel". Starbucks (which represents a name) belongs to the label "coffee shop". The top-level label can alternatively or additionally be divided into sub-labels. For example, a restaurant A for Chinese food and a restaurant B for Japanese food may belong to different labels.
Virtual objects are stored digitally in an organised manner in the remote storage 4, for example in the form of a table as shown in FIG. 3, which can be referred to as "database". Preferably, the remote storage 4 storing virtual objects is designed in a way such that it contains useful labels to users, including but not limited to tangible objects such as restaurants, hotels, shops, monuments, and/or intangible objects such as fire alarms, weather, bus routes/schedules. The species of labels can vary according to the designer of the information recommendation system (or database provider).
In more detail, referring to FIG. 4 showing the classification of virtual objects stored in the remote storage 4, all virtual objects are first divided into two categories: user-independent and user-dependent. Virtual objects whose significance is not determined by the user's profile are classified as "user-independent", and vice versa. For example, weather and bus schedules belong to user-independent objects, while hotel information and restaurant information reflect user's preferences and hence belong to user-dependent objects. Each virtual object, stored in the remote storage 4, is associated with an attribute indicating whether the virtual object is user-independent or user-dependent.
In one embodiment, within the user-independent category, virtual objects may be further divided into two categories: urgent and regular (i.e. not urgent). Urgency reflects a high degree of significance of a virtual object in terms of time. Urgent virtual objects should be presented in time or at once, such as nearby fire alarms or car accidents. Regular virtual objects may include bus schedules, weather, etc. The degree of urgency (i.e. whether to classify the object as urgent or regular) can vary according to the designer of the information recommendation system (or database provider). In this embodiment, each virtual object, stored in the remote storage 4, categorized as user-independent, is associated with an additional attribute indicating whether the virtual object is urgent or regular (not urgent).
In one embodiment, within the user-dependent category, virtual objects may be further divided into two categories: repeatable and non-repeatable. Repeatability reflects the willingness of a user, in general, to perform a similar activity within a short time period. For example, a user is probably willing to visit several shops within a short period of time, so "shops" (such as Puma; see FIG. 3) may belong to repeatable objects. On the other hand, a user is probably not willing to have another meal or coffee at a different place if he just finished one half an hour ago, thus "restaurants" or "coffee shops" (such as Starbucks; see FIG. 3) may belong to non-repeatable objects. In this embodiment, each virtual object, stored in the remote storage 4, categorized as user-dependent, is associated with an additional attribute indicating whether the virtual object is repeatable or not (i.e. non-repeatable).
As is apparent from the above, whether to classify the object as repeatable or not can vary depending on the designer of the information recommendation system (database provider). Only the idea of differentiation by repeatability is important in this innovation, so it is of less importance how exactly objects are classified. Furthermore, this bipartition approach may be substituted by plural degrees of repeatability to enhance accuracy.
As will be explained below, in a weighting operation within the user device 3, non-repeatable objects are assigned a lower weight while repeatable objects are assigned a higher weight.
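The classification of FIG. 4 can be pictured as a small attribute-assignment step. The sketch below is illustrative only: the mapping from labels to attributes is one possible designer choice built from the examples above, and the function and field names are invented.

```python
from typing import Any, Dict

# One possible designer-chosen mapping, following the examples in the text.
USER_INDEPENDENT_LABELS = {"weather", "bus schedule", "fire alarm", "car accident"}
URGENT_LABELS = {"fire alarm", "car accident"}
NON_REPEATABLE_LABELS = {"restaurant", "coffee shop"}   # e.g. shops stay repeatable

def classify(obj: Dict[str, Any]) -> Dict[str, Any]:
    """Attach the attributes of FIG. 4 to a raw virtual object record."""
    if obj["label"] in USER_INDEPENDENT_LABELS:
        obj["user_dependent"] = False
        obj["urgent"] = obj["label"] in URGENT_LABELS           # urgent vs. regular
    else:
        obj["user_dependent"] = True
        obj["repeatable"] = obj["label"] not in NON_REPEATABLE_LABELS
    return obj

# Starbucks (coffee shop) becomes user-dependent and non-repeatable;
# a nearby fire alarm becomes user-independent and urgent.
print(classify({"name": "Starbucks", "label": "coffee shop", "location": (50.11, 8.68)}))
print(classify({"name": "Alarm #7", "label": "fire alarm", "location": (50.11, 8.69)}))
```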
PREPARATORY PROCEDURE
FIG. 5 illustrates exemplary preparatory steps for the information recommendation system.
Step 501: Collect virtual object information
For example, virtual object information can be retrieved, transformed and/or aggregated from publicly available and/or commercial databases.
Virtual object information is information which can be displayed on the screen of the user device 3 and may include, but is not limited to, name, time/date, geographical location (e.g. address), and popularity. As explained above, each virtual object is associated with a geographical location. This geographical location may be identical to the one used for display, but is preferably different from it, e.g. coordinates that allow finely adjusted recommendation information to be provided.
Step 502: Classify virtual object information
More specifically, assigning one or more attributes to each virtual object is performed using the aforementioned exemplary classification method. The assigning operation may be performed automatically and/or manually. Preferably, urgent objects are automatically assigned an attribute of urgency immediately upon collection of urgent information (e.g. car accident). Other attributes of virtual objects can be predetermined by considering various factors, such as time/date, geographical locations, related moods, popularity, specific product information or advertisement information, etc. Examples of attributes are popularity (typically in the form of a numeric value), and keyword (e.g. moods, promotional words, related items etc. which could help to provide useful recommendation results). For example, a keyword "happy" may be assigned to an amusement park.
Step 503: Store virtual object information in the cloud
More specifically, virtual objects are stored in the remote storage 4 (database), which resides in the cloud 2. After the storing step is performed, services will be enabled to allow external connections to retrieve data from within.
The size and scope of the virtual object information database can be predetermined based on the customization need (system designer's need).
The preceding description of classification has, for the purpose of explanation, been illustrated with examples. The classification constitutes the foundation (preparation) of the proposed innovation. The remaining sections discuss the construction of the user device and the steps performed in the information recommendation system in detail.
USAGE PROCEDURE
Before explaining the construction of the user device and its operation in detail, reference is made to FIG. 6 illustrating exemplary usage steps for the recommendation system in terms of how user-dependent data is processed, for the purpose of facilitating the understanding of the present invention.
Step 601: Obtain user's information
For example, user's information may include, but is not limited to: user's basic information (age, gender, birthday, height, weight, etc.), user's current mood (happy, sad, melancholy, etc.), user's previous check-in history, and user's social network information (preferences from user's friends (e.g. the user might like what his or her friends like) etc.). User information such as user's basic information, previous check-in history, and social network information can be obtained, for example, by allowing the user to input such information into the user device 3. The previous check-in history may be automatically updated by detecting that the user has checked in at an actual object (e.g. restaurant, hotel). The geographical location of the user is also obtained. The current time may be obtained.
Step 602: Receive virtual object information ("second virtual data") within a range of the current geographical location from the cloud 2, and store the virtual object information in the local storage of the user device 3
The range used when the cloud 2 retrieves user-location-dependent virtual object information can be predetermined by estimating the user's activity scope within the geographical location, the user's connectivity quality, etc., to reduce the volume of data transmission. For example, if the user's past activity indicates inactivity, a smaller range can be applied, and less information will be transmitted. In this way, performance can be improved without sacrificing the user's experience.
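One way such an adaptive range could be chosen is sketched below. The speed and bandwidth thresholds are invented for illustration; in practice they would be tuned by the system designer.

```python
def retrieval_range_km(avg_speed_mps: float, bandwidth_mbps: float) -> float:
    """Pick the radius the cloud uses when selecting second virtual data.

    Heuristic sketch: slow-moving users and users on poor links get a smaller
    radius, so less data is transmitted per update.
    """
    base_km = 0.3 if avg_speed_mps < 0.5 else 1.0 if avg_speed_mps < 2.0 else 3.0
    if bandwidth_mbps < 1.0:              # poor connectivity: halve the radius
        base_km *= 0.5
    return base_km

# A user standing still on a weak link gets 0.15 km; a user in a car gets 3 km.
print(retrieval_range_km(0.1, 0.8), retrieval_range_km(10.0, 50.0))
```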
Step 603: Filter, weight and rank the user-location-dependent virtual object information, according to user's information
More specifically, the virtual object information categorized as user-dependent is retrieved from the local storage, and then subjected to filtering, weighting and ranking operations. Various filtering, weighting, and ranking techniques can be applied. For example, sensory inputs such as eyeball focus, heartbeat rate, pulse rate, walking speed, etc. may be used in the filtering and/or weighting operations. Additionally, current time can be used to predict user's behaviour in a day (just woke up, about to have a dinner, etc.), and be used to filter and/or weight virtual object information. Further, when multiple pieces of virtual object information are found simultaneously, they are sorted (ranked) based on the weighting result.
Step 604: Display the filtered/weighted/ranked virtual object information according to display policies in terms of position and/or style.
USER DEVICE
FIG. 7 illustrates an information recommendation apparatus (user device) according to an embodiment of the present invention. As shown, the user device 3 comprises a data preparation module 701, a user-independent data module 702, a user-dependent data module 703 and a display module 704. Apart from these modules, hardware components such as a local storage 705, a communication unit 706, a GPS 707, a user-facing camera (that will be explained later) 708, and a mood detecting unit (that will be explained later) 709 are included. Other components (e.g. those used for AR applications, such as a camera, a video camera for video see-through, or a half-mirror for optical see-through) may also be included.
The four modules 701 to 704 are interconnected to form the recommendation system.
DATA PREPARATION MODULE
The data preparation module 701 is in charge of preparing data for the user device 3. In detail, this module keeps track of location changes and updates local virtual object information. Given the limited storage space of the user device 3 and the vast volume of virtual object information, the remote storage 4 (remote data store) at the cloud side is needed, and the local data (local data store) will be updated in real time. Preferably, the trigger event for this module is a location change. Consider the scenario where a user is walking. When the user is moving to a new location, nearby virtual objects will be loaded.
More specifically, the data preparation module 701 is adapted to receive the user-location-dependent data via the communication unit 706 from the cloud 2 such that it is stored in the local storage 705 as local virtual data.
Referring to FIG. 8, the data preparation module 701 comprises a location change detector 801 for detecting a change of the geographical location of the user device 3, for example on the basis of an input from the GPS 707. As will be appreciated by the skilled person, GPS is one example for providing location information, and other techniques may be used.
Geographical location may be represented by coordinates. For example, the coordinate system for the geographical location may be any appropriate coordinate system, including but not limited to latitude and longitude coordinates or the UTM (Universal Transverse Mercator Grid System) coordinate system and so on. The definition of "change" may be customized (i.e. determined by the system designer) inside the location change detector 801. For example, it is determined that a location change has occurred when the user (user device) has moved more than a predetermined distance (e.g. several meters). In one embodiment, the definition of "change" may be adjusted through the user device 3.
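A possible implementation of the location change detector 801 along these lines is sketched below; the haversine distance and the 5-metre default threshold are assumptions rather than values from the patent.

```python
import math
from typing import Optional, Tuple

Location = Tuple[float, float]    # (latitude, longitude) in degrees

def haversine_m(a: Location, b: Location) -> float:
    """Great-circle distance in metres between two latitude/longitude pairs."""
    lat1, lon1, lat2, lon2 = map(math.radians, (a[0], a[1], b[0], b[1]))
    h = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371000.0 * math.asin(math.sqrt(h))

class LocationChangeDetector:
    """Reports a 'change' once the device has moved more than a threshold."""
    def __init__(self, threshold_m: float = 5.0):   # "several meters", adjustable
        self.threshold_m = threshold_m
        self._last: Optional[Location] = None

    def update(self, current: Location) -> bool:
        moved = (self._last is None
                 or haversine_m(self._last, current) > self.threshold_m)
        if moved:
            self._last = current     # becomes the reference for the next check
        return moved                 # True would trigger a request to the cloud

detector = LocationChangeDetector(threshold_m=5.0)
print(detector.update((50.1100, 8.68000)))   # True: first fix always counts
print(detector.update((50.1100, 8.68005)))   # False: only ~3.6 m from the last fix
print(detector.update((50.1101, 8.68005)))   # True: ~12 m from the last accepted fix
```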
The data preparation module 701 further comprises a request generator 802 for generating a request including information on a new geographical location of the user device 3 to the cloud 2 for data update. The request will be transmitted via the communication unit 706. Communication between the user device 3 and the cloud 2 may be performed via WLAN, 3G, 4G, and similar wireless and/or mobile networks.
The data preparation module 701 additionally comprises an updating unit 803 for updating, by (re)writing, the local virtual data in the local storage 705 in response to newly received user-location-dependent virtual data from the cloud 2. The virtual object stored in the local storage 705 maintains its attribute(s) (at least the attribute of user-independent or user-dependent).
In one embodiment, if any error occurs during transmission, the data preparation module 701 may send a message to the display module 704, so that a warning is shown.
As will be explained below, it is preferable that the user-independent data module 702 is invoked by the data preparation module 701 only after the transmission of data is completed (i.e. only after the data preparation module 701 finishes its operation).
USER-INDEPENDENT DATA MODULE
The user-independent data module 702 functions to retrieve user-independent data that is within a range of user's current geographical location from the local data store, which is already prepared by the data preparation module 701.
More specifically, the user-independent data module 702 is adapted to retrieve user-independent data from the local storage 705, by referring to the attributes of objects of which local virtual data are stored.
Referring to FIG. 9, the user-independent data module 702 is provided with an urgency attribute detector 901, which determines what virtual object information should be displayed immediately, by referring to the attribute as to whether the object is urgent or not. The user-independent data module 702 is further provided with an eyeball focus detector 902 and an eyeball focus filter 903.
More specifically, the urgency attribute detector 901 functions to decide which information should be passed to the display module 704 immediately for presentation. Among common user-independent data such as weather, bus schedules, fire alarms, etc., priorities may be given to a customized set of virtual objects. Preferably, the urgency attribute detector 901 can be implemented to select urgent virtual object information, such as fire alarms or car accidents nearby, for display.
In one embodiment, the user-facing camera 708 may be used to obtain eyeball focus, i.e. to record movements of the eyeballs as the viewer looks at different objects and to measure the rotation of the eyes with respect to the measuring system. The user-facing camera 708 is a camera which is arranged opposite to the face or eye. For example, such a camera can be installed in a smart glass. The eyeball focus detector 902 may then be used to determine what the user is looking at. The eyeball focus filter 903 may then be applied to urgent virtual objects and/or non-urgent objects (preferably to non-urgent objects only, since an urgent virtual object is often important information, such as a car accident, which would need the user's immediate attention) to ensure that only information that is related to the user's eyeball focus (e.g. a bus schedule if the user is looking at the bus schedule) is displayed. The eyeball focus detector 902 may receive input from a timer (not shown), so that an object which is being focused on for more than a predetermined period of time (e.g. a few seconds) may be subjected to retrieving processing of its corresponding virtual object.
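The dwell-time rule described for the eyeball focus detector 902 could look as follows. The gaze input (an object identifier per observation) and the two-second threshold are placeholders, not details from the patent.

```python
import time
from typing import Optional

class EyeballFocusDetector:
    """Reports the focused object only after it has been fixated long enough."""
    def __init__(self, dwell_seconds: float = 2.0):   # "a few seconds"
        self.dwell_seconds = dwell_seconds
        self._current: Optional[str] = None
        self._since = 0.0

    def observe(self, object_id: Optional[str],
                now: Optional[float] = None) -> Optional[str]:
        """Feed the object currently under the user's gaze (None if nothing).

        Returns the object id once it has been focused for dwell_seconds or
        longer, which is the signal used to retrieve its virtual object.
        """
        now = time.monotonic() if now is None else now
        if object_id != self._current:       # gaze moved to a different object
            self._current, self._since = object_id, now
            return None
        if object_id is not None and now - self._since >= self.dwell_seconds:
            return object_id
        return None

det = EyeballFocusDetector(dwell_seconds=2.0)
det.observe("bus_schedule_12", now=0.0)
print(det.observe("bus_schedule_12", now=2.5))   # "bus_schedule_12": dwell reached
```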
As will be explained below, it is preferable that the user-dependent data module 703 is invoked only after the user-independent data module 702 finishes its operation.
USER-DEPENDENT DATA MODULE
The user-dependent data module 703 functions to retrieve user-dependent data that is within a range of user's current geographical location from the local data store, which is already prepared by the data preparation module 701, and then perform filtering, weighting and ranking operations on the retrieved user-dependent data.
More specifically, the user-dependent data module 703 functions to i) retrieve user-dependent data from the local storage 705, by referring to the attributes of objects of which local virtual data are stored, and ii) filter, weight, and rank the retrieved user-dependent data.
Referring to FIG. 10, the user-dependent data module 703 comprises a filtering unit 1001, a weighting unit 1002, a ranking unit 1003, and a repeatability attribute detector 1004. The filtering unit 1001 functions to apply filters to ignore retrieved items (as user-dependent data) that do not match one or more predetermined criteria. Preferably, filter criteria can be mood, status, and/or the physical distance between an object and the user. Mood can be happy, sad, relaxed, cheerful, depressed, etc. Status can be marital status (single, married, divorced etc.), social status (position, number of friends etc.), online status (online, offline, away etc.), etc. For example, if a user's mood is happy, all virtual object information that has an attribute of a negative mood will be ignored. For this purpose (and for the subsequent weighting operation to work), when virtual object information is stored in the remote storage 4 of the cloud 2, various attributes (e.g. in the form of keywords, as is shown in the table of FIG. 3 stored in the remote storage 4) will be added to enable the filtering operation at the user device 3. In order to detect the user's mood, the user-dependent data module 703 will obtain information from the mood detecting unit 709.
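A minimal sketch of such a filtering step is given below, assuming mood and status constraints are stored as keyword lists on each item and that a fixed distance threshold is applied; all field names and threshold values are illustrative.

```python
from typing import Any, Dict, List

def filter_items(items: List[Dict[str, Any]], user: Dict[str, Any],
                 max_distance_m: float = 500.0) -> List[Dict[str, Any]]:
    """Drop user-dependent items that do not match mood, status or distance.

    An item is kept when (a) it carries no mood keywords conflicting with the
    detected mood, (b) its status keywords, if any, include the user's status,
    and (c) it lies within max_distance_m of the user.
    """
    kept = []
    for item in items:
        moods = item.get("moods")              # e.g. ["happy", "cheerful"]
        if moods and user["mood"] not in moods:
            continue
        statuses = item.get("statuses")        # e.g. ["married"]
        if statuses and user["marital_status"] not in statuses:
            continue
        if item["distance_m"] > max_distance_m:
            continue
        kept.append(item)
    return kept

items = [
    {"name": "Fun Park", "moods": ["happy"], "distance_m": 300.0},
    {"name": "Kids' Wear", "statuses": ["married"], "distance_m": 120.0},
    {"name": "Noodle Bar", "distance_m": 900.0},
]
user = {"mood": "happy", "marital_status": "single"}
print([i["name"] for i in filter_items(items, user)])   # only 'Fun Park' survives
```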
The mood detecting unit 709 functions to detect or predict the user's mood on the basis of one or more inputs, or any combination thereof. For example, heartbeat rate or pulse rate can be used to predict the user's mood. The mood detecting unit 709 may include an image analysing algorithm to detect a facial expression on the basis of an input from the user-facing camera 708 in order to predict the user's mood.
In one embodiment, filter criteria which are configured by the system designer as default may be adjusted through the user device.
The weighting unit 1002 functions to assign weights to the filtered items based on current or history user data that indicates or predicts user's preference. Weighting represents the process of assigning a numeric value to each virtual object to indicate or predict user's preference.
Preferably, weights can be a function of a number of parameters: type or category of the object inside user's eyeball focus, user's short-term preference, user's long-term preference, user's social network preference, physical distance and/or repeat factor, etc. Any other parameter(s) that indicates or predict user's preference may be further included. History user data such as user's short-term preference, user's long-term preference, user's social network preference, and repeat factor are stored in the user device 3.
More specifically, a user will likely look at items that he is interested in, so the user's eyeball focus indicates a preference. For this purpose, the user-dependent data module 703 will obtain information from the eyeball focus detector 902 within the user-independent data module 702. As will be appreciated by the skilled person, the eyeball focus detector 902 may not necessarily be located within the user-dependent data module 703. A virtual object corresponding to an actual object inside the user's eyeball focus will receive a higher weighting. Preferably, virtual objects belonging to an identical or similar type or category (i.e. label) to the object inside the user's eyeball focus may also receive a higher weighting. The eyeball focus detector 902 may receive input from a timer (not shown), so that an object which is being focused on for more than a predetermined period of time (e.g. a few seconds) may be subjected to weighting processing of its corresponding virtual object and/or a virtual object of an identical or similar type or category.
User's short-term preference indicates short-term activity that a user is conducting within a short time period (e.g. within one to a few hours). For example, shopping may be used to indicate short-term preference if it is detected by means of the eyeball focus detector 902 that the user focuses on shops (the user may have a buy list and is likely to focus on shops selling things on the list). In this case, high weight is assigned to objects which have an attribute (keyword) "shopping". Likewise, if it is detected that a user is constantly looking at supermarkets, relative information may be used. Short-term preference is in general current user data for indicating or predicting user's preference, but can be history user data.
User's long-term preference indicates long-term activity that a user is conducting within a long time period (e.g. within one week, within one month, within one year). When a user visits a place, the information on this place (e.g. type, category) may be added to a (not shown) history check-in database (which is typically provided in the user device 3 itself). User's long-term preference can be calculated by aggregating check-in data in the history check-in database.
User's social network preference indicates preference obtained by aggregating the user's social network, such as friend relationships. For this purpose, information may be input by the user. Social network website APIs can be used to gather such information and provide it to the user device. Alternatively, social network analysis software such as Netvizz (https://github.com/bernorieder/netvizz) may be used as well.
Physical distance indicates a distance between the user (user device 3) and the object. For example, a higher weight is assigned to an object as the object is closer to the user. Note that the usage of "physical distance" in weighting is different from that in filtering. To reduce the frequency of data update, the data preparation module 701 may retrieve objects within a large radius around the user at once, from which the filtering unit 1001 uses a physical distance threshold such that only objects within this threshold will be included. Thus, when the user moves a small distance, the filtering unit 1001 can include new objects without having to trigger the data preparation module 701 to update objects introduced by this small location change, since they are already cached in the local storage 705. On the contrary, "physical distance" is used in weighting as a factor to differentiate objects that are not filtered out by the previous step.
The repeat factor indicates a user's willingness to revisit a category of non-repeatable objects after he/she has already checked in at this category of non-repeatable objects a short time ago. Note that repeatability is an attribute of virtual objects that is determined by the system designer, while the repeat factor reflects the user's preference for non-repeatable objects. In the user-dependent data module 703, the repeatability attribute detector 1004 functions to exclude repeatable objects, so only non-repeatable objects will be considered for the repeat factor. Each user is initially set to have the same repeat factor for a category of non-repeatable objects. Over time, the repeat factor will be gradually updated according to each user's behaviour. For example, every user will initially have the same repeat factor for the category of restaurants (which is a non-repeatable object, since people tend not to go to restaurants again shortly after they just finish a meal). Over time, if user A takes more meals per day than user B, user A will have a more frequent pattern of checking in at restaurants; so, even for the same non-repeatable category of objects (restaurants in this example), user A has a higher repeat factor that reflects his/her higher willingness to revisit restaurants, and results from the restaurant category therefore receive a higher weighting.
In an embodiment where the "repeat factor" is used, the history user data will include the user's check-in history for each object, which is stored in the user device 3. The repeat factor for the object is defined and updated by using the user's check-in history of the object. The weighting unit 1002 is adapted to set a lower weight for virtual object information that has not been filtered out if the repeatability attribute detector 1004 detects that the object is categorized as non-repeatable, and to set a higher weight for such information if the repeatability attribute detector 1004 detects that the object is categorized as repeatable. The weighting unit 1002 is further adapted to increase the lower weight for a non-repeatable object as the corresponding user's repeat factor increases as a result of the update, as explained above using the example of users A and B.
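One way the repeat factor could be derived from check-in history and folded into the weight of non-repeatable objects is sketched below; the initial value, the normalisation and the smoothing constant are invented for illustration.

```python
from collections import defaultdict
from typing import Dict

class RepeatFactorTracker:
    """Per-category willingness to revisit non-repeatable objects.

    Every user starts from the same default; the factor then drifts towards
    the user's own check-in frequency for the category.
    """
    def __init__(self, initial: float = 0.2, smoothing: float = 0.3):
        self.smoothing = smoothing
        self.factor: Dict[str, float] = defaultdict(lambda: initial)

    def update(self, category: str, checkins_last_7_days: int) -> None:
        # Normalise against roughly three check-ins per day over a week.
        observed = min(1.0, checkins_last_7_days / 21.0)
        old = self.factor[category]
        self.factor[category] = (1 - self.smoothing) * old + self.smoothing * observed

    def weight(self, category: str, base_weight: float = 1.0) -> float:
        # Non-repeatable objects start from a reduced weight, then recover as
        # the user's repeat factor for that category grows.
        return base_weight * (0.5 + 0.5 * self.factor[category])

user_a, user_b = RepeatFactorTracker(), RepeatFactorTracker()
user_a.update("restaurant", checkins_last_7_days=21)   # eats out three times a day
user_b.update("restaurant", checkins_last_7_days=7)    # eats out once a day
print(user_a.weight("restaurant") > user_b.weight("restaurant"))   # True
```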
By tuning the different input parameters (such as the type or category of the object inside the user's eyeball focus, user's short-term preference, user's long-term preference, user's social network preference, physical distance and/or repeat factor), the system designer can change the weight given to a virtual object. In a preferred embodiment, the weighting function can be designed as a linear combination of these parameters.
The ranking unit 1003 functions to perform a ranking operation based on the weighting result when there are multiple items, as user-dependent data, found simultaneously.
More specifically, when there are multiple virtual objects, they can be ranked using any appropriate ranking strategy. A simple strategy may be that a virtual object assigned a higher weight receives a higher ranking. Another strategy may be that only the top-N results with the highest weights are ranked as inputs that are subsequently supplied to the display module 704.
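A sketch of such a weighting-and-ranking step, using the linear combination mentioned above and a top-N cut-off, is shown below; the coefficients and signal names are illustrative assumptions, not values from the patent.

```python
from typing import Any, Dict, List

# Illustrative coefficients for the linear combination; a real deployment
# would be tuned by the system designer.
COEFFS = {"eyeball_focus": 0.35, "short_term": 0.25, "long_term": 0.15,
          "social": 0.10, "proximity": 0.10, "repeat_factor": 0.05}

def weight(item: Dict[str, Any]) -> float:
    """Weight = linear combination of normalised preference signals in [0, 1]."""
    return sum(coeff * item.get(signal, 0.0) for signal, coeff in COEFFS.items())

def rank_top_n(items: List[Dict[str, Any]], n: int = 5) -> List[Dict[str, Any]]:
    """Keep only the N highest-weighted items for the display module."""
    return sorted(items, key=weight, reverse=True)[:n]

candidates = [
    {"name": "Puma store", "eyeball_focus": 1.0, "short_term": 0.8, "proximity": 0.6},
    {"name": "Noodle Bar", "long_term": 0.9, "proximity": 0.9, "repeat_factor": 0.2},
    {"name": "Bookshop", "social": 0.7, "proximity": 0.3},
]
print([c["name"] for c in rank_top_n(candidates, n=2)])   # ['Puma store', 'Noodle Bar']
```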
As will be explained below, it is preferred that the display module 704 is invoked only after the user-dependent data module 703 finishes its operation.
DISPLAY MODULE
The display module 704 functions to display virtual data objects upon receiving the user-independent data retrieved from the user-independent data module 702 and the filtered, weighted, and ranked user-dependent data from the user-dependent data module 703.
Referring to FIG. 11, the display module 704 comprises an input differentiation unit 1101 (as display control means) for applying predefined policies to display virtual data objects in terms of position and/or style on the screen 1102. The input differentiation unit 1101 is further configured to rank data when there are multiple items, as user-independent data and/or user-dependent data, available for presentation at the same time. The "style" may include, but is not limited to, size, color and/or position on the screen.
Preferably, an urgent event such as a fire alarm is prioritised on the screen, while other virtual objects are displayed at their corresponding positions. In other words, if the virtual object belongs to the "urgent" category, the user device 3 is expected to prioritize this information and display the item in a way that attracts a higher degree of the user's attention. Such rules are referred to as "display policies". Further, different display policies can be imposed on different kinds of virtual object information. For example, a display policy may place urgent items in the center of the screen, while another display policy may use a round-robin approach to display multiple items in order to utilize the limited screen space. Further, the maximum number of recommendation information items presented on the screen 1102 simultaneously can be preconfigured to a value, depending on the size of the screen and the size of the virtual object information visualization.
When there are multiple items (as user-independent data and/or user-dependent data) available for presentation at the same time, the data for display will be ranked in view of the limited screen space. Various ranking strategies may be deployed. For example, weighted round robin may be used in display. In performing the display ranking operation, the input differentiation unit 1101 is adapted to refer to the ranking result of the ranking unit 1003. That is, this ranking result is used as one criterion for display ranking.
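One possible form of such display policies, with urgent items pinned to the centre of the screen and the remaining slots filled round-robin from the ranked items, is sketched below; the slot count and the position/style labels are invented.

```python
from itertools import cycle, islice
from typing import Any, Dict, List

MAX_SLOTS = 4   # illustrative cap on simultaneously displayed recommendations

def layout(urgent: List[Dict[str, Any]], ranked_regular: List[Dict[str, Any]],
           tick: int = 0) -> List[Dict[str, Any]]:
    """Apply simple display policies.

    Urgent items always occupy the centre; the remaining slots are filled
    round-robin from the ranked items so that, over successive refreshes
    ('tick'), different items get screen time.
    """
    placed = [dict(obj, position="center", style="highlight") for obj in urgent]
    free = max(0, MAX_SLOTS - len(placed))
    if ranked_regular and free:
        start = tick % len(ranked_regular)
        rotation = islice(cycle(ranked_regular), start, start + free)
        placed += [dict(obj, position="side", style="normal") for obj in rotation]
    return placed

urgent = [{"name": "Fire alarm, Building B"}]
regular = [{"name": "Bus 42 in 3 min"}, {"name": "Puma store"}, {"name": "Noodle Bar"}]
for frame in (0, 1):   # the side slots rotate between refreshes
    print([(o["name"], o["position"]) for o in layout(urgent, regular, tick=frame)])
```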
Urgent objects may be presented additionally by means of audio/sound means (not shown) to receive user's immediate attention.
Preferably, each of the four basic modules 701 to 704 may adopt an event-driven approach for implementation. The event-driven approach is a widely adopted technique in computer science. Under the event-driven approach, the flow of the program is determined by events such as user actions (hand gestures), sensor input (location changes), etc. For example, the data preparation module 701 can be automatically triggered when the location of the user device changes, without the user's direct interference. Using this technique, different events may trigger a single module directly, or multiple modules simultaneously.
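A minimal event dispatcher of this kind might look as follows; the event names and the lambda handlers are hypothetical stand-ins for the module entry points.

```python
from collections import defaultdict
from typing import Any, Callable, DefaultDict, List

class EventBus:
    """Tiny publish/subscribe dispatcher; one event may trigger several modules."""
    def __init__(self) -> None:
        self._handlers: DefaultDict[str, List[Callable[[Any], None]]] = defaultdict(list)

    def subscribe(self, event: str, handler: Callable[[Any], None]) -> None:
        self._handlers[event].append(handler)

    def publish(self, event: str, payload: Any = None) -> None:
        for handler in self._handlers[event]:
            handler(payload)

bus = EventBus()
bus.subscribe("location_changed", lambda loc: print("data preparation module:", loc))
bus.subscribe("location_changed", lambda loc: print("user-independent module refresh"))
bus.subscribe("eyeball_focus_changed", lambda obj: print("user-dependent module:", obj))

bus.publish("location_changed", (50.11, 8.68))        # sensor input (GPS)
bus.publish("eyeball_focus_changed", "bus_schedule")  # gaze sensor input
```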
However, urgent objects (such as fire alarm) may be supplied from the cloud 2 to the user device 3 for display even if the data preparation module 701 does not detect a change of the geographical location of the user device 3.
OPERATION
Referring further to FIGS. 12 to 15, the operation of the information recommendation system will be explained.
DATA PREPARATION OPERATION
As shown in FIG. 12, the operation performed by the data preparation module 701 may include the following steps:
Step 1201: Detect change of geographical location on the basis of an input from the GPS 707
Step 1202: Contact the cloud 2, and provide current geographical location
More specifically, a request which includes the current location of the user is generated by the request generator 802 and sent via the communication unit 706 to the cloud 2.
Step 1203: The cloud 2 handles the request, retrieves and sends back all virtual object information within a range of the specified geographical location
Preferably, the cloud 2 may have content control within itself. In this context, it is preferable that the cloud 2 can adjust the range so as to reduce the frequency of data exchange while maintaining the user experience.
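One conceivable heuristic for such a range adjustment, given only as a hedged sketch with arbitrary bounds and a hypothetical target request rate, is the following:

def adjust_range(base_km, requests_last_hour, target_per_hour=6):
    # Widen the search radius when the device requests updates more often
    # than a target rate (fewer round trips needed), shrink it when requests
    # are rare; the radius is clamped to illustrative bounds.
    if requests_last_hour <= 0:
        return base_km
    ratio = requests_last_hour / float(target_per_hour)
    return min(10.0, max(0.5, base_km * ratio))

print(adjust_range(2.0, 12))  # frequent requests -> radius grows to 4.0 km
print(adjust_range(2.0, 3))   # rare requests -> radius shrinks to 1.0 km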
Step 1204: Store received data in the local storage 705
More specifically, upon receiving the data, the updating unit 803 updates the data in the local storage 705.
If any error occurs during transmission, the data preparation module 701 may send a message to the display module 704, so that a warning is shown.
After the transmission of data is completed, the data preparation module 701 invokes the user-independent data module 702.
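Purely as an illustrative sketch of steps 1201 to 1204 (the cloud interface, data layout and storage calls below are assumptions, not the actual interfaces of the system), the data preparation flow could look roughly like this:

import math

def distance_km(a, b):
    # Coarse equirectangular distance between two (lat, lon) points.
    lat1, lon1, lat2, lon2 = map(math.radians, (*a, *b))
    x = (lon2 - lon1) * math.cos((lat1 + lat2) / 2)
    return 6371.0 * math.hypot(x, lat2 - lat1)

class DataPreparationModule:
    def __init__(self, cloud, local_storage, range_km=2.0):
        self.cloud = cloud              # assumed to expose query(location, range_km)
        self.local_storage = local_storage
        self.range_km = range_km
        self.last_location = None

    def on_location_changed(self, location):
        # Step 1201: ignore negligible movement.
        if self.last_location and distance_km(location, self.last_location) < 0.05:
            return {"updated": 0}
        self.last_location = location
        try:
            # Steps 1202-1203: send the current location, receive nearby objects.
            objects = self.cloud.query(location, self.range_km)
        except IOError as err:
            return {"error": str(err)}  # would be forwarded to the display module
        # Step 1204: refresh the local storage.
        self.local_storage["virtual_objects"] = objects
        return {"updated": len(objects)}

class FakeCloud:                        # minimal stand-in for demonstration only
    def query(self, location, range_km):
        return [{"name": "Refill Station", "user_dependent": True}]

store = {}
module = DataPreparationModule(FakeCloud(), store)
print(module.on_location_changed((22.30, 114.17)))  # {'updated': 1}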
USER-INDEPENDENT DATA ACQUISITION
As shown in FIG. 13, the operation performed by the user-independent data module 702 may include the following steps:
Step 1301: Retrieve user-independent data
More specifically, user-independent data that is within a range of the user's current geographical location is retrieved from the local data store, which has already been prepared by the data preparation module 701.
Step 1302: Determine what virtual object information should be displayed immediately
More specifically, the urgency attribute detector 901 decides which information should be passed to the display module 704 immediately for presentation.
Step 1303: Pass the virtual object information to the display module 704
More specifically, after checking urgency, the virtual object information is passed to the display module 704 in such a manner that an urgent object is displayed immediately, while a non-urgent object is subjected to the display ranking step performed by the display module 704.
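A minimal sketch of steps 1301 to 1303, under assumed item fields ("user_dependent", "urgent") and assumed display callbacks, might look as follows:

def acquire_user_independent(local_storage, display_immediately, display_ranked):
    # Step 1301: pull the user-independent items from the local store.
    items = [obj for obj in local_storage.get("virtual_objects", [])
             if not obj.get("user_dependent", False)]
    # Steps 1302-1303: route each item by urgency.
    for obj in items:
        if obj.get("urgent", False):
            display_immediately(obj)    # e.g. fire alarm: bypasses display ranking
        else:
            display_ranked(obj)         # e.g. weather info: subject to display ranking

store = {"virtual_objects": [
    {"name": "Fire alarm", "urgent": True},
    {"name": "Weather info", "urgent": False},
    {"name": "Refill Station", "user_dependent": True},
]}
acquire_user_independent(store,
                         lambda o: print("URGENT:", o["name"]),
                         lambda o: print("queued:", o["name"]))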
After the user-independent data module 702 finishes its operation, it will invoke the user-dependent data module 703.
USER-DEPENDENT DATA ACQUISITION
FIG. 14 illustrates the operation by the user-dependent data module 703. The trigger event for this module may be an eyeball focus change or a change in user-dependent criteria such as mood, status, etc. As shown, the operation may include the following steps:
Step 1401: Retrieve user-dependent data
More specifically, the user-dependent data will be retrieved from the local data store, which is already prepared by the data preparation module 701.
Step 1402: Filter data
More specifically, the filtering unit 1001 applies filters to ignore retrieved items (as user-dependent data) that do not match one or more predetermined criteria such as mood, status, and/or the physical distance between an object and the user. In one embodiment, the user-dependent data module 703 obtains information from the mood detecting unit 709 in order to filter out retrieved items that do not match the detected or predicted user's mood. For example, information on an amusement park is not presented to a user in a sad mood. In one embodiment, the user-dependent data module 703 filters out retrieved items that do not match the marital status stored in the user device 3. For example, information on kids' clothes stores is not presented to a user who is single.
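By way of a non-limiting example, a filtering step of this kind could be sketched as follows; the field names ("mood", "audience", "distance_km") and the distance threshold are assumptions made for the illustration.

def filter_items(items, user_mood, marital_status, max_distance_km=1.0):
    # Drop retrieved user-dependent items that do not match the user's mood,
    # marital status or a maximum physical distance.
    kept = []
    for item in items:
        if item.get("mood") and item["mood"] != user_mood:
            continue                    # e.g. amusement park vs. a user in a sad mood
        if item.get("audience") == "parents" and marital_status == "single":
            continue                    # e.g. kids' clothes store vs. a single user
        if item.get("distance_km", 0.0) > max_distance_km:
            continue
        kept.append(item)
    return kept

items = [
    {"name": "Amusement park", "mood": "happy", "distance_km": 0.4},
    {"name": "Kids' clothes store", "audience": "parents", "distance_km": 0.2},
    {"name": "Coffee shop", "distance_km": 0.3},
]
print([i["name"] for i in filter_items(items, user_mood="sad", marital_status="single")])
# -> ['Coffee shop']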
Step 1403: Weight data
More specifically, the weighting unit 1002 performs a weighting operation on the filtered items (i.e. not-filtered-out items) based on current or history user data that indicates or predicts the user's preference, such as the type or category of the object inside the user's eyeball focus, the user's short-term preference, long-term preference, social network preference, physical distance and/or repeat factor. Current user data such as the type or category of the object inside the user's eyeball focus and the physical distance are obtained on the basis of sensor inputs such as the eyeball focus detector 902 and the GPS 707. On the other hand, history user data such as the user's short-term preference, long-term preference, social network preference, and repeat factor are stored in the user device 3. In an embodiment where "type or category of the object inside user's eyeball focus" is used, the weighting unit 1002 assigns a higher weighting to a virtual object corresponding to an actual object inside the user's eyeball focus. Preferably, virtual objects belonging to an identical or similar type or category (i.e. label) to the object inside the user's eyeball focus may also receive a higher weighting.
In an embodiment where "repeat factor" is used, if the repeatability attribute detector 1004 detects that the object is non-repeatable, the weighting unit 1002 sets lower weight to the corresponding virtual object information, but increases the lower weight for the non-repeatable object as the user's repeat factor increases as a result of the update.
Step 1404: Rank data
The ranking unit 1003 performs a ranking operation based on the weighting result when multiple items, as user-dependent data, are found simultaneously. More specifically, the ranking unit 1003 assigns a higher ranking to virtual object information with a higher weight. The ranking result will be provided to the display module 704 for its display ranking operation.
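In its simplest form the ranking step amounts to sorting by the assigned weights, as in the brief sketch below (the (name, weight) pair layout is carried over from the weighting sketch above and is an assumption of the example):

def rank_items(weighted_items):
    # Higher-weight items rank first.
    return [name for name, weight in
            sorted(weighted_items, key=lambda pair: pair[1], reverse=True)]

print(rank_items([("Refill Station", 1.4), ("Hair Raiser", 1.7), ("Clown Corner", 0.9)]))
# -> ['Hair Raiser', 'Refill Station', 'Clown Corner']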
DISPLAY OPERATION
FIG. 15 illustrates the operation by the display module 704. The trigger event is an input from the user-independent and user-dependent data modules 702, 703, and may additionally be an input from the data preparation module 701. At step 1501, given an input, the input differentiation unit (display control means) 1101 applies a corresponding predefined policy to generate display control data in order to show the virtual object information in a place defined in the display policy, with a style defined in the display policy, using the screen 1102. At step 1502, the input differentiating unit 1101 sends the display control data to a screen driver (not shown) for display.
When there are multiple inputs at the same time, the input differentiating unit 1101 ranks the inputs according to a predetermined rule. In a preferred embodiment, an urgent event such as a fire alarm is prioritised on the screen 1102, while other virtual objects are displayed at their corresponding positions in a non-prioritised manner. When performing the display ranking operation, the input differentiating unit 1101 refers to the ranking result from the ranking unit 1003. In one embodiment, errors which occur within any of the three modules 701 to 703 may be passed to the display module 704 and displayed on the screen 1102 for the user's notification.
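As a hedged illustration of steps 1501 and 1502 combined with such a display policy (the slot names, style labels and item fields are assumptions for the example, not the actual display control data), one could write:

def build_display_plan(inputs, ranking, max_slots=3):
    # Urgent items are placed in the centre of the screen in bold; the
    # remaining items fill less prominent slots in display-ranking order,
    # using the ranking result of the ranking unit as one criterion.
    urgent = [i for i in inputs if i.get("urgent")]
    normal = [i for i in inputs if not i.get("urgent")]
    normal.sort(key=lambda i: ranking.index(i["name"]) if i["name"] in ranking else len(ranking))
    plan = [{"item": i["name"], "slot": "centre", "style": "bold"} for i in urgent]
    for i in normal[:max_slots - len(plan)]:
        plan.append({"item": i["name"], "slot": "edge", "style": "normal"})
    return plan

inputs = [{"name": "Car accident", "urgent": True},
          {"name": "Weather info"}, {"name": "Hair Raiser"}, {"name": "Refill Station"}]
for entry in build_display_plan(inputs, ranking=["Hair Raiser", "Refill Station"]):
    print(entry)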
MOBILE AUGMENTED REALITY APPLICATIONS
The information recommendation system according to the present invention is advantageously applicable to MAR applications (e.g. smart glass). FIG. 16 illustrates an exemplary workflow showing how the virtual object information is displayed in the AR context. The user device 3 monitors real-world objects (e.g. the real view of a restaurant) in the form of an image, video, or direct view (step 1601), gathers virtual object information (e.g. virtual information relating to the restaurant) (step 1602), i.e. inputs from the user-independent data module 702 and/or from the user-dependent data module 703, augments the real-world objects (step 1603), and renders the augmented object information (step 1604). Virtual data are overlaid upon real-world information. For example, virtual object information of a real-world object may be displayed adjacent to or on the real-world object. In one embodiment, a user may set up a filter that augments reality based on his or her context, including location, current activity, date/time, etc. For example, if the user is inside an amusement park at 10 am, which is not a normal meal time, he or she could specify that information as a context on which a filter can be based, so that reality is augmented such that only amusement facilities are displayed.
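A rough, purely illustrative sketch of this workflow and of such a context filter is given below; the context keys, category labels and meal-time hours are assumptions made for the example.

import datetime

def augment(real_objects, virtual_info, context):
    # Steps 1601-1604 in miniature: attach virtual information to recognised
    # real-world objects, keeping only items that pass a simple context filter.
    in_park = context.get("location") == "amusement_park"
    meal_time = context["time"].hour in (12, 13, 18, 19)
    overlays = []
    for obj in real_objects:
        for info in virtual_info.get(obj, []):
            if in_park and not meal_time and info["category"] != "amusement":
                continue                # e.g. hide food outlets at 10 am inside the park
            overlays.append({"anchor": obj, "text": info["text"]})
    return overlays

virtual_info = {"Hair Raiser": [{"category": "amusement", "text": "Hair Raiser - 120 m"}],
                "Snack bar": [{"category": "food", "text": "Snack bar - 80 m"}]}
context = {"location": "amusement_park", "time": datetime.datetime(2015, 7, 22, 10, 0)}
print(augment(["Hair Raiser", "Snack bar"], virtual_info, context))
# -> [{'anchor': 'Hair Raiser', 'text': 'Hair Raiser - 120 m'}]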
FIG. 17 shows an example of how items are displayed on the screen in the AR context. The user is on top of a mountain, looking down towards an amusement park. User-independent virtual objects include the weather info, which is displayed at the top of the screen, and a car accident notice, which is in bold type to attract the user's attention. User-dependent virtual objects, including Hair Raiser, Refill Station, Clown Corner and Rainforest, are displayed with their respective names and relative distances to the user. The upper right corner shows a graphical radar indicating nearby virtual objects, from which one can see that a large number of virtual objects are available nearby. The grey dotted box represents the user's current eyeball focus. Note that, with this combination of filtering, weighting, ranking and/or display policies, even if other user-dependent virtual objects are available nearby as shown by the radar, only those within the user's eyeball focus are displayed. In contrast, an urgent user-independent virtual object, e.g. the car accident, is displayed in bold with an eye-catching shape. Non-urgent user-independent virtual objects, such as the weather info and wind speed, are allocated to a less prominent space. Although a large number of virtual objects are available for display, a user-friendly interface can be presented with a proper combination of display policies, such that it augments the user's reality without interfering with his or her vision excessively. Note that FIG. 17 only serves as one preferred embodiment to illustrate some principles that a specific system designer may want to achieve, so styles and display policies can be customized in other ways.
While the present invention has been described in connection with certain preferred embodiments, it is to be understood that the subject-matter encompassed by the present invention is not limited to those specific embodiments. On the contrary, it is intended to include any alternatives and modifications within the scope of the appended claims.

Claims
1. An information recommendation system comprising a cloud and a user device,
wherein the cloud comprises:
a remote storage for storing first virtual data of objects, each of said objects being associated with its geographical information and also being associated with an attribute indicating whether said object is user-independent or user-dependent;
a selector for selecting, from said first virtual data, second virtual data of object or objects on the basis of a geographical location of the user device; and
a transmitter for transmitting said second virtual data to the user device, and wherein the user device comprises:
a local storage;
a data preparation module for receiving said second virtual data from the cloud such that it is stored in the local storage as local virtual data,
wherein the data preparation module is configured to detect a change of the geographical location of the user device, transmit a request including information on a new geographical location of the user device to the cloud, and update the local virtual data in response to newly received second virtual data from the cloud;
a user-independent data module for retrieving user-independent data from the local storage, by referring to the attributes of objects of which local virtual data are stored;
a user-dependent data module for i) retrieving user-dependent data from the local storage, by referring to the attributes of objects of which local virtual data are stored, and ii) filtering, weighting, and ranking said retrieved user-dependent data, said weighting being performed based on user's preference, said ranking being performed based on said weighting when there are multiple items, as user-dependent data, found simultaneously; and a display module for displaying virtual data objects,
wherein the display module is adapted
to receive said user-independent data retrieved from the user-independent data module and said filtered, weighted, and ranked user-dependent data from the user-dependent data module,
to apply predefined policies to display virtual data objects in terms of position and/or style, and to rank data when there are multiple items, as user-independent data and/or user-dependent data, available for presentation at the same time.
2. The information recommendation system according to claim 1, wherein in the remote storage each object categorized as user-independent is further associated with a second attribute indicating whether said object is urgent or not,
the display module is adapted to immediately display virtual data object categorized as urgent upon receipt from the user-independent data module, and/or rank virtual data object categorized as urgent at the highest.
3. The information recommendation system according to claim 1, wherein in the remote storage each object categorized as user-dependent is further associated with a second attribute indicating whether said object is repeatable or not,
the user-dependent data module is further configured to:
assign weights to retrieved items, as user-dependent data, according to current or history user data that indicates or predicts user's preference, said history user data including user's check-in history of each object, a user's repeat factor to the object being defined and updated by using said user's check-in history of the object,
detect the second attribute of an object corresponding to each retrieved item,
set lower weight to said retrieved item if the object is categorized as non-repeatable, and increase the lower weight for the non-repeatable object as the user's repeat factor increases as a result of the update.
4. The information recommendation system according to claim 1, wherein the user device further comprises means for detecting user's eyeball focus, and
the user-independent data module in the user device is further configured to perform the retrieving operation by referring to the detected user's eyeball focus.
5. The information recommendation system according to claim 1, wherein the user-dependent data module is further configured to:
assign weights to retrieved items, as user-dependent data, according to current or history user data that indicates or predicts user's preference such as type or category of the object inside user's eyeball focus, user's short-term preference, user's long-term preference, user's social network preference, and/or physical distance to the object.
6. The information recommendation system according to claim 1, wherein the user device further comprises means for detecting user's mood, and
the user-dependent data module is adapted to perform the filtering operation on the basis of the detected user's mood.
7. The information recommendation system according to any one of claims 1 to 6, wherein the display module is further configured to:
receive and display notification messages from the data preparation module.
8. A user device used in an information recommendation system according to any one of claims 1 to 7.
PCT/EP2015/066780 2015-07-22 2015-07-22 A system for providing recommendation information for user device WO2017012662A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/EP2015/066780 WO2017012662A1 (en) 2015-07-22 2015-07-22 A system for providing recommendation information for user device
EP15750939.9A EP3207468A1 (en) 2015-07-22 2015-07-22 A system for providing recommendation information for user device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/EP2015/066780 WO2017012662A1 (en) 2015-07-22 2015-07-22 A system for providing recommendation information for user device

Publications (1)

Publication Number Publication Date
WO2017012662A1 true WO2017012662A1 (en) 2017-01-26

Family

ID=53879464

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2015/066780 WO2017012662A1 (en) 2015-07-22 2015-07-22 A system for providing recommendation information for user device

Country Status (2)

Country Link
EP (1) EP3207468A1 (en)
WO (1) WO2017012662A1 (en)

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070005419A1 (en) * 2005-06-30 2007-01-04 Microsoft Corporation Recommending location and services via geospatial collaborative filtering
US20120233158A1 (en) * 2011-03-07 2012-09-13 David Edward Braginsky Automated Location Check-In for Geo-Social Networking System
US20140007010A1 (en) * 2012-06-29 2014-01-02 Nokia Corporation Method and apparatus for determining sensory data associated with a user
US20140337436A1 (en) * 2012-07-23 2014-11-13 Salesforce.Com, Inc. Identifying relevant feed items to display in a feed of an enterprise social networking system
US20150172327A1 (en) * 2012-09-13 2015-06-18 Google Inc. System and method for sharing previously visited locations in a social network
EP2750054A1 (en) * 2012-12-26 2014-07-02 HTC Corporation Content recommendation method

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
TAKUMI TOYAMA ET AL: "Gaze guided object recognition using a head-mounted eye tracker", PROCEEDINGS OF THE SYMPOSIUM ON EYE TRACKING RESEARCH AND APPLICATIONS, ETRA '12, 30 March 2012 (2012-03-30), New York, New York, USA, pages 91, XP055115774, ISBN: 978-1-45-031221-9, DOI: 10.1145/2168556.2168570 *

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107704491A (en) * 2017-08-22 2018-02-16 腾讯科技(深圳)有限公司 Message treatment method and device
CN111095361A (en) * 2017-09-29 2020-05-01 高通股份有限公司 Display of live scenes and auxiliary objects
US11854133B2 (en) 2017-09-29 2023-12-26 Qualcomm Incorporated Display of a live scene and auxiliary object
US11887227B2 (en) 2017-09-29 2024-01-30 Qualcomm Incorporated Display of a live scene and auxiliary object
US11915353B2 (en) 2017-09-29 2024-02-27 Qualcomm Incorporated Display of a live scene and auxiliary object
CN108170795A (en) * 2017-12-28 2018-06-15 百度在线网络技术(北京)有限公司 Information-pushing method, device and equipment
CN108170795B (en) * 2017-12-28 2022-02-15 百度在线网络技术(北京)有限公司 Information pushing method, device and equipment
WO2020076946A1 (en) * 2018-10-09 2020-04-16 Google Llc Selecting augmented reality objects for display based on contextual cues
US11436808B2 (en) 2018-10-09 2022-09-06 Google Llc Selecting augmented reality objects for display based on contextual cues

Also Published As

Publication number Publication date
EP3207468A1 (en) 2017-08-23

Similar Documents

Publication Publication Date Title
US10332172B2 (en) Lead recommendations
KR102379643B1 (en) Data mesh platform
US20200342550A1 (en) Methods and systems for generating restaurant recommendations
US10388070B2 (en) System and method for selecting targets in an augmented reality environment
US9501745B2 (en) Method, system and device for inferring a mobile user's current context and proactively providing assistance
US8494215B2 (en) Augmenting a field of view in connection with vision-tracking
EP3015145A1 (en) Device and method of managing user information based on image
US20210209676A1 (en) Method and system of an augmented/virtual reality platform
KR20210046085A (en) Wearable apparatus and methods for analyzing images
KR20180032508A (en) Systems and methods for improved data integration in augmented reality architectures
US20120209907A1 (en) Providing contextual content based on another user
TWI680400B (en) Device and method of managing user information based on image
US11782933B2 (en) Search result optimization using machine learning models
US11006242B1 (en) Context sensitive presentation of content
US11599935B2 (en) Computer program product, computer implemented method, and system for cognitive item selection with data mining
WO2017012662A1 (en) A system for providing recommendation information for user device
US20170164029A1 (en) Presenting personalized advertisements in a movie theater based on emotion of a viewer
US10810270B2 (en) Web search based on browsing history and emotional state
US20210133851A1 (en) Personalized content based on interest levels
CN105378626A (en) Situation-aware presentation of information
US10679190B1 (en) Context-dependent inferred social network
US20200302500A1 (en) Creating custom objects from a static list of objects and turning the custom objects into trends
US20140244750A1 (en) Intelligent, mobile, location-aware news reader application for commuters
AU2016101802A4 (en) Data mesh platform
CN110598087A (en) Searching method and system based on environmental information and user preference

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 15750939

Country of ref document: EP

Kind code of ref document: A1

REEP Request for entry into the european phase

Ref document number: 2015750939

Country of ref document: EP

NENP Non-entry into the national phase

Ref country code: DE