GB2574809A - Method and apparatus for Verifying Interaction Of A Plurality Of Users

Info

Publication number
GB2574809A
Authority
GB
United Kingdom
Prior art keywords
user
portable electronic
electronic device
data set
users
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
GB201809919A
Other versions
GB201809919D0 (en)
Inventor
Sedgley Leo
Horthy Stewart
Lassalle Dahlan
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Orbit Services Ltd
Original Assignee
Orbit Services Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Orbit Services Ltd filed Critical Orbit Services Ltd
Priority to GB201809919A
Publication of GB201809919D0
Publication of GB2574809A
Legal status: Withdrawn

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04W: WIRELESS COMMUNICATION NETWORKS
    • H04W4/00: Services specially adapted for wireless communication networks; Facilities therefor
    • H04W4/20: Services signalling; Auxiliary data signalling, i.e. transmitting data via a non-traffic channel
    • H04W4/21: Services signalling; Auxiliary data signalling, i.e. transmitting data via a non-traffic channel for social networking applications
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04L: TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L67/00: Network arrangements or protocols for supporting network services or applications
    • H04L67/2866: Architectures; Arrangements
    • H04L67/30: Profiles
    • H04L67/306: User profiles
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04M: TELEPHONIC COMMUNICATION
    • H04M1/00: Substation equipment, e.g. for use by subscribers
    • H04M1/72: Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/724: User interfaces specially adapted for cordless or mobile telephones
    • H04M1/72448: User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions
    • H04M1/72454: User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to context-related or environment-related conditions
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04W: WIRELESS COMMUNICATION NETWORKS
    • H04W4/00: Services specially adapted for wireless communication networks; Facilities therefor
    • H04W4/30: Services specially adapted for particular environments, situations or purposes
    • H04W4/38: Services specially adapted for particular environments, situations or purposes for collecting sensor information
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04M: TELEPHONIC COMMUNICATION
    • H04M2250/00: Details of telephonic subscriber devices
    • H04M2250/22: Details of telephonic subscriber devices including a touch pad, a touch sensor or a touch detector
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04W: WIRELESS COMMUNICATION NETWORKS
    • H04W4/00: Services specially adapted for wireless communication networks; Facilities therefor
    • H04W4/02: Services making use of location information
    • H04W4/023: Services making use of location information using mutual or relative location information between multiple location based services [LBS] targets or of distance thresholds
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04W: WIRELESS COMMUNICATION NETWORKS
    • H04W4/00: Services specially adapted for wireless communication networks; Facilities therefor
    • H04W4/02: Services making use of location information
    • H04W4/025: Services making use of location information using location based information parameters
    • H04W4/027: Services making use of location information using location based information parameters using movement velocity, acceleration information

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Environmental & Geological Engineering (AREA)
  • Human Computer Interaction (AREA)
  • Telephone Function (AREA)

Abstract

A method of verifying interaction between a first user and at least one other user, each having a portable electronic device. The first user performs an action using their portable electronic device and the action is recorded as a data set from at least one sensor of that portable electronic device. The other user attempts to perform the same action using their portable electronic device and the action is also recorded as a data set. The two data sets are communicated to a server which calculates a degree of similarity of the actions by comparing the data sets and records a successful verification if the similarity is above a threshold. The action may be a gesture or spatial movement of the device, a sound or image recorded by the device or a pattern traced on a touchscreen. The actions may be recorded within a time window measured by a countdown timer and the similarity may be weighted by a time synchronisation between actions or locations of the devices.

Description

METHOD AND APPARATUS FOR VERIFYING INTERACTION OF A PLURALITY OF USERS
The present invention relates to a method and apparatus for verifying interaction between a plurality of users, and more particularly to a software application for a portable electronic device.
BACKGROUND TO THE INVENTION
In many contexts and particularly in social or professional networking applications, it is desirable for a plurality of people to interact with each other in person, rather than by written means or telephony. The plurality may be a large group or simply a pair. It may be desirable that people interact in person because interaction in person is thought to provide a more versatile and useful form of communication than mere verbal or written communication. This is because cues such as body language and eye contact are typically absent from verbal and written communication. For social and professional networks, there is therefore motivation to increase the number of interactions in person that take place between users to improve social and professional connections between people.
To increase the number of personal interactions, it may be desirable to motivate people, such as members of a social network, to interact in person, for example by rewarding members for personal interactions. It is then necessary to monitor when interactions in person actually occur. It is possible to incentivise users to interact in person, but without a robust verification mechanism, users may attempt to deceive a networking system to obtain rewards while avoiding personal interactions.
Users may avoid personal interactions out of shyness, to avoid unnecessary effort or simply through lack of initiative. There is therefore a need for robust verification that users have interacted in person.
It is possible to use geographical location services such as satellite positioning systems, particularly the Global Positioning System (GPS) to verify that users have been in the same place at the same time. Use of such systems is typically possible via portable electronic devices such as smartphones, which are now carried by the majority of people and typically have satellite positioning capabilities.
A problem with this approach is that it cannot verify that users have interacted, only that their spatio-temporal proximity is within a certain threshold. The minimum threshold is constrained by the accuracy of satellite positioning systems, which is typically of the order of three to five metres in portable electronic devices such as smartphones. This allows the social network to be deceived into believing that a large number of social interactions have taken place when they have not. For example, commuters on public transport typically spend extended periods in close proximity without interacting, so a system based on physical co-location may not be able to distinguish silent commuting behaviour from meaningful social interaction and network building.
It is also desirable to guide users towards other users with whom they have a relatively high chance of having a useful or meaningful interaction, rather than allowing users to interact at random.
It is an object of the present invention to provide a method and apparatus for verifying that interactions in person have taken place.
STATEMENT OF INVENTION
According to a first aspect of the present invention, there is provided a method of verifying interaction between a plurality of users, the plurality of users including a first user and at least one other user, each user having a portable electronic device, the method comprising the steps of: by the first user, performing an action utilising his portable electronic device; by the portable electronic device of the first user, recording a data set from at least one sensor of that portable electronic device while the first user performs the action; by the or each other user, attempting to perform the action using the respective portable electronic device of that other user; by the respective portable electronic device of the or each other user, recording a data set from at least one sensor of that portable electronic device while the user attempts to perform the action; by each portable electronic device, communicating the respective data set to a server; by the server, for each other user, calculating a Connection Strength from the data set of the first user and the data set of that other user and, if the Connection Strength is higher than a Connection Strength threshold, recording a successful verification for that other user.
The method provides a way of verifying that users of a social or professional network have interacted in person, rather than merely interacting via written message or telephony, or being in the same place at the same time without interacting. In some embodiments, the action of the first user acts as a signal that the other users can imitate. The imitation can be concurrent, i.e. the other users imitate the action of the first user substantially while the first user performs it, or sequential, i.e. the other users watch the first user perform the action and then attempt to imitate it. Imitation of the action of the first user by the other users requires a level of attention and engagement with the first user, which effectively requires interaction in person. This encourages deeper social and professional connections. In other embodiments, imitation of a first user is not required. For example, the users could collectively improvise an action, or could be directed to point a camera on the portable electronic device at the same subject; the microphones of the portable electronic devices could be used to monitor the same sound, or the users could trace the same pattern on a touchscreen of the portable electronic device. The data sets recorded by sensors of the portable electronic devices provide a means of verifying that the actions performed by the other user(s) sufficiently resemble the action of the first user to verify that a deliberate interaction took place. Transmission of the data sets to a server allows this verification to take place on the server by comparing the data sets. Known signal processing and synchronising means may be used to achieve this. It will be appreciated that the data sets do not need to be identical to record a successful verification and are indeed extremely unlikely to be identical. 
Instead, a Connection Strength is calculated and compared to a threshold, so that sufficiently similar actions, which result in a sufficiently high Connection Strength, may be recorded as successful verifications. The Connection Strength is a measure of how similar or dissimilar the data sets are, and therefore a metric for how similar the actions of the users were.
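The description deliberately leaves the Connection Strength computation open, noting only that known signal processing means may be used. Purely as an illustration, one minimal sketch in Python compares two equal-length sensor traces (e.g. accelerometer magnitudes) using a Pearson correlation rescaled to a 0–100 score; both the similarity measure and the threshold value here are assumptions, not part of the disclosure:

```python
import math

def connection_strength(data_a, data_b):
    """Hypothetical similarity score between two equal-length sensor
    traces: Pearson correlation, clamped at zero and mapped to [0, 100]."""
    n = len(data_a)
    mean_a = sum(data_a) / n
    mean_b = sum(data_b) / n
    cov = sum((a - mean_a) * (b - mean_b) for a, b in zip(data_a, data_b))
    var_a = sum((a - mean_a) ** 2 for a in data_a)
    var_b = sum((b - mean_b) ** 2 for b in data_b)
    if var_a == 0 or var_b == 0:
        return 0.0  # a flat trace carries no gesture information
    r = cov / math.sqrt(var_a * var_b)  # correlation in [-1, 1]
    return max(r, 0.0) * 100            # anti-correlated traces score 0

# A close imitation of a gesture scores far above an assumed threshold.
trace_1 = [0.0, 1.2, 3.4, 2.1, 0.3]   # first user's recorded action
trace_2 = [0.1, 1.1, 3.5, 2.0, 0.4]   # other user's imitation
THRESHOLD = 80                          # assumed Connection Strength threshold
verified = connection_strength(trace_1, trace_2) > THRESHOLD
```

In practice the server would first resample and align the traces; this sketch assumes that has already been done.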
When the method is employed with more than two users, it will be understood that a Connection Strength is recorded between the first user and each other user. It is envisaged that an interaction may be verified between a pair of the other users if interaction between each of those other users and the first user is verified, but this need not necessarily be the case. Alternatively a Connection Strength may be recorded between each pair of users in the plurality of users.
The method may further include a predetermined verification countdown period, and a successful verification may only be recorded if the data sets of the users are separated in time by less than the verification countdown period. This may for example be implemented as a countdown timer running on the server, or alternatively as a comparison of timestamps of the data sets.
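The timestamp-comparison variant of this check can be sketched as follows; the 60-second window length is a hypothetical value, as the description does not specify one:

```python
VERIFICATION_COUNTDOWN_S = 60  # assumed verification window in seconds

def within_window(ts_first, ts_other, window=VERIFICATION_COUNTDOWN_S):
    """Server-side check that two data sets' timestamps (seconds since
    epoch) are separated by less than the verification countdown period."""
    return abs(ts_other - ts_first) < window
```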
The method may further comprise the step of, by the first user, when the action is complete, indicating to his portable electronic device that the action is complete. This allows a data set to be recorded which closely corresponds to the action of the first user without including large amounts of ambient data before and after the action of the first user.
The method may alternatively further comprise the step of, by the portable electronic device of the first user, initiating an action countdown timer, in which recording the first data set comprises continuously recording data from the sensor from the initiation of the action countdown timer until the countdown timer reaches zero. This has the same effect as the indication of action completion by the first user, but does not require specific input from the user. The portable electronic device could emit an audible beep or a vibration on initiation and/or completion of the action countdown timer depending on the user’s preference. Using an audible beep allows other users to confirm that the action is starting and that data is being recorded, preventing a user from pretending to participate in the method.
Each portable electronic device may communicate its geographical location to the server, and a successful verification may not be recorded for any of the other users unless the distance between the geographical locations of the portable electronic devices of the first user and that other user is less than a distance threshold.
This provides a means of checking that the users are located in the same place, which may be considered necessary for verification of personal interaction. For example, users may interact via video camera and monitor arrangements such as online video chat applications, allowing the other users to imitate the action of the first user while being located at a different location, and it may be desirable to verify that users are not doing this.
The geographical location of each portable electronic device may be obtained using a satellite-based positioning system by the respective portable electronic device. This provides a simple method of verifying that users are in the same location which is readily available on many portable electronic devices, including most smartphones.
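Such a distance check might look like the following sketch, using the standard haversine formula on two GPS fixes; the 10-metre threshold is an assumed value, not one taken from the disclosure:

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two GPS fixes.
    Illustrative only; the description does not specify the formula."""
    r = 6371000.0  # mean Earth radius in metres
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

DISTANCE_THRESHOLD_M = 10  # assumed threshold

# Two fixes about six metres apart pass the co-location check.
co_located = haversine_m(51.50070, -0.12460, 51.50075, -0.12460) < DISTANCE_THRESHOLD_M
```

Note that typical smartphone GPS accuracy (three to five metres, as the background discusses) puts a practical lower bound on any such threshold.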
Calculating the or each Connection Strength may include calculating a time synchronisation between the data set of the first user and the data set of the respective other user. This allows verification that the users have interacted in the case that the first user performs the action and the second user imitates the action after watching the first user perform the action, i.e. the actions are performed sequentially. In this case, the datasets may be very dissimilar when compared as a function of absolute timestamp, but may exhibit higher similarity when adjusted, i.e. time synchronised.
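One simple way to realise such a time synchronisation is to slide one trace over the other and keep the best-matching alignment, so that a sequential imitation still scores well. The sketch below is illustrative only; it uses a negative mean absolute difference as its similarity score, which is an assumption rather than the disclosed method:

```python
def best_lag_similarity(a, b, max_lag=5):
    """Try every time offset up to max_lag samples and return the best
    similarity found (0.0 is a perfect match; more negative is worse)."""
    best = float("-inf")
    for lag in range(-max_lag, max_lag + 1):
        pairs = [(a[i], b[i + lag]) for i in range(len(a))
                 if 0 <= i + lag < len(b)]
        if not pairs:
            continue
        score = -sum(abs(x - y) for x, y in pairs) / len(pairs)
        best = max(best, score)
    return best

# A delayed copy of a gesture matches perfectly at a non-zero lag,
# even though it matches poorly at lag zero.
gesture = [0, 1, 4, 9, 4, 1, 0, 0, 0]   # first user's action
delayed = [0, 0, 0, 0, 1, 4, 9, 4, 1]   # same action, imitated later
```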
The action may include moving the respective portable electronic device on a spatial path. This provides a unique action that may be easily imitated by the other users, depending on the complexity of the path. This also enhances the interaction between the users, as the users are forced to perform an unconventional movement in unison, for example as if dancing. This requires cooperation between the users.
The spatial path may be specified to the first user by his portable electronic device. This ensures that the action has an appropriate level of complexity, being complex enough to require meaningful interaction between the users to successfully imitate, but not being so complex as to be too difficult to reproduce within an acceptable tolerance.
The spatial path may be improvised by the first user. This may increase the users’ level of engagement with the process.
Each data set may be recorded from two or more sensors of the respective portable electronic device. This provides additional verification that the actions of the first user and other users are sufficiently similar.
Each data set may be recorded from three or more sensors of the respective portable electronic device.
Each data set may be recorded from at least an accelerometer of the respective portable electronic device. For example, the users could move the portable electronic devices in space in a pattern collectively improvised by the users. The accelerometer data then characterises the movements and allows them to be compared for calculation of Connection Strengths between the users.
Each data set may be recorded from at least a camera of the respective portable electronic device. For example, the users could be directed to point the cameras of the portable electronic devices at the same subject. Standard image processing techniques may then be used to compare the images or video from the cameras and calculate a Connection Strength. It is envisaged that the users may be directed to stand adjacent to one another and hold the devices at arm’s length in front of the users, directing cameras of the portable electronic devices towards the users’ faces. That is, the users may be directed to record a ‘selfie’ and the resulting images may be compared to verify that the same users are present in the field of view of each camera, possibly by using face recognition algorithms. Again, this requires the users to cooperate.
Each data set may be recorded from at least a touchscreen of the respective portable electronic device. In this case the action may include a pattern to be drawn on the touchscreen. For example, a pattern may appear on the screen of each portable electronic device and the users may be directed to trace the pattern. The pattern may be complex and the users may be directed to trace the pattern in synchronisation with one another, i.e. matching the position of each user’s finger on the pattern at a given point in time. This requires cooperation between the users to trace the patterns at the same rate.
Each data set may be recorded from at least a microphone of the respective portable electronic device. In this case the action may include a noise to be made by the users, such as a word or phrase to be spoken or a tune to be sung. Alternatively, the microphones may be used to record ambient noise from the surroundings of the users. For example, all the users or a subset of users or even a single user may be directed to read out a set phrase such as “the rain in Spain falls mainly on the plain”. The data recorded by the microphones of the participating users may then be compared, to ensure that the users’ microphones all recorded the same sound. This may require cooperation if multiple users are required to be involved in creating the sound, for example if the users are required to sing or speak sequentially or simultaneously, or engage in a call and response. Alternatively the microphone data may act as a further check, while not requiring cooperation specifically.
According to a second aspect of the invention, there is provided a system including a server and at least two portable electronic devices, each portable electronic device comprising a sensor, a wireless communication means, a user interface, memory, a processor and a software application, the software application when run on the processor: prompting a user to perform an action utilising the portable electronic device; initiating a recording period; during the recording period, recording a data set from the sensor; terminating the recording period; transmitting the data set via the wireless communication means for reception of the data set by a server, and the server including a processor and a software application, the software application, when run on the processor, calculating a Connection Strength from a data set received from the first portable electronic device and a data set received from the second portable electronic device and, if the Connection Strength is higher than a Connection Strength threshold, recording a successful verification.
The software applications allow the method of the first aspect of the invention to be implemented using portable electronic devices such as a smartphone or tablet and a server. This is advantageous as most people now carry such portable electronic devices during their daily routine.
The system may further include a predetermined verification countdown period, and a successful verification may only be recorded if the data sets of the users are separated in time by less than the verification countdown period. This may for example be implemented as a countdown timer running on the server, or alternatively as a comparison of timestamps of the data sets by the server.
The software application when run on the processor may terminate the recording period a predetermined amount of time after initiating the recording period. This allows a data set to be recorded which closely corresponds to the action of the first user without including large amounts of ambient data before and after the action of the first user.
The software application when run on the processor may terminate the recording period upon receipt of a termination instruction via the user interface. This provides an alternative to termination of the recording period a predetermined amount of time after initiating the recording period. Advantageously, this allows the user to terminate the recording period when required so that the user does not need to wait for the period to end if the action is finished early, in which case the user may be uncertain of the remaining time.
The software application when run on the processor may provide an instruction to the user via the user interface for specifying an action.
According to a third aspect of the invention, there is provided a computer implemented method of guiding a user towards other user(s) of interest comprising: monitoring a location of the user and locations of at least two other users; monitoring at least one user datum characterising the user and at least one user datum characterising each other user; calculating a metric between the user and each other user based on the physical proximity of the user to each other user and on a degree of relevance of the user data of the user to each other user; calculating an aggregate metric by combining the metrics; displaying to the user the aggregate metric.
The method allows users to be guided towards other users with whom they are more likely to have a useful or meaningful interaction. The user data contain information about the users, such as their ambitions, interests, romantic preferences or professional information. In some embodiments, the user data may contain words, sounds or pictures. In a preferred embodiment, the amount of data collected is minimal, for example a set of 1 to 3 wishes of the user. It will be understood that the method may involve collecting or monitoring other data about the user, such as the history of the user’s interactions with other users, but the phrase ‘user data’ is used to refer to the information described above only. Users whose data are relevant or compatible are likely to benefit from meeting, and it is preferable that they should meet face-to-face rather than communicate by written means or telephony, as this tends to produce more useful and meaningful interactions. The aggregate metric quantifies to the user whether he is in an area in which he is close to many people with whom he has relevant or compatible user data or not, allowing him to adjust his location accordingly to maximise the chance of meaningful real world interactions.
According to a fourth aspect of the invention, there is provided a computer implemented method of guiding a user towards other user(s) of interest comprising: monitoring a location of the user and locations of at least two other users; monitoring at least one user datum characterising the user and at least one user datum characterising each other user; calculating a metric between the user and each other user based on the physical proximity of the user to each other user and on a degree of relevance of the user datum of the user to that of each other user; selecting another user or group of users between whom and the (first mentioned) user the metric is high, and calculating a direction uncertainty, the direction uncertainty increasing with the proximity of the (first mentioned) user to the selected other user or group of users; to the (first mentioned) user, displaying a guidance direction in the form of a visual element having a direction and an angular extent, in which the direction of the visual element is towards the selected user(s), and in which the angular extent of the visual element is proportional to the direction uncertainty.
The computer implemented method allows users to be guided towards other users in the real world with whom they may have a worthwhile interaction. Monitoring the locations of the users allows the computer implemented method to direct each user towards other users. Monitoring user data allows the method to direct users towards users with relevant user data, for example similar or complementary ambitions, interests or professions. This increases the chance that an interaction will be meaningful or useful to the users. Combining these factors into a metric allows the method to guide users towards meaningful interactions with people who are nearby, increasing the rate of interactions that occur. The visual element serves to guide the user towards another user with whom he or she is likely to have a meaningful or useful interaction, or to an area rich in such users. The use of a direction uncertainty improves the chance of user interaction. The direction uncertainty is not a lack of knowledge of direction on the part of the system; rather it is a means of withholding information from the user. The direction uncertainty is low when the user is far from the user(s) he or she is being guided towards, allowing him or her to proceed in the correct direction efficiently. As the user approaches, the direction uncertainty becomes large, forcing the user to pay more attention to his or her surroundings. Finally, when the user is in the vicinity of the target user(s), the direction uncertainty is large enough that the user is forced to look around and make himself or herself available for face-to-face interaction. This requires more openness and engagement from users than would be the case without the direction uncertainty.
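A possible realisation of the direction uncertainty, given purely as an illustration, maps the remaining distance to the target onto the angular extent of the visual element. All of the numeric bounds below are assumptions; the description specifies only that the uncertainty increases with proximity:

```python
def direction_uncertainty_deg(distance_m,
                              near_m=10.0, far_m=200.0,
                              min_deg=10.0, max_deg=180.0):
    """Angular extent of the guidance arc, in degrees, as a function of
    distance to the selected user(s): narrow far away, wide up close."""
    if distance_m >= far_m:
        return min_deg     # far away: a precise arrow-like arc
    if distance_m <= near_m:
        return max_deg     # nearby: a半circle forcing the user to look around
    # linear interpolation between the far and near bounds
    t = (far_m - distance_m) / (far_m - near_m)
    return min_deg + t * (max_deg - min_deg)
```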
The method may further comprise the steps of: for each user, calculating an aggregate metric by combining the metrics between that user and each other user; displaying to each user that user’s aggregate metric.
The user data may be classified taxonomically, and the degree of relevance between sets of user data may be determined from observation of past user behaviour. This allows a large variety of user data to be provided by users while enabling the method to quantify the relevance or degree of alignment between the data provided by two users.
The visual element may be an arc of a circle centred on a reference point. This provides a visually simple and intuitive means of directing the user, combining direction and angular extent in a simple geometrical object.
BRIEF DESCRIPTION OF THE DRAWINGS
For a better understanding of the present invention, and to show more clearly how it may be carried into effect, reference will now be made by way of example only to the accompanying drawings, in which:
Figure 1 shows a schematic view of a system for guiding and verifying interactions between a plurality of users;
Figure 2a shows a front view of a portable electronic device displaying a user interface for guiding users of the system of Figure 1;
Figure 2b shows a front view of the portable electronic device of Figure 2a in which the user is closer to another user;
Figure 3 shows a schematic view of 4 users of the system of Figure 1.
DESCRIPTION OF PREFERRED EMBODIMENTS
Referring firstly to Figure 1, a system for guiding and verifying interactions between a plurality of users is indicated generally at 10.
The system 10 includes at least a server 12 and a plurality of portable electronic devices 14. In this embodiment, the portable electronic devices 14 are smartphones. The system 10 includes a software application that can be installed on the portable electronic devices 14. Each portable electronic device 14 includes a location sensor 16. The location sensor 16 is preferably a satellite positioning module and most preferably a Global Positioning System (GPS) module. Each portable electronic device 14 also includes memory 18, a processor 20, a wireless communication module 22, a screen 24, preferably a touchscreen, an accelerometer 25 and a gyroscope 27.
The system 10 is for use by a plurality of users. Each user is associated with one of the portable electronic devices 14, on which is installed the software application.
Each user enters user data into the portable electronic device 14. The user is prompted to do this by the software application. The user data may include interests of the user, biographical information, leisure information or preferences. Preferably, the user data includes at least one wish, which is a personal goal or desire of the user. In this embodiment, each user inputs at least 1 wish into the system, with most users making 3 wishes.
The portable electronic devices 14 monitor user location using the location sensor 16. It is assumed that each user carries the associated portable electronic device 14 the majority of the time, or at least when wishing to use the system 10. It will be understood that the software application will receive location data from a location service of the operating system of the portable electronic device, which may utilise satellite location systems, wi-fi or Bluetooth (RTM) location.
Location data of the users is sent to the server 12 periodically. User data is sent to the server 12 after it is entered into the portable electronic device 14. The user data is stored in a database on memory 26 accessible to the server 12. User data can be updated by the user using the portable electronic device 14 at any time. New user data is sent to the server 12 if this happens.
For each pair of users, the server 12 calculates a metric. The metric is a numerical metric based on distance between user locations and also on user data. The metric of a pair of users is weighted by the distance between the locations of the pair of users. Smaller distance between users results in a greater metric. In this embodiment, user data includes wishes and the metric is greater for users with relevant wishes. For example, wishes may be categorised into abstract taxonomies and deep-learning algorithms may be used to learn from past user interactions which taxa of wishes are compatible or relevant, and thus likely to produce ongoing interaction. Users with wishes that belong to taxonomic categories that have been learnt to be relevant or compatible have the highest metric, while users with wishes from categories that have been learnt not to produce ongoing interactions will have a lower metric.
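As a sketch of such a pairwise metric (the exact weighting and the relevance model are not specified in the description), a learnt wish-relevance score in [0, 1] could be decayed with the distance between the pair of users, for example exponentially; the decay form and scale are assumptions:

```python
import math

def pair_metric(distance_m, wish_relevance, scale_m=100.0):
    """Hypothetical pairwise metric: a learnt wish-relevance score in
    [0, 1], weighted down exponentially with the distance between the
    users so that smaller distance yields a greater metric."""
    return wish_relevance * math.exp(-distance_m / scale_m)

# A nearby user with compatible wishes outscores a distant one
# with equally compatible wishes.
near = pair_metric(10, 0.9)
far = pair_metric(500, 0.9)
```

In the embodiment described, `wish_relevance` would come from a deep-learning model trained on which taxa of wishes have historically produced ongoing interaction; here it is simply an input.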
The server 12 calculates an aggregate metric for each user. For a user, the aggregate metric is the sum of that user's metrics with the other users, normalised to lie between 0 and 100. The aggregate metric is a function of user location.
A user who is close to another user with whom he has relevant wishes will have a higher aggregate metric than a user who is further away from such a user, or close to a user with wishes that the system has learnt are less relevant.
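One possible normalisation of the aggregate metric is sketched below, assuming the pairwise metrics are already available as non-negative floats; the saturating squashing function is an assumption, as the embodiment only states that the aggregate varies between 0 and 100:

```python
def aggregate_metric(metrics_to_others: list[float]) -> float:
    """Sum of a user's pairwise metrics with all other users, normalised
    into the range 0..100. The saturating normalisation total/(1+total)
    is one possible choice, assumed here for illustration."""
    total = sum(metrics_to_others)
    return 100.0 * total / (1.0 + total)
```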
The server 12 calculates a guidance direction for each user. The guidance direction is a spatial direction which points the user towards another user or group of users with which the first mentioned user has a high metric. The guidance direction of each user is communicated to the portable electronic device 14 of that user. When the software application is run on the processor 20 of the portable electronic device 14, the guidance direction is displayed to the user on the screen 24.
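The guidance direction might, for example, be computed from the two users' locations as an initial great-circle bearing; the formula below is a standard one and is assumed here as one way the server could derive the direction:

```python
import math

def guidance_bearing(lat1: float, lon1: float,
                     lat2: float, lon2: float) -> float:
    """Initial great-circle bearing in degrees clockwise from north,
    from the user's location (lat1, lon1) towards the target user
    (lat2, lon2). A standard formula, used here as an illustration."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    y = math.sin(dlon) * math.cos(phi2)
    x = (math.cos(phi1) * math.sin(phi2)
         - math.sin(phi1) * math.cos(phi2) * math.cos(dlon))
    return math.degrees(math.atan2(y, x)) % 360.0
```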
Referring now to Figure 2a, a user interface of the software application is indicated generally at 30. When the software application is run on the processor 20 of the portable electronic device 14, the screen 24 shows the user interface 30.
In the user interface, the user’s aggregate metric is displayed on the screen of the portable electronic device. The user’s aggregate metric is converted to a colour and the colour is shown on the screen. In this embodiment, the colour is shown on substantially the entire screen to form a background (not illustrated).
If the user’s aggregate metric is close to 100, the colour will be bright red. If the user’s aggregate metric is close to 0, the colour is blue. If the user’s aggregate metric is close to 50, the colour is purple. Intermediate aggregate metric values are shown as shades between these colours.
It will be understood that temperature can be used as an analogy for the user’s aggregate metric, with ‘hot’ corresponding to a high aggregate metric and the colour red, and ‘cold’ corresponding to a low aggregate metric and the colour blue.
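The blue-to-red mapping described above can be sketched as a linear blend in RGB space; the exact colour values are assumptions consistent with blue at 0, purple at 50 and red at 100:

```python
def metric_colour(aggregate: float) -> tuple[int, int, int]:
    """Map an aggregate metric in 0..100 to an RGB colour: blue at 0
    ('cold'), purple at 50, red at 100 ('hot'), blending linearly in
    between. One simple scheme consistent with the description."""
    t = max(0.0, min(100.0, aggregate)) / 100.0
    red = round(255 * t)
    blue = round(255 * (1.0 - t))
    return (red, 0, blue)
```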
The user’s aggregate metric is also shown numerically. In this embodiment, the user’s aggregate metric is shown as a numeral 32 at the top right corner of the screen 24.
A directional indicator 34 is shown on the screen 24. The directional indicator 34 is positioned in the centre of the screen. The directional indicator 34 is displayed over the background, i.e. over the aggregate metric colour. The directional indicator 34 guides the user towards another user or group of users whose metric with the user is high.
In some embodiments, the directional indicator 34 guides the user towards the nearest user whose metric is above a threshold. In other embodiments, the directional indicator 34 guides the user towards the other user whose metric is highest.
The directional indicator 34 includes a circular central portion 36 and a guidance portion 38. In this embodiment, the guidance portion 38 is an arc on the circumference of the central portion 36 of greater radial extent than the rest of the central portion 36. The guidance portion 38 moves around the circumference of the central portion 36 to indicate the guidance direction of the user. The guidance portion 38 may move such that, when the screen of the portable electronic device is horizontal, the guidance direction is the imaginary line that joins the centre of the central portion 36 to the centre of the guidance portion 38. In other words, the guidance portion 38 points in the guidance direction.
The angular extent of the guidance portion 38 is variable. When the user is close to another user or group of users of high metric to the user, the angular extent of the guidance portion 38 is large. When the user is far from the nearest such user or group of users, the angular extent of the guidance portion 38 is small. As the user approaches such a user or group of users, the angular extent of the guidance portion 38 increases steadily.
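The relationship between distance and angular extent might be realised with a simple ramp such as the following; the endpoint distances and the minimum and maximum arc sizes are illustrative assumptions:

```python
def guidance_arc_degrees(distance_m: float,
                         min_deg: float = 10.0, max_deg: float = 180.0,
                         near_m: float = 10.0, far_m: float = 1000.0) -> float:
    """Angular extent of the guidance portion 38: large when the target
    user is close, small when far, growing steadily on approach. The
    linear ramp and its endpoints are assumptions for the sketch."""
    if distance_m <= near_m:
        return max_deg
    if distance_m >= far_m:
        return min_deg
    t = (far_m - distance_m) / (far_m - near_m)
    return min_deg + t * (max_deg - min_deg)
```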
In other embodiments, the radial extent of the guidance portion 38 may be variable. For example, when the user is close to another user or group of users of high metric to the user, the radial extent of the guidance portion 38 is small. When the user is far from the nearest such user or group, the radial extent is large. As the user approaches such a user or group of users, the radial extent decreases steadily.
In further embodiments, the opacity of the guidance portion 38 may be variable. For example, when the user is close to another user or group of users of high metric to the user, the opacity of the guidance portion 38 is low. When the user is far from the nearest such user or group, the opacity is high. As the user approaches such a user or group of users, the opacity decreases steadily.
It is envisaged that in some embodiments, the guidance portion 38 may disappear entirely when the user is within a certain proximity of another user or group of users of high metric to the user.
Referring now to Figure 2b, the user interface of Figure 2a is shown after the user has moved relative to another user with whom the user has a high metric. The user has come closer to the other user and the other user is now to the user’s right. Because the user is closer to the other user, the angular extent of the guidance portion 38 has grown. The guidance portion 38 occupies a larger segment of the circumference of the central portion 36. The centre of the guidance portion also points to the right, because the other user is to the right of where the user’s device is pointing. The user’s aggregate metric has also increased because he is closer to the other user. This is shown as a change in the numeral 32 at the top right corner of the screen 24.
Referring again to Figure 1, in this embodiment, each portable electronic device 14 includes a haptic feedback module 28. When the aggregate metric of the user exceeds an aggregate metric threshold, the haptic feedback module 28 is activated to notify the user. For example, when the aggregate metric of a user exceeds 80, the portable electronic device of the user vibrates.
Referring now to Figure 3, an exemplary set of users is shown. Each user is represented as a rectangle. The metrics between the top left user as viewed and the other users are shown, but metrics between other users are omitted for clarity. In this example, users are categorised into one of two categories, A or B, depending on their user data. Users of the same category have user data that have been found to be relevant. In other embodiments, many more categories may be used.
The metric between the top left user and the top right user is low, because these users have dissimilar user data.
Between the top left user and the bottom left user, the metric is also low, because although these users have similar user data, the distance between them is larger.
The metric between the top left user and the lower right user is high. This is because the distance between these users is low and the user data of the users is similar. The distance between these users is slightly higher than the distance between the top right user and the top left user, but the metric is nevertheless higher because the user data of the users is more complementary.
The directional indicator of the top left user will point towards the lower right user.
A method of verifying interaction between a plurality of users will now be described.
A user can initiate the method using the software application running on the processor 20 of his portable electronic device 14. When the user (first user) has a metric with another user or group of users (other user(s)) above a threshold, an icon appears on the user interface. The icon 42 is shown in Figure 2b. The first user initiates the method by selecting this icon. The phrase ‘first user’ is used to refer to the user who initiates the method and does not imply that there need be any difference between the first user and the other user(s).
In some embodiments, the method will not be initiated unless all participating users select the icon. In other embodiments, only one user is required to select the icon. An embodiment is envisaged in which, in certain contexts, all users are required to select the icon, while in other contexts, only one user is required to select the icon. In yet further embodiments, it may not be necessary for any user to select the icon to initiate the method.
When the method is initiated, a countdown timer is started. In this embodiment, a message is sent from the portable electronic device 14 to the server 12 to initiate the countdown timer when the icon is selected. The server 12 then monitors the countdown timer.
The portable electronic device 14 of the first user prompts the first user to move his portable electronic device 14 in a spatial pattern. The spatial pattern is improvised in the moment by the first user. In this embodiment, the spatial pattern is a figure of 8 made in the air in front of the user.
While the first user moves the portable electronic device 14, the other user(s) attempt to imitate the first user by moving their respective portable electronic devices 14 in a matching spatial pattern. The other user(s) move their portable electronic devices in a figure 8 pattern in front of them.
Although in this embodiment, the first user moves his portable electronic device in a figure 8 pattern, substantially any path in space in the vicinity of the user may be used. For example, the user could move the portable electronic device in a circle, a Z shape or back and forth along a line.
While the first user and other user(s) move their portable electronic devices along the spatial pattern, each portable electronic device 14 records accelerometer data from the accelerometer 25 of that portable electronic device.
When the first user and other user(s) have finished moving the portable electronic devices 14, each portable electronic device 14 stops recording accelerometer data. In this embodiment, the software application determines that the first user and other user(s) have finished moving the portable electronic devices 14 when a predetermined amount of time has elapsed from initiation of the method.
In other embodiments, the users may be required to notify the software applications when they have completed the movement, or the software applications may determine that the movement has been completed by monitoring the accelerometer data for a period with no movement.
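The movement-completion check mentioned above, i.e. monitoring the accelerometer data for a period with no movement, could be sketched as follows; the window length, the threshold and the assumption of gravity-removed samples are illustrative choices, not details of the embodiment:

```python
def movement_finished(samples: list[tuple[float, float, float]],
                      window: int = 20, threshold: float = 0.05) -> bool:
    """Detect the end of the gesture by checking whether the most recent
    accelerometer magnitudes show no movement. Samples are assumed to be
    (x, y, z) readings with gravity removed; window and threshold are
    assumptions for the sketch."""
    if len(samples) < window:
        return False
    recent = samples[-window:]
    magnitudes = [(x * x + y * y + z * z) ** 0.5 for x, y, z in recent]
    return max(magnitudes) < threshold
```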
When each software application has determined that the movement is complete, it communicates the recorded accelerometer data set to the server 12 via the wireless communication module 22.
A location of each device 14 is also communicated to the server 12 by that device 14. The location is obtained from the GPS module 16 of each device. The location is communicated to the server 12 by the wireless communication module 22 of each device 14.
For each pair of data sets received by the server 12 before expiry of the countdown timer, a Connection Strength is calculated by the server. The Connection Strength is a numerical metric of how similar a pair of data sets is. Calculating the Connection Strength includes pattern matching between the data sets and calculating a time synchronisation between them. The Connection Strength is weighted by the geographical proximity of the devices that recorded the data sets, as calculated from the location data sent to the server 12 by each device 14, and by the number of participating users.
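One possible realisation of such a calculation is sketched below: the best normalised cross-correlation of two accelerometer-magnitude traces over a small search of relative time shifts (standing in for the time synchronisation), weighted by geographical proximity and damped by the number of participants. Every constant and the exact formula are assumptions for illustration only:

```python
import math

def connection_strength(a: list[float], b: list[float],
                        distance_m: float, n_users: int,
                        max_lag: int = 10, decay_m: float = 10.0) -> float:
    """Sketch of a Connection Strength between two accelerometer-magnitude
    traces. Traces are assumed to be longer than max_lag. All constants
    are illustrative assumptions."""
    def norm_corr(u: list[float], v: list[float]) -> float:
        # Pearson correlation of the overlapping portion of two traces.
        n = min(len(u), len(v))
        u, v = u[:n], v[:n]
        mu_u, mu_v = sum(u) / n, sum(v) / n
        du = [x - mu_u for x in u]
        dv = [x - mu_v for x in v]
        num = sum(x * y for x, y in zip(du, dv))
        den = (sum(x * x for x in du) * sum(y * y for y in dv)) ** 0.5
        return num / den if den else 0.0

    # Search small relative shifts in both directions to synchronise
    # the two recordings before comparing them.
    best = max(norm_corr(a[lag:], b) for lag in range(max_lag))
    best = max(best, max(norm_corr(a, b[lag:]) for lag in range(max_lag)))
    proximity = math.exp(-distance_m / decay_m)  # closer devices score higher
    return max(0.0, best) * proximity / max(1, n_users - 1)
```

A successful verification would then be recorded when this value exceeds the Connection Strength threshold.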
If the Connection Strength between the data sets of a pair of users is greater than a Connection Strength threshold, a successful verification that the users have interacted in person is recorded by the server. This is recorded in the memory 26 and communicated to the portable electronic device 14 of each user of the pair.
The successful verification increases the metric of the pair of users from that point on.
Although in this embodiment, accelerometer data is used, other embodiments may use alternative or additional sensor data to calculate the connection strength.
For example, the portable electronic devices 14 may each include a microphone. The microphone may be used to record sound data while the users move the devices 14. The software applications may prompt the users to make a sound, for example to read out a specified phrase. Alternatively, the first user may improvise a phrase and the other user(s) may repeat the phrase after hearing the first user.
It will be understood that, although the movement of the portable electronic devices in spatial patterns is described as being substantially simultaneous in this embodiment, in other embodiments it may be sequential.
Another sensor that may be employed is a camera of each portable electronic device. For example, the users may be prompted by the software application to point the cameras of their portable electronic devices at the same subject or in the same direction. Images recorded by the cameras may then be communicated to the server 12.
Calculating Connection Strength between the pairs of users may then include pattern recognition in the images.
These embodiments are provided by way of example only, and various changes and modifications will be apparent to persons skilled in the art without departing from the scope of the present invention as defined by the appended claims.

Claims (19)

1. A method of verifying interaction between a plurality of users, the plurality of users including a first user and at least one other user, each user having a portable electronic device, the method comprising the steps of:
by the first user, performing an action utilising his portable electronic device;
by the portable electronic device of the first user, recording a data set from at least one sensor of that portable electronic device while the first user performs the action;
by the or each other user, attempting to perform the action using the respective portable electronic device of that other user;
by the respective portable electronic device of the or each other user, recording a data set from at least one sensor of that portable electronic device while the user attempts to perform the action;
by each portable electronic device, communicating the respective data set to a server;
by the server, for each other user, calculating a Connection Strength from the data set of the first user and the data set of that other user and, if the Connection Strength is higher than a Connection Strength threshold, recording a successful verification for that other user.
2. A method as claimed in claim 1, further comprising the step of, by the first user, when the action is complete, indicating to his portable electronic device that the action is complete.
3. A method as claimed in claim 1, further comprising the step of, by the portable electronic device of the first user, initiating an action countdown timer, in which recording the first data set comprises continuously recording data from the sensor from the initiation of the action countdown timer until the countdown timer reaches zero.
4. A method as claimed in any preceding claim, in which each portable electronic device communicates its geographical location to the server, and a successful verification is not recorded for any of the other users unless the distance between the geographical locations of the portable electronic devices of the first user and that other user is less than a distance threshold.
5. A method as claimed in claim 4, in which the geographical location of each portable electronic device is obtained using a satellite-based positioning system by the respective portable electronic device.
6. A method as claimed in any preceding claim, in which calculating the or each Connection Strength includes calculating a time synchronisation between the data set of the first user and the data set of the respective other user.
7. A method as claimed in any preceding claim, in which the action includes moving the respective portable electronic device on a spatial path.
8. A method as claimed in claim 7, in which the spatial path is specified to the first user by his portable electronic device.
9. A method as claimed in claim 7, in which the spatial path is improvised by the first user.
10. A method as claimed in any preceding claim, in which each data set is recorded from two or more sensors of the respective portable electronic device.
11. A method as claimed in any preceding claim, in which each data set is recorded from three or more sensors of the respective portable electronic device.
12. A method as claimed in any preceding claim, in which each data set is recorded from at least an accelerometer of the respective portable electronic device.
13. A method as claimed in any preceding claim, in which each data set is recorded from at least a camera of the respective portable electronic device.
14. A method as claimed in any preceding claim, in which each data set is recorded from at least a touchscreen of the respective portable electronic device.
15. A method as claimed in any preceding claim, in which each data set is recorded from at least a microphone of the respective portable electronic device.
16. A system including a server and at least two portable electronic devices, each portable electronic device comprising a sensor, a wireless communication means, a user interface, memory, a processor and a software application, the software application when run on the processor:
prompting a user to perform an action utilising the portable electronic device; initiating a recording period;
during the recording period, recording a data set from the sensor;
terminating the recording period;
transmitting the data set via the wireless communication means for reception of the data set by a server, and the server including a processor and a software application, the software application, when run on the processor:
calculating a Connection Strength from a data set received from the first portable electronic device and a data set received from the second portable electronic device and, if the Connection Strength is higher than a Connection Strength threshold, recording a successful verification.
17. A system as claimed in claim 16, in which the software application of each portable electronic device when run on the processor terminates the recording period a predetermined amount of time after initiating the recording period.
18. A system as claimed in claim 16, in which the software application of each portable electronic device when run on the processor terminates the recording period upon receipt of a termination instruction via the user interface.
19. A system as claimed in any of claims 16 to 18, in which the software application of each portable electronic device when run on the processor provides an instruction to the user via the user interface for specifying an action.
GB201809919A 2018-06-18 2018-06-18 Method and apparatus for Verifying Interaction Of A Plurality Of Users Withdrawn GB2574809A (en)


Publications (2)

Publication Number  Publication Date
GB201809919D0 (en)  2018-08-01
GB2574809A (en)     2019-12-25

Family

ID=63042375


Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2731371A2 (en) * 2012-11-07 2014-05-14 Samsung Electronics Co., Ltd Method and device for user terminal pairing
US20140279531A1 (en) * 2013-03-15 2014-09-18 SingTel Idea Factory Pte. Ltd. Systems and methods for financial transactions between mobile devices via hand gestures
US20150312419A1 (en) * 2014-04-24 2015-10-29 Panasonic Intellectual Property Corporation Of America Configuration method for sound collection system for meeting using terminals and server apparatus
WO2016038378A1 (en) * 2014-09-10 2016-03-17 Moo Print Limited Interaction between users of mobile devices
US20160337161A1 (en) * 2012-01-09 2016-11-17 Bump Technologies, Inc. Method and apparatus for facilitating communication between devices
US20170034333A1 (en) * 2015-07-28 2017-02-02 Verizon Patent And Licensing Inc. Exchanging contact information based on identifying social interaction




Legal Events

WAP: Application withdrawn, taken to be withdrawn or refused after publication under section 16(1)