EP2227301A1 - On-line monitoring of resources - Google Patents

On-line monitoring of resources

Info

Publication number
EP2227301A1
Authority
EP
European Patent Office
Prior art keywords
online
activity
game
community
inappropriate
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
EP08843017A
Other languages
German (de)
French (fr)
Other versions
EP2227301A4 (en)
Inventor
Gary Zalewski
Adam Harris
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sony Interactive Entertainment America LLC
Original Assignee
Sony Computer Entertainment America LLC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from US11/925,570 (US7865590B2)
Priority claimed from US11/927,357 (US8490199B2)
Priority claimed from US11/929,617 (US8204983B2)
Priority claimed from US11/932,863 (US20090111583A1)
Application filed by Sony Computer Entertainment America LLC filed Critical Sony Computer Entertainment America LLC
Publication of EP2227301A1 publication Critical patent/EP2227301A1/en
Publication of EP2227301A4 publication Critical patent/EP2227301A4/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/30Authentication, i.e. establishing the identity or authorisation of security principals
    • GPHYSICS
    • G07CHECKING-DEVICES
    • G07FCOIN-FREED OR LIKE APPARATUS
    • G07F17/00Coin-freed apparatus for hiring articles; Coin-freed facilities or services
    • G07F17/32Coin-freed apparatus for hiring articles; Coin-freed facilities or services for games, toys, sports, or amusements
    • G07F17/3241Security aspects of a gaming system, e.g. detecting cheating, device integrity, surveillance
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00Administration; Management
    • G06Q10/10Office automation; Time management
    • G06Q50/40

Definitions

  • the present invention relates to on-line sessions, and more specifically, to: community based moderation of online sessions; moderation of cheating in an online sessions; allocation of on-line resources based on community based moderation of online sessions; and improving application integrity.
  • Background [0002]
  • users may interact and communicate with other on-line users in the on-line community. During this interaction, the members of the on-line community may be subjected to inappropriate or offensive behavior from other members of the community.
  • one community member may begin sending chat messages that include profane or other inappropriate language to the other members of the community.
  • one member of the community may make obscene gestures or drawings that are visible to the other community members.
  • a community member may engage in illegal activity.
  • one of the community members may post pornography or engage in other illegal activity.
  • the illegal activity would be offensive to other members of the community.
  • members of the online community may be engaged in an online game.
  • in the online game, one or more of the game players may engage in cheating to take an unfair advantage over the other game players.
  • the cheating activity can lead to dissatisfaction with the online game by the other online game players.
  • Offensive, illegal, cheating, or other inappropriate actions by particular community members can decrease the enjoyment of the on-line session for the other community members. Thus, there is a need for improving moderation in on-line sessions.
  • Embodiments of the present invention provide methods, systems, apparatus, and programs for moderating online sessions.
  • a method for community moderation of an online session includes observing inappropriate behavior of a first online user by a second online user. The second online user activates or presses a triggering mechanism in response to the inappropriate behavior. A time based history of the online session is captured. Then the time based history is transmitted to a moderation entity.
  • the time based history of the online session includes online session activity that occurred a predetermined amount of time before the triggering mechanism is activated or pressed.
  • the duration of the time based history can be set by the user, or it can be a predetermined period, or by a network entity, or by other techniques.
  • the time based history can include information that associates online user identities with their online activity.
  • a reward can be issued to a user that observes inappropriate behavior and activates or presses the triggering mechanism.
  • An example of a triggering mechanism is a panic button.
  • the method includes receiving an indication of a triggering mechanism being activated by a community member in response to inappropriate activity by another community member. Then receiving a time based history of community members' activity around a time of the triggering mechanism being activated. Recreating the community activity from the time based history. Then evaluating activities of the community members to determine if there was inappropriate activity and, if there is inappropriate activity by an offending community member, taking appropriate action against the offending community member.
  • in an online community, there are at least two users that communicate in the online community, wherein a first user in the online community observes inappropriate behavior by a second user in the online community and presses a panic button in response to the inappropriate behavior, the pressing of the panic button initiating storing a time based history of online community activity, the time based history covering a period that extends a desired duration before the pressing of the panic button and a desired duration after pressing the panic button.
  • the online community also includes a moderation entity that receives the time based history and recreates the online activity to determine if there was inappropriate activity by one of the users, and if there was inappropriate activity by one of the users, taking appropriate action against that user.
  • a network enabled device includes a triggering mechanism.
  • the device also includes a processor that captures a time based history of online activity of users in an online community.
  • a network interface that transmits the time based history to a moderation entity, the moderation entity determines if there has been inappropriate online activity by one of the online users.
  • taking appropriate action against the offending community member includes one or more of issuing a warning to the offending community member, limiting available online options to the offending community member, and restricting access to the online community by the offending community member.
  • the triggering mechanism being activated can be pressing a panic button.
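  • The following is a minimal, hypothetical sketch (in Python, with invented names such as NetworkEnabledDevice and ActivityEvent) of how the sliding-window buffer and panic-button trigger summarized above could be organized on a client device; the patent does not prescribe any particular implementation.

```python
import time
from collections import deque
from dataclasses import dataclass


@dataclass
class ActivityEvent:
    """One observed item of online activity (chat line, game state, etc.)."""
    timestamp: float
    user_id: str
    payload: str


class NetworkEnabledDevice:
    """Hypothetical client that buffers a sliding window of community activity."""

    def __init__(self, pre_seconds: float = 60.0, post_seconds: float = 30.0):
        self.pre_seconds = pre_seconds      # history kept before the trigger
        self.post_seconds = post_seconds    # history collected after the trigger
        self.buffer = deque()               # sliding window of ActivityEvent

    def record(self, event: ActivityEvent) -> None:
        """Continuously record activity, discarding anything older than the window."""
        self.buffer.append(event)
        cutoff = event.timestamp - self.pre_seconds
        while self.buffer and self.buffer[0].timestamp < cutoff:
            self.buffer.popleft()

    def panic_button_pressed(self, reported_user: str, reason: str) -> dict:
        """Capture the window around the trigger for transmission to the moderation entity."""
        trigger_time = time.time()
        return {
            "reported_user": reported_user,
            "reason": reason,                    # optional type of inappropriate behavior
            "trigger_time": trigger_time,
            "collect_until": trigger_time + self.post_seconds,  # keep recording after trigger
            "time_based_history": list(self.buffer),            # activity before the trigger
        }


# Example: a user observes profanity and presses the panic button.
device = NetworkEnabledDevice(pre_seconds=60.0, post_seconds=30.0)
device.record(ActivityEvent(time.time(), "user_102a", "voice: <profane message>"))
report = device.panic_button_pressed("user_102a", reason="offensive_language")
# transmit(report)  # would be sent to the moderation entity over the network interface
```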
  • Embodiments of the present invention also provide methods, systems, apparatus, and programs for detecting and discouraging cheating in an online game session. Aspects include playing an online game. During play of the game, one of the players detects suspected cheating behavior by another online game player. Game information is collected about the activity of all players in the online game; the game information includes a period of the game during which the suspected cheating behavior occurred. The game information is communicated to a game cheat monitoring entity that evaluates the game information to determine if there was cheating activity, and takes appropriate action if there was cheating activity.
  • In one embodiment, capturing the game information of the online game session includes capturing online game session activity that occurred a predetermined amount of time before detecting the suspected cheating behavior.
  • capturing the game information includes associating an online game player's identity with the player's online activity.
  • a reward is provided to a game player that observes cheating behavior and communicates the game information to the game cheat monitoring entity.
  • a method of moderating cheating activity in an online game community includes receiving an indication that a player in an online game session suspects that another player in the game session is engaging in cheating behavior. Receiving game information of game activity around a time of the suspected cheating behavior. Recreating the game activity from the game information. Evaluating activities of the players to determine if there was cheating activity and, if there was cheating activity by an offending player, taking appropriate action against the offending player.
  • an online game session includes at least two players that communicate in the online game session, wherein a first player in the online game session detects suspected cheating behavior by a second player in the online game session, the first player communicates an indication to a game cheat monitoring entity that there is suspected cheating behavior.
  • the game cheat monitoring entity upon receiving an indication that there is cheating behavior, collects game information of the play of all of the players in the online game session, the game information includes a time period that extends a desired duration before and after receiving the indication, the game cheat monitoring entity uses the game information to recreate online game activity of the players to determine if there was cheating activity by one of the players, and if there was cheating activity by one of the players, the game cheat monitoring entity takes appropriate action.
  • An example of appropriate action includes restricting access to the online game session by the cheating player.
  • a game cheat monitoring entity includes a network interface that receives an indication that there is cheating behavior.
  • the game cheat monitoring entity also includes a processor that collects game information of all of the players in the online game session, the game information includes a time period that extends a desired duration before and after receiving the indication, the processor uses the game information to recreate online game activity of players in the game session to determine if there was cheating activity by one or more of the players, and if there was cheating activity by one or more of the players, the game cheat monitoring entity takes appropriate action.
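  • As an illustration only, a game cheat monitoring entity of the kind summarized above might evaluate a reported game window against simple plausibility rules; the names (GameCheatMonitoringEntity, MAX_PLAUSIBLE_DAMAGE) and the single rule below are assumptions for the sketch, not part of the disclosure.

```python
from dataclasses import dataclass


@dataclass
class GameEvent:
    timestamp: float
    player_id: str
    kind: str          # e.g. "position", "damage_dealt", "chat"
    value: float


class GameCheatMonitoringEntity:
    """Hypothetical server-side entity that evaluates reported game windows."""

    # Illustrative rule: damage per event above this value is treated as suspicious.
    MAX_PLAUSIBLE_DAMAGE = 100.0

    def evaluate(self, game_info: list[GameEvent]) -> dict[str, bool]:
        """Recreate the reported window and flag players whose activity breaks the rules."""
        suspected = {}
        for event in game_info:
            if event.kind == "damage_dealt" and event.value > self.MAX_PLAUSIBLE_DAMAGE:
                suspected[event.player_id] = True
        return suspected

    def take_action(self, suspected: dict) -> None:
        for player_id in suspected:
            print(f"restricting access for {player_id}")  # or warning, score penalty, etc.


entity = GameCheatMonitoringEntity()
window = [GameEvent(0.0, "p1", "damage_dealt", 40.0),
          GameEvent(1.0, "p2", "damage_dealt", 5000.0)]  # implausible, likely a cheat
entity.take_action(entity.evaluate(window))
```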
  • Embodiments of the present invention also provide methods, systems, apparatus, and programs for allocating online or other network resources to monitor an online community.
  • a method of allocating online resources to monitor online community members that have been identified as engaging in inappropriate behavior includes receiving an indication that an online user may be engaging in inappropriate behavior. Capturing a time based history of an online session that includes the user's behavior. Recreating the online activity and determining if there was inappropriate activity by an offending online user.
  • the time based history of the online session includes capturing online session activity that occurred a predetermined amount of time before receiving the indication that an online user may be engaging in inappropriate behavior.
  • capturing the time based history includes associating online user identities with their online activity.
  • allocating online resources for a desired level of monitoring of the offending online user includes assigning online resources to track the activities of the offending online user.
  • Another embodiment includes a network resource allocation entity that captures the time based history.
  • a moderation entity captures the time based history, recreates the online activity, and communicates the desired level of monitoring of the offending user to a network resource allocation entity that allocates network resources.
  • a method of allocating online resources to monitor online community members that have been identified as engaging in inappropriate behavior includes receiving an indication of a triggering mechanism being activated by an online community member in response to suspected inappropriate activity by another online community member. Receiving a time based history of community members' online activity around a time of the triggering mechanism being activated. Recreating the community activity from the time based history. Evaluating activities of the community members to determine if there was inappropriate activity and if there is inappropriate activity by an offending community member allocating online resources to monitor community members that have been identified as engaging in inappropriate behavior.
  • an online community with online resources that are allocated to monitor members of the online community include at least two users that communicate in the online community, wherein a first user in the online community observes suspected inappropriate behavior by one or more other users in the online community, the first user presses a panic button in response to the inappropriate behavior, the pressing of the panic button initiating storing a time based history of online community activity, the time based history covering a period that extends a desired duration before the pressing of the panic button and a desired duration after pressing the panic button.
  • a moderation entity that receives the time based history and recreates the online activity to determine if there was inappropriate activity by one of the users, and if there was inappropriate activity by one of the users, determining a desired level of monitoring to track an offending user's activity.
  • a network resource allocation entity that allocates online resources to track the activities of the offending user.
  • a network entity includes a network interface that receives an indication that an online user may be engaged in inappropriate activity.
  • a processor that captures a time based history of online activity of users in an online community when the indication is received, recreates the online activity of the online community, and determines if there has been inappropriate online activity by one or more of the online users and, if there is inappropriate activity, allocates online resources to achieve a desired level of monitoring of an offending user.
  • the time based history of the online session includes online session activity that occurred a predetermined amount of time before the triggering mechanism is pressed.
  • the duration of the time based history can be set by the user, or it can be a predetermined period, or by a network entity, or by other techniques.
  • the time based history can include information that associates online user identities with their online activity.
  • a reward can be issued to a user that observes inappropriate behavior and presses the triggering mechanism.
  • An example of a triggering mechanism is a panic button.
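  • A minimal sketch, under assumed names (NetworkResourceAllocationEntity, monitor slots), of how a resource allocation entity might grant a desired level of monitoring for an offending user out of a limited pool, as described in the allocation embodiments above.

```python
class NetworkResourceAllocationEntity:
    """Hypothetical allocator that assigns monitoring resources to offending users."""

    def __init__(self, total_monitor_slots: int):
        self.total_monitor_slots = total_monitor_slots
        self.assignments: dict[str, int] = {}   # user_id -> monitoring slots assigned

    def allocate(self, user_id: str, desired_level: int) -> int:
        """Assign up to `desired_level` monitoring slots, bounded by what remains free."""
        used = sum(self.assignments.values())
        granted = min(desired_level, self.total_monitor_slots - used)
        if granted > 0:
            self.assignments[user_id] = self.assignments.get(user_id, 0) + granted
        return granted


# The moderation entity decides an offender needs closer monitoring and requests resources.
allocator = NetworkResourceAllocationEntity(total_monitor_slots=10)
allocator.allocate("offending_user", desired_level=3)
```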
  • Embodiments of the present invention also provide methods, systems, apparatus, and programs for improving application integrity.
  • a method for improving the integrity of an application includes interacting with the application. Observing unexpected operation of the application. Activating a triggering mechanism in response to the unexpected operation. Capturing a time-based history of the application session. Communicating the time-based history to a network entity for evaluation.
  • the application comprises testing an online game, and capturing the time-based history comprises capturing online game session activity that occurred a predetermined amount of time before the triggering mechanism is pressed.
  • activating a triggering mechanism includes pressing a panic button.
  • observing unexpected operation of the application comprises observing a glitch in the operation of the application.
  • the network entity includes a server, or a moderation entity, or other network entity.
  • communicating the time-based history includes transmitting the time-based history over a local area network, or a wide area network such as the Internet, or any combinations of networks.
  • a method of testing an online game includes receiving an indication of a triggering mechanism being activated in response to unexpected operation of an online game. Receiving a time-based history of online game activity around a time of the triggering mechanism being activated. Recreating game activity from the time-based history. Evaluating the game activity to determine if there is a malfunction in the operation of the game.
  • testing the online game includes troubleshooting the malfunction in the operation of the game.
  • an online game test unit includes a triggering mechanism.
  • the test unit also includes a processor that captures a time-based history of game activity when the triggering mechanism is activated.
  • the test unit includes a network interface that transmits the time-based history to a network entity, the network entity determines if there is a malfunction in the operation of the online game.
  • Figure 1 is a block diagram illustrating an exemplary architecture for moderating online user activity.
  • Figure 2 is a block diagram of another embodiment of network architecture for moderating online user activity.
  • Figure 3A is a block diagram of a peer-to-peer communication network illustrating aspects of community moderation.
  • Figure 3B is a block diagram illustrating a user indicating that there is inappropriate behavior by another user in the network of Figure 3A.
  • Figure 3C is a block diagram of the peer-to-peer network of Figure 3A showing the moderation entity 108 taking preventive action.
  • Figure 4A is a block diagram of a client server communication network illustrating aspects of community moderation.
  • Figure 4B illustrates the network of Figure 4A where the server transmits the audio chat message from the first user to other users.
  • Figure 4C illustrates the network of Figure 4A where a user sends an inappropriate message.
  • Figure 4D illustrates the network of Figure 4A showing the server taking appropriate action for the inappropriate message sent by a user.
  • Figure 5 is a flowchart illustrating a method of detecting and preventing inappropriate online activity.
  • Figure 6 is a flowchart of another embodiment of detecting inappropriate online behavior.
  • Figure 7 is a flowchart illustrating aspects of taking appropriate action in response to inappropriate activity.
  • Figure 8 is a flow diagram illustrating an embodiment of using community moderation to prevent cheating in an online video game.
  • Figure 9 is a flow diagram illustrating aspects of moderating online behavior.
  • Figure 10 is a flow diagram of another embodiment of evaluating user online activity.
  • Figure 11 is a block diagram of a test environment.
  • Figure 12A is a flow diagram of an online test environment as illustrated in Figure 11.
  • Figure 12B is a flow diagram of an embodiment of a test environment as illustrated in Figure 11.
  • Figure 13 is a table illustrating examples of different types of actions that can be taken in response to a user's inappropriate behavior.
  • Figure 14 is a block diagram illustrating an example network enabled device that may be used in connection with various embodiments described herein.
  • Figure 15 is a block diagram illustrating an example game cheat monitoring entity that may be used in connection with various embodiments described herein.
  • Figure 16 is a flowchart illustrating an embodiment of detecting cheating in an online environment.
  • Figure 17 is a flowchart illustrating another embodiment of detecting cheating in an online environment.
  • Figure 18 is a block diagram of another embodiment of a moderation entity that can allocate resources.
  • Figure 19 is a flow chart illustrating aspects of online or other network resource allocations.
  • Figure 20 is a flow chart illustrating additional aspects of allocating online or other network resources.
  • FIG. 1 is a block diagram illustrating an exemplary architecture for moderating online user activity.
  • one or more users or clients 102a-c are in communication with a network 104.
  • the users 102a-c communicate via the network with each other in an ad hoc communication network.
  • the users communicate via the network with a server 106.
  • the users 102 may use a network enabled device, such as a game console such as a Sony PlayStation 3, a laptop computing device, a portable game device such as a PlayStation Portable, a desktop computing device, a cellular telephone, or any other device capable of interfacing to the communication network 104.
  • the architecture includes a moderation entity 108 which is also in communication with the network 104.
  • the moderation entity 108 can be used to take appropriate action if one of the users 102a-c is engaged in inappropriate or unacceptable behavior. For example, as discussed further below, the moderation entity 108 may interrupt communications from one user to another or may restrict an offending user's access to the network for a desired period of time.
  • the moderation entity 108 is a separate network node. In other embodiments, the moderation entity 108 may be incorporated within another network node, such as one or more of the users 102a-c or the server 106 or other network entity.
  • the designations of users 102a-c, server 106, and moderation entity 108 are merely for convenience of understanding various embodiments.
  • embodiments of the present invention may be implemented in the context of a peer-to-peer network, a client server network, or within a peer group. Therefore, in some instances a client or user may function as a server or moderation entity and vice versa, depending on the timing and the nature of the data exchange.
  • various clients in a peer-to-peer network may each comprise a portion of an online activity such as a virtual reality and may send and receive data related to the online activity.
  • any reference to a user or a server or a moderation entity is meant to be inclusive of operations performed by one or any of the operating entities unless specified otherwise by specific limitations.
  • a device with user/server functionality may be referred to in a generic moniker such as network node, computing node or network device.
  • server and moderation entity may each be considered network computing nodes or a network device.
  • one user 102c may monitor the activity of other online users 102a and 102b as they interact in the online environment. When the user 102c believes one of the other users 102a and 102b is engaged in conduct inappropriate for the online environment, the user can, for example, press a panic button or provide some other indication that inappropriate activity is taking place.
  • while this discussion describes one user 102c monitoring other users 102a-b, in other embodiments all users monitor the activities of all other users. In other embodiments, selected users or groups of users can be authorized to monitor other online users.
  • each user device 102 that is monitoring online activity includes a buffer or other type of memory in which a duration of all of the monitored users' activity in the online environment is stored. In this way, when the panic button is pressed, the contents of the buffer, which include a period of time prior to the pressing of the panic button as well as a desired period of time following the pressing of the panic button, are sent to the moderation entity 108 for evaluation.
  • the duration of the time based history can be set by the user, or it can be a predetermined period, or by a network entity, or by other techniques.
  • the moderation entity receives the stored online activity of the users.
  • the moderation entity 108 evaluates the online activity against a set of norms or rules that have been pre-established. If the moderation entity 108 determines that one of the users' behavior is inappropriate, the moderation entity 108 can take appropriate action. For example, if a user is using offensive language the moderation entity 108 could disable that user's microphone. In another example, the moderation entity 108 could warn the user to stop using the offensive language or the moderation entity 108 could restrict the user and only allow the user to access portions of the online environment where the language is acceptable, such as an adult only portion of the environment, or the users could be restricted from the online environment entirely.
  • the moderation entity 108 could warn the user to stop the cheating activity or the moderation entity 108 could restrict the user and not allow the cheating user to participate in the game.
  • users that identify inappropriate behavior can be rewarded. For example, if a user identifies a cheater in a game, the user can be given a reward. Rewards encourage users to identify inappropriate behavior, such as cheating, and because appropriate action is taken, the online experience for all of the other users is improved. Of course, users could abuse the reward feature by identifying others that are not involved in inappropriate behavior. To discourage these types of false identifications, a user can receive demerits for making false identifications.
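  • As an illustration only, the reward and demerit bookkeeping described above might be tracked per reporter as sketched below in Python. The class name ReporterLedger and the one-point reward/demerit values are assumptions for the sketch, not part of the disclosure.

```python
class ReporterLedger:
    """Hypothetical bookkeeping of rewards for valid reports and demerits for false ones."""

    def __init__(self):
        self.scores: dict[str, int] = {}   # reporter_id -> accumulated reward score

    def resolve_report(self, reporter_id: str, report_was_valid: bool) -> int:
        # A valid identification of inappropriate behavior earns a reward;
        # a false identification earns a demerit to discourage abuse of the feature.
        delta = +1 if report_was_valid else -1
        self.scores[reporter_id] = self.scores.get(reporter_id, 0) + delta
        return self.scores[reporter_id]


ledger = ReporterLedger()
ledger.resolve_report("user_102c", report_was_valid=True)    # reward
ledger.resolve_report("user_102b", report_was_valid=False)   # demerit
```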
  • FIG. 2 is a block diagram of another embodiment of network architecture for moderating online user activity.
  • multiple users 102a, 102b, and 102c are in communication with a network 104.
  • Also in communication with the network is a server 106.
  • moderation entities 108a through 108n are each configured to evaluate a specific type of inappropriate behavior.
  • one moderation entity could be configured to evaluate offensive language in the online environment.
  • a different moderation entity can be configured to evaluate cheating activity in an online game.
  • Still another moderation entity can be configured to evaluate online illegal activity such as distribution of pornographic or other illegal materials.
  • other moderation entities are configured to evaluate other types of inappropriate online behavior. Similar to the communication network of Figure 1, once the inappropriate online activity has been determined by the moderation entity, appropriate action can be taken.
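  • A minimal sketch, assuming invented names (ModerationEngine, route_report) and trivial string-matching rules, of how reports might be routed to moderation entities or engines specialized by behavior type as described above; the actual evaluation logic is not specified by the patent.

```python
class ModerationEngine:
    """Base class for an engine specialized in one type of inappropriate behavior."""
    def evaluate(self, time_based_history: list) -> bool:
        raise NotImplementedError


class ProfanityEngine(ModerationEngine):
    BANNED = {"<profanity>"}   # placeholder word list
    def evaluate(self, time_based_history):
        return any(word in self.BANNED
                   for entry in time_based_history for word in entry.split())


class CheatEngine(ModerationEngine):
    def evaluate(self, time_based_history):
        return any("impossible_state" in entry for entry in time_based_history)


class IllegalContentEngine(ModerationEngine):
    def evaluate(self, time_based_history):
        return any("illegal_content" in entry for entry in time_based_history)


# Reports are dispatched to the engine matching the reported behavior type.
ENGINES = {
    "offensive_language": ProfanityEngine(),
    "cheating": CheatEngine(),
    "illegal_activity": IllegalContentEngine(),
}


def route_report(report_type: str, history: list) -> bool:
    engine = ENGINES.get(report_type)
    return engine.evaluate(history) if engine else False


route_report("offensive_language", ["hello", "<profanity> everyone"])
```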
  • FIG. 3A is a block diagram of a peer-to-peer communication network illustrating aspects of community moderation.
  • the community includes three users 102a, 102b and 102c in communication with each other through the communication network 104. Also in communication with the network 104 is the moderation entity 108.
  • the first user 102a is communicating by sending voice messages to the other users 102b and 102c.
  • the voice message sent by the first user 102a includes inappropriate or profane language.
  • Figure 3B is a block diagram illustrating a user indicating that there is inappropriate behavior by another user in the network of Figure 3A.
  • the user 102c presses a panic button to indicate there is inappropriate behavior.
  • the third user 102c, upon hearing the inappropriate and profane message from the first user 102a, presses a panic button or other triggering device to indicate that inappropriate behavior is occurring or has occurred.
  • While engaged in the online activity, the users' network enabled devices have been buffering a time segment, or time based history, of online activity, thereby recording the online activity of all of the monitored users in the community.
  • a buffer in the third user's device 102c has a sliding window of memory that is always recording a portion of previous online activity by the users.
  • a message sent to the moderation entity 108 can include an indication of the type of offensive or inappropriate behavior that the third user 102c is reporting.
  • Examples of the type of online activity that can be buffered include a time-based history of online activity such as text chat, audio chat, the state of the characters and/or online participants as well as other types of online activity.
  • the sights and sounds of Avatars that are engaged in an online game can be captured and stored in the time-based history. The moderation entity 108 can then evaluate the time-based history of the online activity of the users and determine if the first user's 102a behavior is inappropriate, such as if the first user is cheating.
  • Figure 3C is a block diagram of the peer-to-peer network of Figure 3A showing the moderation entity 108 taking preventive action.
  • the moderation entity 108 can take preventive action. For example, the moderation entity 108 can send a warning to the first user 102a indicating that their behavior is inappropriate and to not engage in such behavior in the future. Other types of preventive action can also be taken.
  • the moderation entity 108 can send a command to the first user's 102a device and disable the first user's 102a communication capability such as disabling the first user's microphone.
  • the moderation entity 108 could take actions such as cutting off the offending user's subscription so that they can no longer engage in the online activity.
  • the moderation entity 108 could also add or increase monitoring of a particular user who has been engaged in inappropriate activity. In other embodiments these types of corrective actions can be used individually or in any combination.
  • While Figures 3A-C show three users, in other embodiments there may be a different number of users. Also, different numbers and groups of users may monitor and be monitored in other embodiments.
  • Figure 4A is a block diagram of a client server communication network illustrating aspects of community moderation.
  • three users 102a, 102b and 102c use network enabled devices to communicate through a server 106 while engaged in an online activity.
  • the first user 102a is engaged in an audio chat session with second and third users, 102b and 102c.
  • the audio message from 102a is routed to the server 106.
  • Figure 4B illustrates the network of Figure 4A where the server transmits the audio chat message from the first user to other users.
  • the server 106 transmits the audio chat message from the first user 102a to the second and third users, 102b and 102c.
  • the first user's message can be transmitted to one other user or to any number of other users.
  • Figure 4C illustrates the network of Figure 4A where a user sends an inappropriate message.
  • the first user 102a sends an audio chat message intended for the second and third users, 102b and 102c, and the message includes inappropriate content.
  • Figure 4D illustrates the network of Figure 4A showing the server taking appropriate action for the inappropriate message sent by a user.
  • the server 106 detects the audio message sent by the first user 102a and determines that it is inappropriate. Because the message includes inappropriate material, the server 106 does not transmit it to the second and third users, 102b and 102c.
  • the server 106 can also take other actions such as warning the first user 102a that his audio message and behavior is inappropriate, cutting off the subscription of the first user, as well as additional or increased monitoring of the first user, and other types of actions.
  • functionality of the moderation entity has been incorporated into the server 106.
  • the functionality of the moderation entity can be incorporated into other network entities, for example, a user device, or other network device.
  • for example, the user device can be a smart phone or other network enabled device.
  • While the examples illustrated in Figures 4A-D show three users, in other embodiments there may be a different number of users. Also, different numbers and groups of users may monitor and be monitored in other embodiments.
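  • The following is a hypothetical sketch of the server-side screening described for Figures 4A-D: the server examines a message before relaying it and withholds it if it is inappropriate, optionally warning the sender. The word list, ModeratingServer class, and warning text are invented for illustration.

```python
PROFANITY = {"<profane_word>"}          # placeholder word list


def is_inappropriate(message: str) -> bool:
    return any(word in PROFANITY for word in message.lower().split())


class ModeratingServer:
    """Hypothetical server that screens chat before relaying it to other users."""

    def __init__(self):
        self.outbox: list[tuple[str, str]] = []   # (recipient, message) actually delivered

    def relay(self, sender: str, recipients: list[str], message: str) -> None:
        if is_inappropriate(message):
            # Withhold the message and warn the sender instead of forwarding it.
            self.outbox.append((sender, "warning: your message violated community standards"))
            return
        for recipient in recipients:
            self.outbox.append((recipient, message))


server = ModeratingServer()
server.relay("user_102a", ["user_102b", "user_102c"], "hello everyone")
server.relay("user_102a", ["user_102b", "user_102c"], "<profane_word> you all")
```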
  • FIG. 5 is a flowchart illustrating a method of detecting and preventing inappropriate online activity.
  • Flow begins in block 502 where an online user observes offensive or inappropriate behavior.
  • the types of behavior that are considered offensive or inappropriate can be based on an individual user's perception of inappropriate behavior, or based on community norms of what is appropriate and inappropriate behavior.
  • Various techniques for establishing what is appropriate and inappropriate behavior are disclosed in US Patent Application No. 11/502,265 filed August 9, 2006, and entitled "Dynamic Rating of Content" which is incorporated herein by reference in its entirety.
  • Flow continues to block 504 where a user presses the panic button or performs another action in response to observing the offensive or inappropriate online behavior.
  • a time-based history of all the community members' activity is captured.
  • the time-based history can be stored in a user's device and includes a sliding window of online activity.
  • a portion of the past online activity is continually recorded in a buffer such that when the panic button is pressed, the previous online activity is stored as well as the present and a portion of a future period of online activity. In this way evidence indicating a user's inappropriate or offensive online activity is captured in the time-based history.
  • the time-based history is sent to a moderation entity.
  • an optional indication of the type of offensive behavior can also be sent to the moderation entity. For example, an indication can be sent showing that the user believes the inappropriate activity is offensive language or illegal activity such as online pornography or a player cheating in a game or other inappropriate activity.
  • the moderation entity evaluates the time-based history to determine if the activity is offensive or inappropriate.
  • the time-based history could be routed to a particular engine within the moderation entity or to an appropriate moderation entity based upon the types of activity.
  • one moderation entity, or engine within a moderation entity can be optimized to identify and take appropriate action for a particular type of inappropriate activity, for example, profane language.
  • a different engine or moderation entity can be optimized to detect and take action for other types of inappropriate activity, for example, illegal online activity or game cheats or the like.
  • the moderation entity takes appropriate action.
  • if the moderation entity determines that the activity is not inappropriate, it may take no action. If the moderation entity determines that the behavior is offensive or inappropriate, then the moderation entity can take appropriate action. For example, the moderation entity could warn the user about his behavior, or it could cut off the user's subscription, or increase or add monitoring to track the online activities of the offending user.
  • the user reporting the activity may receive a reward. If it is determined that there is no inappropriate activity, then the user reporting the activity may receive demerits. In this way users are encouraged to report inappropriate activity while being discouraged from making false reports.
  • FIG. 6 is a flowchart of another embodiment of detecting inappropriate online behavior.
  • Flow begins in block 602 where a user joins an online community activity. For example, a user could join an online game activity, or they could engage in online virtual reality sessions or other online activities, such as the Sony Home® environment.
  • Flow continues to block 604 where the user interacts with other members of the online community.
  • Flow then continues to block 606 where the user becomes aware of inappropriate activity of one of the other community members.
  • Flow then continues to block 608 where the user presses a panic button or otherwise indicates that inappropriate activity has been observed.
  • Flow then continues to block 610 where a time-based history of the inappropriate activity of the online environment is captured and sent to a moderation entity.
  • FIG. 7 is a flowchart illustrating aspects of taking appropriate action in response to inappropriate activity.
  • the action may be taken by a network entity such as a moderation entity 108 or server 106 in Figures 1 and 2.
  • Flow begins in block 702 where an indication of the occurrence of inappropriate activity, such as the pressing of a panic button, is received.
  • Flow then continues to block 704 where a time-based history of the online community members' activity is received.
  • the online community members' activity is evaluated.
  • any inappropriate activity recorded in the time-based history of the online community is identified.
  • users reporting inappropriate activity can receive rewards while users making false reports can receive demerits.
  • FIG. 8 is a flow diagram illustrating an embodiment of using community moderation to prevent cheating in an online video game.
  • preventing cheating in an online video game may be accomplished by a network entity such as a moderation entity 108 or server 106 in Figures 1 and 2.
  • Flow begins in block 802 where an online game user observes questionable game play of one of the other participants.
  • Flow continues to block 804 where the user observing the questionable play indicates that they believe another player may be cheating by, for example, pressing the panic button or a triggering mechanism, or providing another type of indication.
  • Flow then continues to block 806 where a time-based history of the online game members' activity is captured.
  • the time-based history includes a duration of game play that has been stored prior to the pressing of the panic button as well as a period of game play following the pressing of the panic button. In this way a sliding window of time surrounding the pressing of the panic button has been recorded.
  • Types of activity that can be included in the time-based history include text chat, audio chat, the state of all characters, their positions, and any other data that will be useful in recreating the online environment.
  • the history is sent to a moderation entity.
  • an optional indication of the type of inappropriate behavior observed is also included. For example, if a player has observed the suspected cheating player disappearing, or having exceptional strength, or being resistant to attacks from other players, that information can be included and sent along with the time-based history.
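  • A minimal sketch of how the contents of such a time-based history (text chat, references to audio chat, character state and positions, plus the optional behavior indication) might be structured. The dataclass names HistoryFrame, CharacterState, and TimeBasedHistory are assumptions for the sketch only.

```python
from dataclasses import dataclass, field
from typing import Optional


@dataclass
class CharacterState:
    player_id: str
    position: tuple[float, float, float]
    health: float
    strength: float


@dataclass
class HistoryFrame:
    """One sampled moment of the online game used to recreate the session later."""
    timestamp: float
    text_chat: list[str] = field(default_factory=list)
    audio_chat_refs: list[str] = field(default_factory=list)   # references to recorded clips
    characters: list[CharacterState] = field(default_factory=list)


@dataclass
class TimeBasedHistory:
    frames: list[HistoryFrame]
    reported_behavior: Optional[str] = None   # e.g. "disappearing", "exceptional_strength"


history = TimeBasedHistory(
    frames=[HistoryFrame(0.0, text_chat=["gg"], characters=[
        CharacterState("p2", (0.0, 0.0, 0.0), health=100.0, strength=9999.0)])],
    reported_behavior="exceptional_strength",
)
```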
  • the moderation entity evaluates the online behavior of the game participants. Using the time-based history, the moderation entity can play back the scenario leading up to the pressing of the panic button. In this way it can be determined whether or not someone was cheating.
  • Various techniques for detecting cheating in an online game are described in pending US Patent Application Serial No. 11/386,039 entitled “Active Validation of Network Devices” filed March 20, 2006, Serial No. 11/415,881 entitled “Passive Validation of Network Devices” filed May 1, 2006, Serial No. 11/449,141 entitled “Game Metrics” filed on June 7, 2006, and Serial No.
  • the moderation entity can take appropriate action based on the severity of the inappropriate behavior. In one embodiment, if no inappropriate behavior is detected, then the moderation entity will take no action. In other embodiments, if inappropriate behavior is detected, then the moderation entity can take any of a range of appropriate actions including warning, cutting off a user's subscription, adding increased monitoring, or any combination of the above. Optionally a user reporting cheating can receive rewards while a user making false reports can receive demerits.
  • Figures 3 through 7 describe embodiments associated with inappropriate online activity such as offensive language.
  • the same techniques can be applied to prevent cheating in online gaming.
  • a user could detect suspected cheating in an online game environment and report that to the moderation entity where appropriate action will be taken.
  • in the server/client-based architecture of Figures 4A-D, the server could detect suspected online cheating by a user and take appropriate action.
  • in Figures 5 to 7, the offensive or inappropriate behavior could be cheating in an online game environment.
  • Figure 9 is a flow diagram illustrating aspects of moderating online behavior.
  • the aspects of Figure 9 can be implemented by a moderation entity or a server as illustrated in Figures 1 and 2.
  • Flow begins in block 902 where an indication that inappropriate behavior has been observed, such as that a panic button has been pressed, is received.
  • Flow continues to block 904 where a time-based history of activity of community members around the time the panic button was pressed is received.
  • community members' activity is evaluated to determine if it is inappropriate activity. Inappropriate activity could include profane or inappropriate language, distribution of, or showing of, pornography to other online users, cheating in an online game, and the like.
  • In block 910 the complaint against the user is logged in the user's file. This user file may be maintained to keep track of the number of indications from other users who believe there was inappropriate activity being performed by the suspect user. Flow then continues to block 912, where the number of complaints is compared against a predetermined value or threshold. If it is determined that the number of complaints against this user does not exceed the threshold level, flow continues back to block 902 and the system waits for the next pressing of a panic button. Returning to block 912, if it is determined that the number of complaints exceeds the threshold, then flow continues to block 914.
  • In block 916 the user's file is updated to indicate that there is inappropriate activity or that an action has been taken.
  • the user file may indicate that a warning has been issued to this user about his activity.
  • if a later action is taken in block 916 against the same user, the severity of the action may be increased in view of the previous action taken.
  • the number of complaints logged for a particular type of behavior can be used to modify the standards and rules set used by the moderation entity in evaluating behavior. For example, if a particular type of behavior is not originally considered inappropriate, but the majority of other online users find a particular activity to be inappropriate, as indicated by a large number of complaints for that activity, the moderation entity can modify the standards that it evaluates activity against and set this new activity as being inappropriate. In this way, as the community changes and evolves over time, the standards by which activity is considered inappropriate will evolve with the community.
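  • The following sketch, under assumed names and threshold values (COMPLAINT_THRESHOLD, STANDARDS_THRESHOLD), illustrates the two ideas in the Figure 9 flow just described: counting complaints in a user's file against a threshold, and letting community-wide complaint volume evolve the set of prohibited behaviors.

```python
from collections import Counter

COMPLAINT_THRESHOLD = 3          # complaints against one user before action is taken
STANDARDS_THRESHOLD = 50         # community complaints before a behavior becomes prohibited

user_complaints: Counter = Counter()      # user_id -> complaints logged in the user's file
behavior_complaints: Counter = Counter()  # behavior type -> complaints across the community
prohibited_behaviors = {"offensive_language", "cheating", "illegal_activity"}


def log_complaint(user_id: str, behavior: str) -> str:
    user_complaints[user_id] += 1
    behavior_complaints[behavior] += 1

    # If enough of the community objects to a behavior, the standards evolve to prohibit it.
    if behavior_complaints[behavior] >= STANDARDS_THRESHOLD:
        prohibited_behaviors.add(behavior)

    if user_complaints[user_id] >= COMPLAINT_THRESHOLD:
        return "take_action"      # e.g. warn, restrict, or escalate based on prior actions
    return "log_only"


log_complaint("user_102a", "derogatory_language")
```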
  • Figure 10 is a flow diagram of another embodiment of evaluating user online activity.
  • the aspects of Figure 10 can be implemented by a moderation entity or a server as illustrated in Figures 1 and 2.
  • Flow begins in block 1002 where an indication that inappropriate activity has taken place, such as that a panic button has been pressed, is received.
  • Flow continues to block 1004 and a time-based history of activity of the community members around the time the panic button was pressed is received. This time-based history can include data used to recreate the online activity around the time the panic button was pressed so that a moderator can evaluate if the online activity of a particular user is inappropriate or not.
  • the threshold could be set such that the first time a particular inappropriate activity occurs, an appropriate action is taken. For example, if there is an illegal activity such as pornography or some other illegal behavior, flow will continue to block 1008 where appropriate action is immediately taken due to the severity of the activity.
  • a level of monitoring of a particular user may be adjusted. For example, the level of monitoring could be increased such that this particular offending user's activity online is monitored at all times by the moderation entity.
  • the user's file is also updated to indicate his inappropriate activity.
  • Adjusting the level of monitoring allows a system with limited resources to more effectively allocate those resources across the community members. For example, if there is a large community with many members, the moderation entity may not be able to monitor all of the members' online activity. By increasing the level of monitoring of particular individuals that have been identified as engaging in inappropriate behavior, limited system resources can be applied more effectively.
  • the inappropriate activity does not exceed a threshold, then flow continues to block 1010.
  • the member's file is evaluated to see if there have been previous complaints against this particular member.
  • the accumulated inappropriate activity is evaluated to see if it exceeds a threshold. If the accumulated inappropriate activity by this particular member does not exceed the threshold, flow continues to block 1014.
  • the level of monitoring of this user can be adjusted. For example, the level of monitoring can be increased to more closely monitor the particular member's activities.
  • the member's file is updated to indicate that there is possible inappropriate behavior.
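  • As an illustration of the Figure 10 flow described above, the sketch below distinguishes severe activity that is acted on immediately from activity whose severity accumulates in the member's file, and adjusts the monitoring level in either case. The severity scores and thresholds are invented for the example.

```python
SEVERITY = {"illegal_activity": 10, "cheating": 5, "offensive_language": 3}
IMMEDIATE_ACTION_THRESHOLD = 8       # severe behavior acted on the first time it is seen
ACCUMULATED_THRESHOLD = 9            # accumulated severity before action is taken

member_files: dict[str, int] = {}    # member_id -> accumulated severity from prior complaints
monitoring_level: dict[str, int] = {}


def handle_report(member_id: str, behavior: str) -> str:
    severity = SEVERITY.get(behavior, 1)
    if severity >= IMMEDIATE_ACTION_THRESHOLD:
        # e.g. pornography or other illegal acts: act immediately and monitor closely.
        monitoring_level[member_id] = monitoring_level.get(member_id, 0) + 2
        return "immediate_action"

    member_files[member_id] = member_files.get(member_id, 0) + severity
    if member_files[member_id] >= ACCUMULATED_THRESHOLD:
        monitoring_level[member_id] = monitoring_level.get(member_id, 0) + 2
        return "action_on_accumulation"

    # Below both thresholds: just watch this member more closely and update the file.
    monitoring_level[member_id] = monitoring_level.get(member_id, 0) + 1
    return "increase_monitoring_only"


handle_report("user_102a", "offensive_language")   # accumulates in the member's file
handle_report("user_102b", "illegal_activity")     # acted on immediately
```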
  • Figure 11 is a block diagram of a test environment.
  • Figure 11 can be a test environment for testing of an online game or other online application.
  • there are multiple testers 1102A, 1102B and 1102C. In other embodiments, there may be any desired number of testers, for example, one, two, or any number of testers.
  • These online testers communicate with a network 1104 and a server 1106. As the testers interact and evaluate the online activity, they will find bugs or glitches which they wish to report to the server for troubleshooting and updating the application. When one of the testers comes across a glitch, he can trigger an indication, such as pressing a panic button, which will record the online environment for a duration around the time the panic button was pressed. For example, the duration of time can extend from before the button was pressed until a desired period of time after the button was pressed.
  • the testers communicate with a network 1104.
  • the network 1104 can be a local area network, a wide area network such as the Internet, or other type of network.
  • Also in communication with the network are other network entities.
  • a server 1106, or a moderation entity 1108, or other network entities, in any combination can be in communication with the network 1104.
  • a tester 1102a includes a network interface 1110, a processor 1112, and a triggering mechanism 1114, such as a panic button.
  • the triggering mechanism 1114 can be pressed and the processor 1112 captures a time-based history of activity, such as game activity, when the triggering mechanism is activated.
  • the time- based history can be communicated via the network interface 1110 to another network entity.
  • the time -based history can be communicated to the server 1106, or the moderation entity 1108, or other network entity.
  • the testers will find bugs or glitches which they wish to report to the server, for trouble shooting and updating of the application.
  • When testers come across a glitch, they can trigger a mechanism, such as pressing a panic button, to provide an indication of the glitch.
  • a time-based history of the test environment is recorded for a duration around the time the triggering mechanism was activated. For example, the duration of time can extend from before the triggering mechanism was activated until a period of time after the triggering mechanism was activated. In this way the activity and parameters of the application can be captured for evaluation as to the cause of the glitch.
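  • A minimal sketch, under assumed names (GameTestClient, report_glitch) and a fixed 30-second window, of how a tester client might keep a sliding trace of game state and package the window around a reported glitch for the server; the disclosure does not prescribe this structure.

```python
import time
from collections import deque


class GameTestClient:
    """Hypothetical tester client that captures the window around a reported glitch."""

    def __init__(self, window_seconds: float = 30.0):
        self.window_seconds = window_seconds
        self.trace = deque()          # sliding window of (timestamp, game state / log line)

    def log_state(self, state: str) -> None:
        now = time.time()
        self.trace.append((now, state))
        # Discard anything older than the configured window.
        while self.trace and self.trace[0][0] < now - self.window_seconds:
            self.trace.popleft()

    def report_glitch(self, description: str) -> dict:
        """Called when the tester presses the panic button on seeing a glitch."""
        return {
            "description": description,
            "trigger_time": time.time(),
            "time_based_history": list(self.trace),   # state leading up to the trigger
        }


client = GameTestClient()
client.log_state("player walks through wall")   # unexpected operation observed
bug_report = client.report_glitch("collision glitch near spawn point")
# send_to_server(bug_report)  # for troubleshooting and updating the application
```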
  • Figure 12A is a flow diagram of an online test environment as illustrated in Figure 11.
  • Flow begins in block 1202 where testers engage in testing of an online environment or application.
  • Flow continues to block 1204 where a tester identifies an instance of interest during testing. For example, they may identify a glitch or some discontinuity in the application which they wish to report.
  • Flow continues to block 1206 where the tester presses the panic button at the time of the point of interest.
  • Flow then continues to block 1208 where a time-based history of the online environment during the testing activity is captured.
  • the time-based history is a sliding window of memory beginning before the pressing of the panic button through and after pressing of the panic button.
  • FIG. 12B is a flow diagram of another embodiment of a test environment as illustrated in Figure 11.
  • Flow begins in block 1212 where testers engage in testing of an application.
  • the application can be a non-online game, an online game, or other application.
  • Flow continues to block 1214 where a tester identifies an instance of interest during testing.
  • the tester may identify a glitch or some discontinuity in the application which they wish to report.
  • Flow continues to block 1216 where the tester activates a triggering mechanism.
  • the tester can press a panic button, or other type of mechanism to indicate the time of the point of interest.
  • a time-based history of the online environment during the testing activity is captured.
  • the time-based history is a sliding window of memory beginning before the activating of the triggering mechanism through and after activating the triggering mechanism.
  • the time-based history is communicated to a server via a local area network.
  • the time- based history is communicated to a server via a wide area network, such as the Internet.
  • the time-based history is used for troubleshooting of the application.
  • Figure 13 is a table indicating examples of possible actions that can be taken against a user as a result of the user's inappropriate behavior.
  • the table shown in Figure 13 has a first column 1302 listing different types of inappropriate behavior and a second column 1304 listing different possible actions that can be taken for each type of behavior.
  • a first type of inappropriate behavior 1306 is behavior that falls outside of predetermined community standards. Examples of this type of behavior can be use of profane language, racial or ethnic slurs, certain types of gestures, and other types of behaviors that the community has identified as unacceptable. Examples of possible actions 1308 that can be taken in response to these types of behaviors include issuing a warning, cutting off voice messaging capability, cutting off a user's subscription to the online activity, increasing the monitoring of an offending user, restricting access to portions of the online activity such as restricting access to portions of the online environment where children tend to visit, and the like.
  • a second type of inappropriate behavior 1310 listed in Figure 13 is cheating in an online game.
  • Examples of possible actions 1312 that can be taken in response to cheating in an online game include issuing a warning, decreasing a player's abilities in the game, penalizing the player such as decreasing their score, restricting a player's access to game options such as not letting a player use particular game options, cutting off a player's subscription to the online game, increasing the monitoring of the cheater, and the like.
  • a third type of behavior 1314 listed in Figure 13 is questionable behavior. This type of behavior includes behavior that may not violate community standards, but many of the members of the community may complain about the behavior. Examples of this type of behavior may include derogatory language, or suspicious, or distrustful, behavior.
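  • The behavior-to-action mapping of Figure 13 could be expressed as a simple lookup, as sketched below; the escalation by offense count and the exact action names are assumptions added for illustration.

```python
# Behavior categories and candidate responses, loosely following the table of Figure 13.
ACTIONS_BY_BEHAVIOR = {
    "outside_community_standards": [     # profanity, slurs, offensive gestures
        "issue_warning",
        "disable_voice_messaging",
        "cut_off_subscription",
        "increase_monitoring",
        "restrict_access_to_areas",      # e.g. areas where children tend to visit
    ],
    "cheating_in_online_game": [
        "issue_warning",
        "decrease_player_abilities",
        "decrease_score",
        "restrict_game_options",
        "cut_off_subscription",
        "increase_monitoring",
    ],
    "questionable_behavior": [           # derogatory, suspicious, or distrustful behavior
        "log_complaint",
        "increase_monitoring",
    ],
}


def choose_action(behavior: str, offense_count: int) -> str:
    """Pick an escalating action: repeat offenses move further down the action list."""
    actions = ACTIONS_BY_BEHAVIOR.get(behavior, ["log_complaint"])
    return actions[min(offense_count, len(actions) - 1)]


choose_action("cheating_in_online_game", offense_count=0)   # "issue_warning"
choose_action("cheating_in_online_game", offense_count=5)   # "increase_monitoring"
```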
  • FIG. 14 is a block diagram illustrating an example network enabled device 1450 that may be used in connection with various embodiments described herein.
  • the network enabled device 1450 may include one or more processors, such as processor 1452.
  • Additional processors may be provided, such as an auxiliary processor to manage input/output, an auxiliary processor to perform floating point mathematical operations, a special-purpose microprocessor having an architecture suitable for fast execution of signal processing algorithms (e.g., digital signal processor), a slave processor subordinate to the main processing system (e.g., back-end processor), an additional microprocessor or controller for dual or multiple processor systems, or a coprocessor, for example if parallel processing is to be implemented.
  • auxiliary processors or coprocessors may be discrete processors or may be integrated with the processor 1452.
  • the processor 1452 may be connected to a communication bus 1454.
  • the communication bus 1454 may include a data channel for facilitating information transfer between storage and other peripheral components of the computer system 1450.
  • the communication bus 1454 further may provide a set of signals used for communication with the processor 1452, including a data bus, address bus, and control bus (not shown).
  • the communication bus 1454 may comprise any standard or non-standard bus architecture such as, for example, bus architectures compliant with industry standard architecture ("ISA"), extended industry standard architecture ("EISA”), Micro Channel Architecture (“MCA”), peripheral component interconnect (“PCI”) local bus, or standards promulgated by the Institute of Electrical and Electronics Engineers (“IEEE”) including IEEE 488 general-purpose interface bus (“GPIB”), IEEE 696/S-100, and the like.
  • the network enabled device 1450 may also include a main memory 1456 and may also include a secondary memory 1458.
  • the main memory 1456 can provide a buffer to store online activity during an online session.
  • the buffer can provide a sliding window of memory that stores online activity of users in the online session.
  • the duration of the online session that is saved can be predetermined, set by a user, adjusted under program control, or determined by other techniques.
  • the main memory 1456 can also provide storage of instructions and data for programs executing on the processor 1452.
  • the main memory 1456 is typically semiconductor-based memory such as dynamic random access memory (“DRAM”) and/or static random access memory (“SRAM”).
  • Other semiconductor-based memory types include, for example, synchronous dynamic random access memory (“SDRAM”), Rambus dynamic random access memory (“RDRAM”), ferroelectric random access memory (“FRAM”), and the like, including read only memory (“ROM”).
  • the secondary memory 1458 may optionally include a hard disk drive 1460 and/or a removable storage drive 1462, for example a floppy disk drive, a magnetic tape drive, a compact disc (“CD”) drive, a digital versatile disc (“DVD”) drive, a memory stick, etc.
  • the removable storage drive 1462 reads from and/or writes to a removable storage medium 1464 in a well-known manner.
  • Removable storage medium 1464 may be, for example, a CD, DVD, a flash drive, a memory stick, etc.
  • the removable storage medium 1464 is typically a computer readable medium having stored thereon computer executable code (i.e., software) and/or data.
  • the computer software or data stored on the removable storage medium 1464 may be read into the computer system 1450 as electrical communication signals 1478.
  • secondary memory 1458 may include other similar means for allowing computer programs or other data or instructions to be loaded into the computer system 1450. Such means may include, for example, an external storage medium 1472 and an interface 1470. Examples of external storage medium 1472 may include an external hard disk drive, an external optical drive, or an external magneto-optical drive.
  • Other examples of secondary memory 1458 may include semiconductor-based memory such as programmable read-only memory ("PROM"), erasable programmable read-only memory ("EPROM"), electrically erasable read-only memory ("EEPROM"), or flash memory (block oriented memory similar to EEPROM).
  • the network enabled device 1450 may also include a communication interface 1474.
  • the communication interface 1474 allows software and data to be transferred between the network enabled device 1450 and external devices, networks, or information sources. For example, computer software or executable code may be transferred to network enabled device 1450 from a network entity via communication interface 1474.
  • the communication interface 1474 can establish and maintain communications, both wired and wireless, to external networks, such as the Internet.
  • Examples of communication interface 1474 include a modem, a network interface card ("NIC"), a communications port, a PCMCIA slot and card, an infrared interface, an IEEE 1394 (FireWire) interface, a wireless LAN interface, an IEEE 802.11 interface, an IEEE 802.16 interface, a Bluetooth interface, and a mesh network interface, just to name a few.
  • Communication interface 1474 typically can implement industry promulgated protocol standards, such as Ethernet IEEE 802 standards, Fibre Channel, digital subscriber line ("DSL"), asymmetric digital subscriber line ("ADSL"), frame relay, asynchronous transfer mode ("ATM"), integrated services digital network ("ISDN"), personal communications services ("PCS"), transmission control protocol/Internet protocol ("TCP/IP"), serial line Internet protocol/point to point protocol ("SLIP/PPP"), and so on, but may also implement customized or non-standard interface protocols as well.
  • Software and data transferred via the communication interface 1474 are generally in the form of electrical communication signals 1478. These signals 1478 may be provided to communication interface 1474 via a communication channel 1480.
  • the communication channel 1480 carries the signals 1478 and can be implemented using a variety of wired or wireless communication means including wire or cable, fiber optics, conventional phone line, cellular phone link, wireless data communication link, radio frequency (RF) link, or infrared link, just to name a few.
  • Computer executable code (i.e., computer programs or software) can be stored in the main memory 1456 and/or the secondary memory 1458.
  • Computer programs can also be received via the communication interface 1474 and stored in the main memory 1456 and/or the secondary memory 1458.
  • Such computer programs, when executed, can enable the computer system 1450 to perform the various functions of the present invention as previously described.
  • the term "computer readable medium" is used to refer to any media used to store data and/or provide computer executable code (e.g., software and computer programs) to the network enabled device 1450.
  • Examples of these media include main memory 1456, secondary memory 1458 (including hard disk drive 1460, removable storage medium 1464, and external storage medium 1472), and any peripheral device communicatively coupled with communication interface 1474 (including other network devices).
  • These computer readable media are means for providing executable code, programming instructions, and software to the network enabled device 1450, or for storing and/or recording data on it.
  • the network enabled device 1450 also includes a triggering mechanism 1476.
  • the triggering mechanism can be activated by a user to indicate the occurrence of an event. For example, if a user observes inappropriate behavior by another online user the triggering mechanism can be activated. Activation of the triggering mechanism can cause various operations by the network enabled device. For example, if a user activates the triggering mechanism a time-based history of an online session can be stored. In one embodiment, the triggering mechanism is a panic button.
  • Figure 15 is a block diagram illustrating an example game cheat monitoring entity that may be used in connection with various embodiments described herein.
  • a game cheat monitoring entity 1500 includes a network interface 1502 that receives an indication that there is cheating behavior. For example, a player in an online game can send an indication that another player in an online game is cheating.
  • the game cheat monitoring entity 1500 also includes a processor 1504 that collects game information of the play of at least the suspected cheating player.
  • the game cheat monitoring entity 1500 collects game information of the play of all of the players in the online game session.
  • the game information can include a time period that extends a desired duration before and after receiving the indication.
  • the game cheat monitoring entity can be a game server that collects game information as the players play the game.
  • the game cheat monitoring entity can be a separate network entity, or can be included with another network entity.
  • the cheat monitoring entity can receive game information from another network entity, such as a game server, or players in the game, or other sources.
  • the processor 1504 uses the game information to recreate online game activity of players in the game session to determine if there was cheating activity by one or more of the players. If there was cheating by one or more players, the game cheat monitoring entity takes appropriate action. For example, the game cheat monitoring entity can restrict a player that has been identified as a "cheater" from access to the online game session, or other game sessions, or limit game options that are available to a player that has been identified as a cheater, or take other types of actions.
  • Figure 15 is a block diagram of a moderation entity that can allocate resources such as online resources or other network resources.
  • the moderation entity also referred to as a network allocation moderation entity, can be a separate entity in communication with a network, such as the network 104 illustrated in Figures 1-4, or the operations of the network resource allocation moderation entity can be implemented in another network entity, such as for example, a moderation entity 108, a server 106, a user 102, or other network entity, as shown in Figure 1.
  • the network entity 1500 includes a network interface 1502.
  • the network entity 1500 can receive an indication that an online user may be engaged in inappropriate activity.
  • the network entity 1500 also includes a processor that can capture a time based history of online activity of users in an online community when the indication is received.
  • the network entity recreates the online activity of the online community, determines if there has been inappropriate online activity by one or more of the online users, and, if there is inappropriate activity, allocates online resources to achieve a desired level of monitoring of an offending user.
  • the functions of the network entity 1500 can be implemented in other entities, or across several network entities.
  • a moderation entity 108, or a server 106, or a user 102 can implement the operations of the network entity 1500.
  • a moderation entity can receive an indication of inappropriate activity and capture a time based history of the activity. The moderation entity can then send an indication of the desired level of monitoring to a network entity that adjusts a level of network resources allocated to monitoring the offending user.
  • Figure 16 is a flowchart illustrating an embodiment of detecting cheating in an online environment.
  • Flow begins in block 1602 where a player in an online game session detects suspected cheating behavior by another online game player.
  • Flow continues to block 1604 and game information is collected about the game play activity of players in the online game.
  • the game information can include the game activity of the suspected cheating player, or all of the players, or any desired number of players.
  • the game information includes a period of the game during which the suspected cheating behavior occurred.
  • the game information can include the actions of the game players. For example, where they move, how fast they move, whether they seem to have more ability or powers than are typical, and the like.
  • Flow then continues to block 1606.
  • the game information is communicated to a game cheat monitoring entity.
  • the game cheat monitoring entity evaluates the game information to determine if there was cheating activity. If there was cheating activity the game cheat monitoring entity can take appropriate action.
  • a reward is provided to a game player that observes cheating behavior and communicates the game information to the game cheat monitoring entity.
  • capturing the game information of the online game session includes capturing online game session activity that occurred a predetermined amount of time before detecting the suspected cheating behavior.
  • capturing the game information includes associating an online game player's identity with the player's online activity.
  • FIG. 17 is a flowchart illustrating another embodiment of detecting cheating in an online environment.
  • Flow begins in block 1702 where an indication that a player in an online game session suspects that another player in the game session is engaging in cheating behavior is received.
  • a game cheat monitoring entity can receive the indication.
  • the game cheat monitoring entity collects game information of game activity around a time of the suspected cheating behavior.
  • the game cheat monitoring entity can be a game server and collect game information.
  • the game cheat monitoring entity receives the game information.
  • the game cheat monitoring entity can receive game information from a game server, or from players in the online game, or other network entity, or any combination of entities.
  • Flow continues to block 1706 where the game cheat monitoring entity recreates the game activity from the game information; a minimal sketch of this receive, collect, and recreate flow appears immediately after this list.
  • Figure 18 is a block diagram of another embodiment of a moderation entity that can allocate resources such as online resources or other network resources.
  • the moderation entity also referred to as a network allocation moderation entity, can be a separate entity in communication with a network, such as the network 104 illustrated in Figures 1-4, or the operations of the network resource allocation moderation entity can be implemented in another network entity, such as for example, a moderation entity 108, a server 106, a user 102, or other network entity, as shown in Figure 1.
  • the network entity 1800 includes a network interface 1802.
  • the network entity 1800 can receive an indication that an online user may be engaged in inappropriate activity.
  • the network entity 1800 also includes a processor 1804 that can capture a time based history of online activity of users in an online community when the indication is received.
  • the network entity recreates the online activity of the online community, determines if there has been inappropriate online activity by one or more of the online users, and, if there is inappropriate activity, allocates online resources to achieve a desired level of monitoring of an offending user.
  • the functions of the network entity 1800 can be implemented in other entities, or across several network entities.
  • a moderation entity 108, or a server 106, or a user 102 can implement the operations of the network entity 1800.
  • a moderation entity can receive an indication of inappropriate activity and capture a time based history of the activity. The moderation entity can then send an indication of the desired level of monitoring to adjust a level of network resources allocated to monitoring the offending user.
  • Figure 19 is a flow chart illustrating aspects of online or other network resource allocations.
  • Flow begins in block 1902 where an indication that an online user may be engaging in inappropriate behavior is received.
  • Flow then continues to block 1904 and a time based history of an online session that includes the user's behavior is captured.
  • Flow continues to block 1906 where the online activities of the session are recreated.
  • it is determined if there was inappropriate activity by an offending online user.
  • Flow then continues to block 1908.
  • online resources are allocated for a desired level of monitoring of the offending online user.
  • capturing the time based history of the online session includes capturing online session activity which occurred a predetermined amount of time before receiving the indication that an online user may be engaging in inappropriate behavior.
  • capturing the time based history comprises associating online user identities with their online activity.
  • allocating online resources for a desired level of monitoring of the offending member includes assigning online resources to track the activities of the offending member.
  • a network resource allocation entity that captures the time based history.
  • a moderation entity that captures the time based history, recreates the online activity, and communicates the desired level of monitoring of the offending user to a network resource allocation entity that allocates network resources.
  • the time based history is received from another network entity.
  • Figure 20 is a flow chart illustrating additional aspects of allocating online or other network resources. Flow begins in block 2002 where an indication is received that a triggering mechanism has been activated by an online community member, indicating suspected inappropriate behavior by another online community member.
  • modules may also be implemented primarily in hardware using, for example, components such as application specific integrated circuits ("ASICs"), or field programmable gate arrays ("FPGAs"). Implementation of a hardware state machine capable of performing the functions described herein will also be apparent to those skilled in the relevant art. Various embodiments may also be implemented using a combination of both hardware and software.
  • the term "module" means, but is not limited to, a software or hardware component, such as an FPGA or an ASIC, which performs certain tasks.
  • a module may advantageously be configured to reside on an addressable storage medium and configured to execute on one or more network enabled devices or processors.
  • a module may include, by way of example, components, processes, functions, attributes, procedures, subroutines, segments of program code, drivers, firmware, microcode, circuitry, data, databases, data structures, tables, arrays, variables, and the like.
  • the functionality provided for in the components and modules may be combined into fewer components and modules or further separated into additional components and modules. Additionally, the components and modules may advantageously be implemented to execute on one or more network enabled devices or computers.
  • a general-purpose processor can be a microprocessor, but in the alternative, the processor can be any processor, controller, microcontroller, or state machine.
  • a processor can also be implemented as a combination of computing devices, for example, a combination of a DSP and a microprocessor, a plurality of microprocessors, one or more microprocessors in conjunction with a DSP core, or any other such configuration.
  • a software module can reside in RAM memory, flash memory, ROM memory, EPROM memory, EEPROM memory, registers, hard disk, a removable disk, a CD-ROM, or any other form of storage medium including a network storage medium.
  • An exemplary storage medium can be coupled to the processor such that the processor can read information from, and write information to, the storage medium.
  • the storage medium can be integral to the processor.
  • the processor and the storage medium can also reside in an ASIC.
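
The bullets above for Figures 16 and 17 describe a receive, collect, and recreate flow for cheat reports. A minimal Python sketch of that flow is given below; it is illustrative only, and the record fields, the speed-based check, and the thresholds are assumptions that do not appear in the specification.

    # Illustrative flow: receive an indication, collect the window of game
    # information around it, recreate the suspect's activity, choose an action.
    def handle_cheat_indication(indication, game_log, window=30.0):
        """indication: {'time': t, 'suspect': player_id};
        game_log: list of {'time': t, 'player': id, 'speed': float} records."""
        t = indication["time"]
        # Game information a desired duration before and after the report.
        window_events = [e for e in game_log if abs(e["time"] - t) <= window]
        suspect_events = [e for e in window_events
                          if e["player"] == indication["suspect"]]
        # "Recreate" the activity: here, a simple implausible-speed check.
        cheated = any(e.get("speed", 0.0) > 10.0 for e in suspect_events)
        return "restrict_access" if cheated else "no_action"

    log = [{"time": 100.0, "player": "p2", "speed": 42.0}]
    print(handle_cheat_indication({"time": 105.0, "suspect": "p2"}, log))  # restrict_access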

Abstract

Methods, apparatuses, and techniques for moderating activity in an online community are described. Aspects include a triggering mechanism being activated by a community member in response to inappropriate activity by another community member; receiving a time based history of community members' activity around a time of the triggering mechanism being activated; recreating the community activity from the time based history; and evaluating activities of the community members to determine if there was inappropriate activity and, if there is inappropriate activity by an offending community member, taking appropriate action against the offending community member.

Description

ON-LINE MONITORING OF RESOURCES
BACKGROUND
Field of the Invention
[0001] The present invention relates to on-line sessions, and more specifically, to: community based moderation of online sessions; moderation of cheating in an online sessions; allocation of on-line resources based on community based moderation of online sessions; and improving application integrity. Background [0002] In typical on-line session, such as virtual reality sessions, games, and other applications, users may interact and communication with other on-line users in the on-line community. During this interaction, the members of the on-line community may be subjected to inappropriate or offensive behavior from other members of the community. [0003] For example, one community member may begin sending chat messages that include profane or other inappropriate language to the other members of the community. Likewise, one member of the community may make obscene gestures or drawings that are visible to the other community members.
[0004] In addition, a community member may engage in illegal activity. For example, in a virtual reality environment one of the community members may post pornography or engage in other illegal activity. The illegal activity would be offensive to other members of the community.
[0005] In another example, members of the online community may be engaged in an online game. During the online game one or more of the game players may engage in cheating to take an unfair advantage over the other game players. The cheating activity can lead to dissatisfaction with the online game by the other online game players. [0006] Offensive, illegal, cheating, or other inappropriate actions by particular community members can decrease the enjoyment of the on-line session for the other community members. Thus, there is a need for improving moderation in on-line sessions.
SUMMARY [0007] Embodiments of the present invention provide methods, systems, apparatus, and programs for moderating online sessions. In one embodiment, a method for community moderation of an online session includes observing inappropriate behavior of a first online user by a second online user. The second online user activates or presses a triggering mechanism in response to the inappropriate behavior. A time based history of the online session is captured. Then the time based history is transmitted to a moderation entity. [0008] In an embodiment, the time based history of the online session includes online session activity that occurred a predetermined amount of time before the triggering mechanism is activated or pressed. The duration of the time based history can be set by the user, or it can be a predetermined period, or by a network entity, or by other techniques. The time based history can include information that associates online user identities with their online activity. In an embodiment, a reward can be issued to a user that observes inappropriate behavior and activates or presses the triggering mechanism. An example of a triggering mechanism is a panic button.
[0009] In another embodiment of a method of moderating activity in an online community, the method includes receiving an indication of a triggering mechanism being activated by a community member in response to inappropriate activity by another community member. Then receiving a time based history of community members' activity around a time of the triggering mechanism being activated. Recreating the community activity from the time based history. Then evaluating activities of the community members to determine if there was inappropriate activity and, if there is inappropriate activity by an offending community member, taking appropriate action against the offending community member.
[0010] In still another embodiment of an online community there are at least two users that communicate in the online community, wherein a first user in the online community observes inappropriate behavior by a second user in the online community and presses a panic button in response to the inappropriate behavior, the pressing of the panic button initiating storing a time based history of online community activity, the time based history covering a period that extends a desired duration before the pressing of the panic button and a desired duration after pressing the panic button. The online community also includes a moderation entity that receives the time based history and recreates the online activity to determine if there was inappropriate activity by one of the users, and if there was inappropriate activity by one of the users, taking appropriate action against that user. [0011] In still another embodiment, a network enabled device includes a triggering mechanism. The device also includes a processor that captures a time based history of online activity of users in an online community. In addition there is a network interface that transmits the time based history to a moderation entity, the moderation entity determines if there has been inappropriate online activity by one of the online users.
[0012] In one embodiment, taking appropriate action against the offending community member includes one or more of issuing a warning to the offending community member, limiting available online options to the offending community member, and restricting access to the online community by the offending community member. The triggering mechanism being activated can be pressing a panic button.
[0013] Embodiments of the present invention also provide methods, systems, apparatus, and programs for detecting and discouraging cheating in an online game session are described. Aspects include playing an online game. During play of the game one of the players detects suspected cheating behavior by another online game player. Game information is collected about the activity of all players in the online game, the game information includes a period of the game during which the suspected cheating behavior occurred. The game information is communicated to a game cheat monitoring entity that evaluates the game information to determine if there was cheating activity, and take appropriate action if there was cheating activity. [0014] In one embodiment, capturing the game information of the online game session includes capturing online game session activity that occurred a predetermined amount of time before detecting the suspected cheating behavior. In an embodiment, capturing the game information includes associating an online game player's identity with the player's online activity. In one embodiment, a reward is provided to a game player that observes cheating behavior and communicates the game information to the game cheat monitoring entity. There can also be a triggering mechanism that a player activates in response to detecting suspected cheating activity.
[0015] In another embodiment, a method of moderating cheating activity in an online game community includes receiving an indication that a player in an online game session suspects that another player in the game session is engaging in cheating behavior. Receiving game information of game activity around a time of the suspected cheating behavior. Recreating the game activity from the game information. Evaluating activities of the players in the game to determine if there was cheating behavior and if there is cheating behavior by one of the game players, taking appropriate action against the cheating game player. One example of appropriate action includes restricting access to the online game by the cheating game player. [0016] In another embodiment, an online game session includes at least two players that communicate in the online game session, wherein a first player in the online game session detects suspected cheating behavior by a second player in the online game session, the first player communicates an indication to a game cheat monitoring entity that there is suspected cheating behavior. The game cheat monitoring entity, upon receiving an indication that there is cheating behavior, collects game information of the play of all of the players in the online game session, the game information includes a time period that extends a desired duration before and after receiving the indication, the game cheat monitoring entity uses the game information to recreate online game activity of the players to determine if there was cheating activity by one of the players, and if there was cheating activity by one of the players, the game cheat monitoring entity takes appropriate action. An example of appropriate action includes restricting access to the online game session by the cheating player.
[0017] In still another embodiment, a game cheat monitoring entity includes a network interface that receives an indication that there is cheating behavior. The game cheat monitoring entity also includes a processor that collects game information of all of the players in the online game session, the game information includes a time period that extends a desired duration before and after receiving the indication, the processor uses the game information to recreate online game activity of players in the game session to determine if there was cheating activity by one or more of the players, and if there was cheating activity by one or more of the players, the game cheat monitoring entity takes appropriate action.
[0018] Embodiments of the present invention also provide methods, systems, apparatus, and programs for allocating online or other network resources to monitor an online community. In one embodiment, a method of allocating online resources to monitor online community members that have been identified as engaging in inappropriate behavior includes receiving an indication that an online user may be engaging in inappropriate behavior. Capturing a time based history of an online session that includes the user's behavior. Recreating the online activity and determining if there was inappropriate activity by an offending online user.
[0019] Allocating online resources for a desired level of monitoring of the offending online user. In one embodiment, the time based history of the online session includes capturing online session activity that occurred a predetermined amount of time before receiving the indication that an online user may be engaging in inappropriate behavior. In another embodiment, capturing the time based history includes associating online user identities with their online activity. In one embodiment, allocating online resources for a desired level of monitoring of the offending online user includes assigning online resources to track the activities of the offending online user. Another embodiment includes a network resource allocation entity that captures the time based history. In one embodiment, a moderation entity captures the time based history, recreates the online activity, and communicates the desired level of monitoring of the offending user to a network resource allocation entity that allocates network resources. [0020] In another embodiment, a method of allocating online resources to monitor online community members that have been identified as engaging in inappropriate behavior includes receiving an indication of a triggering mechanism being activated by an online community member in response to suspected inappropriate activity by another online community member. Receiving a time based history of community members' online activity around a time of the triggering mechanism being activated. Recreating the community activity from the time based history. Evaluating activities of the community members to determine if there was inappropriate activity and if there is inappropriate activity by an offending community member allocating online resources to monitor community members that have been identified as engaging in inappropriate behavior. [0021] In another embodiment, an online community with online resources that are allocated to monitor members of the online community include at least two users that communicate in the online community, wherein a first user in the online community observes suspected inappropriate behavior by one or more other users in the online community, the first user presses a panic button in response to the inappropriate behavior, the pressing of the panic button initiating storing a time based history of online community activity, the time based history covering a period that extends a desired duration before the pressing of the panic button and a desired duration after pressing the panic button. A moderation entity that receives the time based history and recreates the online activity to determine if there was inappropriate activity by one of the users, and if there was inappropriate activity by one of the users, determining a desired level of monitoring to track an offending user's activity. A network resource allocation entity that allocates online resources to track the activities of the offending user.
[0022] In one embodiment, a network entity includes a network interface that receives an indication that an online user may be engaged in inappropriate activity. A processor that captures a time based history of online activity of users in an online community when the indication is received, recreates the online activity of the online community and determines if there has been inappropriate online activity by one or more of the online users and, if there is inappropriate activity, allocates online resources to achieve a desired level of monitoring of an offending user.
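As a purely illustrative sketch of allocating online resources to achieve a desired level of monitoring, the following Python fragment maps a monitoring level to a set of resource settings; the level names and settings are assumptions, not part of the specification.

    # Hypothetical mapping of a monitoring level to allocated resources.
    MONITOR_LEVELS = {
        "none":     {"log_chat": False, "capture_state": False, "sample_rate_hz": 0},
        "elevated": {"log_chat": True,  "capture_state": False, "sample_rate_hz": 1},
        "full":     {"log_chat": True,  "capture_state": True,  "sample_rate_hz": 10},
    }

    def allocate_monitoring(allocations, user_id, level):
        """Record the resources assigned to track an offending user's activity."""
        allocations[user_id] = MONITOR_LEVELS.get(level, MONITOR_LEVELS["elevated"])
        return allocations[user_id]

    allocations = {}
    allocate_monitoring(allocations, "user_102a", "full")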
[0023] In an embodiment, the time based history of the online session includes online session activity that occurred a predetermined amount of time before the triggering mechanism is pressed. The duration of the time based history can be set by the user, or it can be a predetermined period, or by a network entity, or by other techniques. The time based history can include information that associates online user identities with their online activity. In an embodiment, a reward can be issued to a user that observes inappropriate behavior and presses the triggering mechanism. An example of a triggering mechanism is a panic button.
[0024] Embodiments of the present invention also provide methods, systems, apparatus, and programs for improving application integrity. In one embodiment, a method for improving the integrity of an application includes interacting with the application. Observing unexpected operation of the application. Activating a triggering mechanism in response to the unexpected operation. Capturing a time-based history of the application session. Communicating the time-based history to a network entity for evaluation. [0025] In one embodiment, the application comprises testing an online game, and capturing the time-based history comprises capturing online game session activity that occurred a predetermined amount of time before the triggering mechanism is pressed. [0026] In one embodiment, activating a triggering mechanism includes pressing a panic button. In an embodiment, observing unexpected operation of the application comprises observing a glitch in the operation of the application. In one embodiment, the network entity includes a server, or a moderation entity, or other network entity. In another embodiment, communicating the time-based history includes transmitting the time-based history over a local area network, or a wide area network such as the Internet, or any combinations of networks. [0027] In another embodiment, a method of testing an online game includes receiving an indication of a triggering mechanism being activated in response to unexpected operation of an online game. Receiving a time-based history of online game activity around a time of the triggering mechanism being activated. Recreating game activity from the time-based history. Evaluating the game activity to determine if there is a malfunction in the operation of the game. In another embodiment, testing the online game includes troubleshooting the malfunction in the operation of the game.
[0028] In yet another embodiment, an online game test unit includes a triggering mechanism. The test unit also includes a processor that captures a time-based history of game activity when the triggering mechanism is activated. The test unit includes a network interface that transmits the time-based history to a network entity, the network entity determines if there is a malfunction in the operation of the online game. [0029] Other features and advantages of the present invention will become more readily apparent to those of ordinary skill in the art after reviewing the following detailed description and accompanying drawings.
BRIEF DESCRIPTION OF THE DRAWINGS
[0030] Figure 1 is a block diagram illustrating an exemplary architecture for moderating online user activity.
[0031] Figure 2 is a block diagram of another embodiment of network architecture for moderating online user activity.
[0032] Figure 3A is a block diagram of a peer-to-peer communication network illustrating aspects of community moderation.
[0033] Figure 3B is a block diagram illustrating a user indicating that there is inappropriate behavior by another user in the network of Figure 3A. [0034] Figure 3C is a block diagram of the peer-to-peer network of Figure 3A showing the moderation entity 108 taking preventive action. [0035] Figure 4A is a block diagram of a client server communication network illustrating aspects of community moderation.
[0036] Figure 4B illustrates the network of Figure 4A where the server transmits the audio chat message from the first user to other users. [0037] Figure 4C illustrates the network of Figure 4A where a user sends an inappropriate message.
[0038] Figure 4D illustrates the network of Figure 4A showing the server taking appropriate action for the inappropriate message sent by a user.
[0039] Figure 5 is a flowchart illustrating a method of detecting and preventing inappropriate online activity.
[0040] Figure 6 is a flowchart of another embodiment of detecting inappropriate online behavior.
[0041] Figure 7 is a flowchart illustrating aspects of taking appropriate action in response to inappropriate activity. [0042] Figure 8 is a flow diagram illustrating an embodiment of using community moderation to prevent cheating in an online video game.
[0043] Figure 9 is a flow diagram illustrating aspects of moderating online behavior.
[0044] Figure 10 is a flow diagram of another embodiment of evaluating user online activity. [0045] Figure 11 is a block diagram of a test environment.
[0046] Figure 12A is a flow diagram of an online test environment as illustrated in Figure 11.
[0047] Figure 12B is a flow diagram of an embodiment of a test environment as illustrated in Figure 11. [0048] Figure 13 is a table illustrating examples of different types of actions that can be taken in response to a user's inappropriate behavior.
[0049] Figure 14 is a block diagram illustrating an example network enabled device 1450 that may be used in connection with various embodiments described herein.
[0050] Figure 15 is a block diagram illustrating an example game cheat monitoring entity that may be used in connection with various embodiments described herein.
[0051] Figure 16 is a flowchart illustrating an embodiment of detecting cheating in an online environment. [0052] Figure 17 is a flowchart illustrating another embodiment of detecting cheating in an online environment.
[0053] Figure 18 is a block diagram of another embodiment of a moderation entity that can allocate resources. [0054] Figure 19 is a flow chart illustrating aspects of online or other network resource allocations.
[0055] Figure 20 is a flow chart illustrating additional aspects of allocating online or other network resources.
DETAILED DESCRIPTION
[0056] After reading the following description it will be apparent to one skilled in the art how to implement the invention in various alternative embodiments and alternative applications. However, although various embodiments of the present invention will be described herein, it is to be understood that these embodiments are presented by way of example only, and not limitations. As such, this detailed description of various embodiments should not be construed to limit the scope or breadth of the present invention.
[0057] Figure 1 is a block diagram illustrating an exemplary architecture for moderating online user activity. As shown in Figure 1, one or more users or clients 102a-c are in communication with a network 104. In one embodiment, the users 102a-c communicate via the network with each other in an ad hoc communication network. In another embodiment the users communicate via the network with a server 106. The users 102 may use a network enabled device, such as a game console (for example, a Sony PlayStation 3), a laptop computing device, a portable game device (for example, a PlayStation Portable), a desktop computing device, a cellular telephone, or any other device capable of interfacing to the communication network 104.
[0058] In one embodiment the architecture includes a moderation entity 108 which is also in communication with the network 104. The moderation entity 108 can be used to take appropriate action if one of the users 102a-c is engaged in inappropriate or unacceptable behavior. For example, as discussed further below, the moderation entity 108 may interrupt communications from one user to another or may restrict an offending user's access to the network for a desired period of time. [0059] In one embodiment, the moderation entity 108 is a separate network node. In other embodiments, the moderation entity 108 may be incorporated within another network node, such as one or more of the users 102a-c or the server 106 or other network entity. It should be understood that reference to a user 102a-c and a server 106 and moderation entity 108 is merely for convenience of understanding various embodiments. For example, embodiments of the present invention may be implemented in the context of a peer-to-peer network, a client server network, or within a peer group. Therefore, in some instances a client or user may function as a server or moderation entity and vice versa, depending on the timing and the nature of the data exchange. For example, various clients in a peer-to-peer network may each comprise a portion of an online activity such as a virtual reality and may send and receive data related to the online activity. Thus, any reference to a user or a server or a moderation entity is meant to be inclusive of operations performed by one or any of the operating entities unless specified otherwise by specific limitations. In some instances a device with user/server functionality may be referred to by a generic moniker such as network node, computing node or network device. In that regard user, server and moderation entity may each be considered network computing nodes or a network device.
[0060] In one example embodiment, one user 102c may monitor the activity of other online users 102a and 102b as they interact in the online environment. When the user 102c believes one of the other users 102a and 102b is engaged in inappropriate conduct for the online environment, they can, for example, press a panic button or provide some other indication that inappropriate activity is taking place. Although this discussion describes one user 102c monitoring other users 102a-b, in other embodiments all users are monitoring the activities of all other users. In other embodiments selected users or groups of users can be authorized to monitor other online users.
[0061] When the panic button is pressed, a snapshot of the online environment is captured and sent to the moderation entity 108 for evaluation. The snapshot of the online activity includes the activity that was occurring when the panic button was pressed as well as a desired period of time prior to the panic button being pressed. In other words, each user device 102 that is monitoring online activity includes a buffer or other type of memory where a duration of all of the users' activity that is being monitored in the online environment is being stored. In this way when the panic button is pressed, the contents of
the buffer which includes a period of time prior to the pressing of the panic button as well as a desired period of time following the pressing of the panic button is sent to the moderation entity 108 for evaluation. The duration of the time based history can be set by the user, or it can be a predetermined period, or by a network entity, or by other techniques.
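The sliding-window buffer described in this paragraph can be sketched as follows. This is a minimal illustration only; the class name, the retention period, and the record fields are assumptions, and a real implementation would also keep recording for a desired duration after the trigger before sending the snapshot.

    import time
    from collections import deque

    class ActivityBuffer:
        """Sliding window of recent online activity (hypothetical sketch)."""

        def __init__(self, retention_seconds=60.0):
            self.retention_seconds = retention_seconds
            self._events = deque()  # (timestamp, user_id, event) tuples

        def record(self, user_id, event):
            """Append an activity record and drop records that slid out of the window."""
            now = time.time()
            self._events.append((now, user_id, event))
            while self._events and now - self._events[0][0] > self.retention_seconds:
                self._events.popleft()

        def snapshot(self):
            """Copy of the buffered history, e.g. taken when the panic button is pressed."""
            return list(self._events)

    # Example: buffer some activity, then snapshot it when a trigger fires.
    buf = ActivityBuffer(retention_seconds=60.0)
    buf.record("user_102a", {"type": "audio_chat", "clip": "..."})
    buf.record("user_102b", {"type": "move", "position": (10, 4)})
    history = buf.snapshot()  # would be sent to the moderation entity 108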
[0062] The moderation entity receives the stored online activity of the users. The moderation entity 108 then evaluates the online activity against a set of norms or rules that have been pre-established. If the moderation entity 108 determines that one of the users' behavior is inappropriate, the moderation entity 108 can take appropriate action. For example, if a user is using offensive language the moderation entity 108 could disable that user's microphone. In another example, the moderation entity 108 could warn the user to stop using the offensive language or the moderation entity 108 could restrict the user and only allow the user to access portions of the online environment where the language is acceptable, such as an adult only portion of the environment, or the users could be restricted from the online environment entirely. In another example, if a user is cheating in a game, the moderation entity 108 could warn the user to stop the cheating activity or the moderation entity 108 could restrict the user and not allow the cheating user to participate in the game. [0063] In one embodiment, users that identify inappropriate behavior can be rewarded. For example, if a user identifies a cheater in a game, the user can be given a reward. Rewards encourage users to identify inappropriate behavior, such as cheating, and because appropriate action is taken the online experience for all of the other users is improved. Of course users can abuse the reward feature by identifying others that are not involved in inappropriate behavior. To discourage these types of false identifications a user can receive demerits for making them.
[0064] Figure 2 is a block diagram of another embodiment of network architecture for moderating online user activity. As shown in Figure 2, multiple users 102a, 102b, and 102c are in communication with a network 104. Also in communication with the network is a server 106. In the embodiment of Figure 2 there are multiple moderation entities 108a through 108n. In this embodiment each moderation entity is configured to evaluate a specific type of inappropriate behavior. For example, one moderation entity could be configured to evaluate offensive language in the online environment. A different moderation entity can be configured to evaluate cheating activity in an online game. Still another moderation entity can be configured to evaluate online illegal activity such as distribution of pornographic or other illegal materials. In other embodiments, other moderation entities are configured to evaluate other types of inappropriate online behavior. Similar to the communication network of Figure 1, once the inappropriate online activity has been determined by the moderation entity, appropriate action can be taken.
[0065] Figure 3A is a block diagram of a peer-to-peer communication network illustrating aspects of community moderation. As shown in Figure 3A, the community includes three users 102a, 102b and 102c in communication with each other through the communication network 104. Also in communication with the network 104 is the moderation entity 108. In the example shown in Figure 3A, the first user 102a is communicating by sending voice messages to the other users 102b and 102c. In the example of Figure 3A the voice message sent by the first user 102a includes inappropriate or profane language.
[0066] Figure 3B is a block diagram illustrating a user indicating that there is inappropriate behavior by another user in the network of Figure 3A. In one embodiment, the user 102c presses a panic button to indicate there is inappropriate behavior. As shown in Figure 3B, the third user 102c upon hearing the inappropriate and profane message from the first user 102a presses a panic button or other triggering device to indicate inappropriate behavior is, or has, occurred. While engaged in the online activity, the users' network enabled devices have been buffering a time segment, or time based history, of online activity thereby recording the online activity of all of the monitored users in the community. In other words, a buffer in the third user's device 102c has a sliding window of memory that is always recording a portion of previous online activity by the users. When the panic button is pressed, that previous activity on the network is saved as well as the present and future activity for a desired duration. This entire buffer can then be sent to the moderation entity 108. In addition to sending the recorded online activity, a message sent to the moderation entity 108 can include an indication of the type of offensive or inappropriate behavior that the third user 102c is reporting. Examples of the type of online activity that can be buffered include a time-based history of online activity such as text chat, audio chat, the state of the characters and/or online participants as well as other types of online activity. [0067] In another embodiment the sights and sounds of Avatars that are engaged in an online game can be captured and stored in the time-based history. The moderation entity 108 can then evaluate the time-based history of the online activity of the users and determine if the first user's 102a behavior is inappropriate such as if the first user is cheating.
[0068] Figure 3C is a block diagram of the peer-to-peer network of Figure 3A showing the moderation entity 108 taking preventive action. As shown in the example of Figure 3C, upon determining that the first user's 102a activity is inappropriate, the moderation entity 108 can take preventive action. For example, the moderation entity 108 can send a warning to the first user 102a indicating that their behavior is inappropriate and to not engage in such behavior in the future. Other types of preventive action can also be taken. For example, the moderation entity 108 can send a command to the first user's 102a device and disable the first user's 102a communication capability such as disabling the first user's microphone. [0069] In other embodiments the moderation entity 108 could take actions such as cutting off the offending user's subscription so that they can no longer engage in the online activity. The moderation entity 108 could also add or increase monitoring of a particular user who has been engaged in inappropriate activity. In other embodiments these types of corrective actions can be used individually or in any combination. [0070] While the examples illustrated in Figures 3A-C show three users, in other embodiments there may be a different number of users. Also, different numbers, and groups of users may monitor and be monitored in other embodiments.
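One way to organize the corrective actions listed above is a simple mapping from a moderation verdict to a set of actions. The sketch below is illustrative only; the verdict and action names are assumptions rather than terms from the specification.

    # Hypothetical mapping from a moderation verdict to corrective actions.
    def choose_actions(verdict):
        """Return the corrective actions for a verdict (names are invented)."""
        actions = {
            "none": [],
            "offensive_language": ["warn", "disable_microphone"],
            "cheating": ["warn", "restrict_game_access", "increase_monitoring"],
            "illegal_activity": ["suspend_subscription", "increase_monitoring"],
        }
        return actions.get(verdict, ["warn"])

    print(choose_actions("offensive_language"))  # ['warn', 'disable_microphone']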
[0071] Figure 4A is a block diagram of a client server communication network illustrating aspects of community moderation. As shown in Figure 4A, three users 102a, 102b and 102c use network enabled devices to communicate through a server 106 while engaged in an online activity. In Figure 4A the first user 102a is engaged in an audio chat session with second and third users, 102b and 102c. The audio message from 102a is routed to the server 106. [0072] Figure 4B illustrates the network of Figure 4A where the server transmits the audio chat message from the first user to other users. In the example of Figure 4B, the server 106 transmits the audio chat message from the first user 102a to the second and third users, 102b and 102c. In other embodiments, there can be a number of other users in the network. For example, the first user's message can be transmitted to one other user or to any number of other users.
[0073] Figure 4C illustrates the network of Figure 4A where a user sends an inappropriate message. In this example, the first user 102a sends an audio chat message intended for the second and third users, 102b and 102c, and the message includes inappropriate content.
[0074] Figure 4D illustrates the network of Figure 4A showing the server taking appropriate action for the inappropriate message sent by a user. As shown in Figure 4D, the server 106 detects the audio message sent by the first user 102a and determines that it is inappropriate. Because the message includes inappropriate material the server 106 does not transmit it to the second and third users, 102b and 102c. The server 106 can also take other actions such as warning the first user 102a that his audio message and behavior is inappropriate, cutting off the subscription of the first user, as well as additional or increased monitoring of the first user, and other types of actions. [0075] In the embodiments illustrated in Figures 4A to 4D the functionality of the moderation entity has been incorporated into the server 106. In other embodiments the functionality of the moderation entity can be incorporated into other network entities, for example, a user device, or other network device. [0076] While the examples illustrated in Figures 4A-D show three users, in other embodiments there may be a different number of users. Also, different numbers, and groups of users may monitor and be monitored in other embodiments.
[0077] Figure 5 is a flowchart illustrating a method of detecting and preventing inappropriate online activity. Flow begins in block 502 where an online user observes offensive or inappropriate behavior. The types of behavior that are considered offensive or inappropriate can be based on an individual user's perception of inappropriate behavior, or based on community norms of what is appropriate and inappropriate behavior. Various techniques for establishing what is appropriate and inappropriate behavior are disclosed in US Patent Application No. 11/502,265 filed August 9, 2006, and entitled "Dynamic Rating of Content" which is incorporated herein by reference in its entirety. [0078] Flow continues to block 504 where a user presses the panic button or performs another action in response to observing the offensive or inappropriate online behavior. Flow then continues to block 506 where a time-based history of all the community members' activity is captured. The time-based history can be stored in a user's device and includes a sliding window of online activity. In other words, a portion of the past online activity is continually recorded in a buffer such that when the panic button is pressed, the previous online activity is stored as well as the present and a portion of a future period of online activity. In this way evidence indicating a user's inappropriate or offensive online activity is captured in the time-based history.
[0079] Flow continues to block 508. In block 508 the time-based history is sent to a moderation entity. In addition to the time-based history, an optional indication of the type of offensive behavior can also be sent to the moderation entity. For example, an indication can be sent showing that the user believes the inappropriate activity is offensive language or illegal activity such as online pornography or a player cheating in a game or other inappropriate activity.
[0080] Flow then continues to block 510. In block 510 the moderation entity evaluates the time-based history to determine if the activity is offensive or inappropriate. Optionally, if an indication of the type of offensive behavior was included in the message sent to the moderation entity, the time-based history could be routed to a particular engine within the moderation entity or to an appropriate moderation entity based upon the types of activity. In other words, one moderation entity, or engine within a moderation entity, can be optimized to identify and take appropriate action for a particular type of inappropriate activity, for example, profane language. A different engine or moderation entity can be optimized to detect and take action for other types of inappropriate activity, for example, illegal online activity or game cheats or the like.
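A minimal sketch of routing a report to a type-specific moderation engine might look like the following; the engine names and the checks they perform are invented for illustration and are not taken from the specification.

    # Hypothetical routing of a report to an engine specialized for the
    # reported type of behavior; the engines and their checks are invented.
    def profanity_engine(history):
        return any("profanity" in event.get("tags", []) for event in history)

    def cheat_engine(history):
        return any(event.get("speed", 0.0) > 10.0 for event in history)

    ENGINES = {
        "offensive_language": profanity_engine,
        "cheating": cheat_engine,
    }

    def evaluate_report(reported_type, history):
        """Dispatch the time-based history to the matching engine, if any."""
        engine = ENGINES.get(reported_type)
        if engine is None:
            return False  # unknown type; could fall back to human review
        return engine(history)

    sample = [{"user": "102a", "speed": 25.0}]  # one suspiciously fast move
    print(evaluate_report("cheating", sample))  # True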
[0081] Flow then continues to block 512 where the moderation entity takes appropriate action. During evaluation if the moderation entity determines that the activity is not inappropriate it may take no action. If the moderation entity determines that the behavior is offensive or inappropriate then the moderation entity can take appropriate action. For example, the moderation entity could warn the user about his behavior or it could cut off the user's subscription or increase or add monitoring to track the online activities of the offending user. [0082] Optionally, if it is determined that there has been inappropriate activity then the user reporting the activity may receive an award. If it is determined that there is no inappropriate activity then the user reporting the activity may receive demerits. In this way users are encouraged to report inappropriate activity while discouraged from making false reports.
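The reward and demerit bookkeeping mentioned above could be as simple as the following sketch; the point values are arbitrary assumptions used only for illustration.

    # Hypothetical reward/demerit bookkeeping for users who report behavior.
    def update_reporter_score(scores, reporter_id, report_confirmed,
                              reward=10, demerit=5):
        """Reward confirmed reports and apply demerits for false ones."""
        delta = reward if report_confirmed else -demerit
        scores[reporter_id] = scores.get(reporter_id, 0) + delta
        return scores[reporter_id]

    scores = {}
    update_reporter_score(scores, "user_102c", report_confirmed=True)   # 10
    update_reporter_score(scores, "user_102c", report_confirmed=False)  # 5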
[0083] Figure 6 is a flowchart of another embodiment of detecting inappropriate online behavior. Flow begins in block 602 where a user joins an online community activity. For example, a user could join an online game activity or they could engage in online virtual reality sessions or other online activities, for example, such as the Sony Home® environment. Flow continues to block 604 where the user interacts with other members of the online community. Flow then continues to block 606 where the user becomes aware of inappropriate activity of one of the other community members. Flow then continues to block 608 where the user presses a panic button or otherwise indicates that inappropriate activity has been observed. Flow then continues to block 610 where a time-based history of inappropriate activity of the online environment is captured and sent to a moderation entity. As noted previously, the time-based history includes a sliding window that records activity prior to the pushing of the panic button as well as after the pushing of the panic button. In this way the online activity when the offensive behavior occurred is captured and sent to the moderation entity. Optionally users reporting inappropriate activity can receive rewards while users making false reports can receive demerits. [0084] Figure 7 is a flowchart illustrating aspects of taking appropriate action in response to inappropriate activity. In one embodiment, the action may be taken by a network entity such as a moderation entity 108 or server 106 in Figures 1 and 2. Flow begins in block 702 where an indication of the occurrence of inappropriate activity, such as the pressing of a panic button, is received. Flow then continues to block 704 where a time-based history of the online community members' activity is received. Flow then continues to block 706. In block 706 the online community members' activity is evaluated. In block 708 any inappropriate activity recorded in the time-based history of the online community is identified. Flow then continues to block 710 where appropriate action is taken. If in block 708 there was no inappropriate activity identified, then in block 710 no action is taken. If in block 708 inappropriate activity was identified, then in block 710 an appropriate action is taken. For example, a warning could be issued to the offending user or the offending user could have his subscription cut off or there could be additional or increased monitoring of the offending user. Optionally users reporting inappropriate activity can receive rewards while users making false reports can receive demerits.
[0085] Figure 8 is a flow diagram illustrating an embodiment of using community moderation to prevent cheating in an online video game. In one embodiment, preventing cheating in an online video game may be accomplished by a network entity such as a moderation entity 108 or server 106 in Figures 1 and 2. Flow begins in block 802 where an online game user observes questionable game play of one of the other participants. Flow continues to block 804 where the user observing the questionable play indicates that they believe another player may be cheating by, for example, pressing the panic button, activating a triggering mechanism, or providing another type of indication. Flow then continues to block 806 where a time-based history of the online game members' activity is captured. The time-based history includes a duration of game play that has been stored prior to the pressing of the panic button as well as a period of game play following the pressing of the panic button. In this way a sliding window of time surrounding the pressing of the panic button has been recorded. Types of activity that can be included in the time-based history include text chat, audio chat, the state of all characters, their positions, and any other data that will be useful in recreating the online environment. Flow then continues to block 810. In block 810 the history is sent to a moderation entity. In one embodiment an optional indication of the type of inappropriate behavior observed is also included. For example, if a player has observed the suspected cheating player disappearing, having exceptional strength, or being resistant to attacks from other players, that information can be included and sent along with the time-based history.
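The content of such a cheat report could be organized, for example, along the lines of the following sketch; the field names are hypothetical and merely reflect the kinds of data listed above:

```python
from dataclasses import dataclass, field
from typing import Dict, List, Optional, Tuple

@dataclass
class GameSnapshot:
    """State captured for one tick of the sliding window (hypothetical fields)."""
    timestamp: float
    text_chat: List[str] = field(default_factory=list)
    audio_chat_refs: List[str] = field(default_factory=list)        # e.g. audio buffer ids
    character_state: Dict[str, dict] = field(default_factory=dict)  # player -> stats
    positions: Dict[str, Tuple[float, float, float]] = field(default_factory=dict)

@dataclass
class CheatReport:
    reporter_id: str
    suspect_id: str
    trigger_time: float
    history: List[GameSnapshot]
    observed_behavior: Optional[str] = None  # e.g. "disappearing", "exceptional strength"

report = CheatReport(
    reporter_id="player1",
    suspect_id="player7",
    trigger_time=1234.5,
    history=[GameSnapshot(timestamp=1230.0, positions={"player7": (0.0, 0.0, 0.0)})],
    observed_behavior="resistant to attacks",
)
print(report.observed_behavior)
```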
[0086] Flow then continues to block 812. In block 812 the moderation entity evaluates the online behavior of the game participants. Using the time-based history, the moderation entity can play back the scenario leading up to the pressing of the panic button. In this way it can be determined whether or not someone was cheating. Various techniques for detecting cheating in an online game are described in pending US Patent Application Serial No. 11/386,039 entitled "Active Validation of Network Devices" filed March 20, 2006, Serial No. 11/415,881 entitled "Passive Validation of Network Devices" filed May 1, 2006, Serial No. 11/449,141 entitled "Game Metrics" filed on June 7, 2006, and Serial No. 11/725,175 entitled "Maintaining Community Integrity" filed March 16, 2007, all of which are incorporated herein in their entirety.

[0087] Following evaluation of the online behavior in block 812, flow continues to block 814. In block 814 the moderation entity can take appropriate action based on the severity of the inappropriate behavior. In one embodiment, if no inappropriate behavior is detected, then the moderation entity will take no action. In other embodiments, if inappropriate behavior is detected, then the moderation entity can take any of a range of appropriate actions including warning, cutting off a user's subscription, adding increased monitoring, or any combination of the above. Optionally, a user reporting cheating can receive rewards while a user making false reports can receive demerits.

[0088] While Figures 3 through 7 describe embodiments associated with inappropriate online activity such as offensive language, the same techniques can be applied to prevent cheating in online gaming. For example, in Figures 3A-C, instead of a user detecting offensive language and reporting to the moderation entity, a user could detect suspected cheating in an online game environment and report that to the moderation entity where appropriate action will be taken. Likewise, in Figures 4A-D, in a server/client-based architecture, the server could detect suspected online cheating by a user and take appropriate action. Likewise, in Figures 5 to 7, the offensive or inappropriate behavior could be cheating in an online game environment.
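Purely as an illustrative example of one simple heuristic that a playback of the time-based history might apply, and not as a description of the techniques in the applications referenced above, a replay could flag movement faster than the game permits; the speed limit and sample format below are assumed:

```python
import math

MAX_SPEED = 12.0  # hypothetical maximum legitimate speed, game units per second

def flag_speed_cheat(positions, max_speed=MAX_SPEED):
    """Replay a list of (timestamp, (x, y, z)) samples for one player and
    flag any interval where movement exceeds the allowed speed."""
    flagged = []
    for (t0, p0), (t1, p1) in zip(positions, positions[1:]):
        dt = t1 - t0
        if dt <= 0:
            continue
        speed = math.dist(p0, p1) / dt
        if speed > max_speed:
            flagged.append((t0, t1, speed))
    return flagged

samples = [(0.0, (0, 0, 0)), (1.0, (5, 0, 0)), (2.0, (40, 0, 0))]
print(flag_speed_cheat(samples))  # -> [(1.0, 2.0, 35.0)]
```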
[0089] Figure 9 is a flow diagram illustrating aspects of moderating online behavior. In one embodiment, the aspects of Figure 9 can be implemented by a moderation entity or a server as illustrated in Figures 1 and 2. Flow begins in block 902 where an indication that inappropriate behavior has been observed, such as that a panic button has been pressed, is received. Flow continues to block 904 where a time-based history of activity of community members around the time the panic button was pressed is received. Then, in block 906, community members' activity is evaluated to determine if it is inappropriate activity. Inappropriate activity could include profane or inappropriate language, distribution of, or showing of, pornography to other online users, cheating in an online game, and the like. If in block 906 it is determined that the activity is not inappropriate, flow continues to block 910. In block 910 the complaint against the user is logged in a user's file. This user file may be maintained to keep track of the number of indications from other users who believe inappropriate activity is being performed by the suspect user.

[0090] Flow then continues to block 912. In block 912 the number of complaints is compared against a predetermined value or threshold. If it is determined that the number of complaints against this user does not exceed the threshold level, flow continues back to block 902 and the system waits for the next pressing of a panic button. Returning to block 912, if it is determined that the number of complaints exceeds the threshold, then flow continues to block 914. Because the number of complaints has exceeded the threshold, it is believed that the suspect user may be engaging in some inappropriate behavior, or at least some type of behavior that is offensive to the other members of the community. As such, in block 914, appropriate action can be taken. This action could be merely to warn or inform the suspect user that the other members of the community find their behavior unacceptable, or the action could be more severe, such as cutting off the user's subscription. In addition, there may be increased monitoring of the user because the other members of the community find his behavior offensive. Returning to block 906, if it is determined that the user's activity is inappropriate, then flow continues to block 914 and appropriate action is taken. Again, this action can range from warning the user that his activity is inappropriate to cutting off subscription to adding increased monitoring and the like.
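As a non-limiting sketch only, the complaint-logging and threshold comparison of blocks 906 through 914 could be expressed as follows; the threshold value and the placeholder action are assumptions:

```python
from collections import defaultdict

COMPLAINT_THRESHOLD = 5  # hypothetical value

complaint_log = defaultdict(list)  # suspect_id -> list of complaint records

def take_action(suspect_id: str) -> str:
    # Placeholder: warn, restrict, or increase monitoring depending on severity.
    return f"warned {suspect_id}"

def handle_report(suspect_id: str, activity_is_inappropriate: bool, complaint: dict) -> str:
    """Mirror of blocks 906-914: act immediately on a confirmed violation,
    otherwise log the complaint and act only once a threshold is exceeded."""
    if activity_is_inappropriate:
        return take_action(suspect_id)
    complaint_log[suspect_id].append(complaint)
    if len(complaint_log[suspect_id]) > COMPLAINT_THRESHOLD:
        return take_action(suspect_id)
    return "logged"

result = "logged"
for _ in range(6):
    result = handle_report("player9", False, {"type": "questionable"})
print(result)  # -> "warned player9" once the threshold is exceeded
```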
[0091] Flow then continues to block 916. In block 916 the user's file is updated to indicate that there is inappropriate activity or that an action has been taken. For example, the user file may indicate that a warning has been issued to this user about his activity. When a later action is taken against the same user, the severity of that action may be increased in view of the previous action taken.
[0092] As shown in Figure 9, a number of users may press the panic button to indicate that a particular type of activity is unacceptable to other members of the community, even though the standards the moderation entity currently uses to evaluate behavior indicate that the activity is not inappropriate. In this case, the number of complaints logged for that type of behavior can be used to modify the standards and rule set used by the moderation entity in evaluating behavior. For example, if a particular type of behavior is not originally considered inappropriate, but the majority of other online users find the activity to be inappropriate, as indicated by a large number of complaints for that activity, the moderation entity can modify the standards against which it evaluates activity and designate the new activity as inappropriate. In this way, as the community changes and evolves over time, the standards by which activity is considered inappropriate will evolve with the community.
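One possible way to evolve the rule set from complaint counts is sketched below; the community size, the promotion fraction, and the initial prohibited set are hypothetical values chosen only to illustrate the idea:

```python
from collections import Counter

community_size = 1000
prohibited = {"profanity", "pornography", "cheating"}
behavior_complaints = Counter()

def log_behavior_complaint(behavior: str, threshold_fraction: float = 0.5) -> None:
    """If more than a given fraction of the community has complained about a
    behavior that is not currently prohibited, add it to the prohibited set."""
    behavior_complaints[behavior] += 1
    if (behavior not in prohibited
            and behavior_complaints[behavior] > threshold_fraction * community_size):
        prohibited.add(behavior)

for _ in range(501):
    log_behavior_complaint("spamming")
print("spamming" in prohibited)  # -> True
```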
[0093] Figure 10 is a flow diagram of another embodiment of evaluating user online activity. In one embodiment, the aspects of Figure 10 can be implemented by a moderation entity or a server as illustrated in Figures 1 and 2. Flow begins in block 1002 where an indication that inappropriate activity has taken place, such as that a panic button has been pressed, is received. Flow continues to block 1004 and a time-based history of activity of the community members around the time the panic button was pressed is received. This time-based history can include data used to recreate the online activity around the time the panic button was pressed so that a moderator can evaluate whether the online activity of a particular user is inappropriate.
[0094] Flow continues to block 1006 and the time-based history is evaluated to see if there is inappropriate activity. If the inappropriate activity exceeds a threshold, flow continues to block 1008. In block 1006 the threshold could be set such that an appropriate action is taken the first time a particular inappropriate activity occurs. For example, if there is an illegal activity such as pornography or some other illegal behavior, flow will continue to block 1008 where appropriate action is immediately taken due to the severity of the activity. In addition to taking appropriate action, a level of monitoring of a particular user may be adjusted. For example, the level of monitoring could be increased such that this particular offending user's online activity is monitored at all times by the moderation entity. The user's file is also updated to indicate his inappropriate activity.

[0095] Adjusting the level of monitoring allows a system with limited resources to more effectively allocate those resources across the community members. For example, if there is a large community with many members, the moderation entity may not be able to monitor all of the members' online activity. By increasing the level of monitoring of particular individuals that have been identified as engaging in inappropriate behavior, limited system resources can be applied more effectively.
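A minimal sketch of adjusting per-user monitoring levels under a fixed resource budget follows; the level names, the budget, and the reclamation policy are all assumptions made for illustration:

```python
MONITORING_LEVELS = {"none": 0, "sampled": 1, "elevated": 2, "continuous": 3}
TOTAL_MONITORING_BUDGET = 10  # hypothetical number of monitoring slots

monitoring = {}  # user_id -> level name

def cost(levels: dict) -> int:
    return sum(MONITORING_LEVELS[lvl] for lvl in levels.values())

def raise_monitoring(user_id: str) -> str:
    """Escalate one user's monitoring level if the budget allows; otherwise
    reclaim capacity from the lowest-level monitored user first."""
    order = list(MONITORING_LEVELS)
    current = monitoring.get(user_id, "none")
    target = order[min(order.index(current) + 1, len(order) - 1)]
    while cost({**monitoring, user_id: target}) > TOTAL_MONITORING_BUDGET:
        victim = min(
            (u for u in monitoring if u != user_id and monitoring[u] != "none"),
            key=lambda u: MONITORING_LEVELS[monitoring[u]],
            default=None,
        )
        if victim is None:
            break
        monitoring[victim] = order[order.index(monitoring[victim]) - 1]
    monitoring[user_id] = target
    return target

monitoring.update({"a": "continuous", "b": "continuous", "c": "elevated"})
print(raise_monitoring("offender1"))  # escalates within the fixed budget -> "sampled"
```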
[0096] Flow then continues to block 1002 and the online activity continues to be monitored. Returning to block 1006, if the inappropriate activity does not exceed a threshold, then flow continues to block 1010. In block 1010 the member's file is evaluated to see if there have been previous complaints against this particular member. Flow continues to block 1012 and the accumulated inappropriate activity is evaluated to see if it exceeds a threshold. If the accumulated inappropriate activity by this particular member does not exceed the threshold, flow continues to block 1014.

[0097] In block 1014 the level of monitoring of this user can be adjusted. For example, the level of monitoring can be increased to more closely monitor the particular member's activities. In addition, the member's file is updated to indicate that there is possible inappropriate behavior. Flow then continues to block 1002 and monitoring continues for an indication of inappropriate activity, such as the pressing of a panic button. Returning to block 1012, if the accumulated inappropriate activity exceeds the threshold, then flow continues to block 1016 and the level of monitoring of this particular user will be adjusted in accordance with the number and severity of instances that have been accumulated. For example, the level of monitoring could be increased due to the number of instances that other members have complained about this particular user's activity. The member's file is also updated and flow continues to block 1002 where monitoring of network activity continues.

[0098] Figure 11 is a block diagram of a test environment. For example, Figure 11 can be a test environment for testing of an online game or other online application. As shown in Figure 11, there are multiple testers 1102A, 1102B and 1102C. In other embodiments, there may be any desired number of testers, for example, one, two, or more. These online testers communicate with a network 1104 and a server 1106. As the testers interact with and evaluate the online activity, they will find bugs or glitches which they wish to report to the server for troubleshooting and updating of the application. When one of the testers comes across a glitch, he can trigger an indication, such as pressing a panic button, which will record the online environment for a duration around the time the panic button was pressed. For example, the duration of time can extend from before the button was pressed until after the button was pressed for a desired period of time. In this way the online environment can be captured for evaluation as to the cause of the glitch.

[0099] In another embodiment of Figure 11, the testers communicate with a network 1104. The network 1104 can be a local area network, a wide area network such as the Internet, or another type of network. Also in communication with the network are other network entities. For example, a server 1106, or a moderation entity 1108, or other network entities, in any combination, can be in communication with the network 1104. In one embodiment, a tester 1102A includes a network interface 1110, a processor 1112, and a triggering mechanism 1114, such as a panic button. In one embodiment the triggering mechanism 1114 can be pressed and the processor 1112 captures a time-based history of activity, such as game activity, when the triggering mechanism is activated. The time-based history can be communicated via the network interface 1110 to another network entity.
For example, the time-based history can be communicated to the server 1106, the moderation entity 1108, or another network entity.
[00100] In one embodiment, as the testers interact and evaluate the application, such as an online game, a non-online game, or other application, the testers will find bugs or glitches which they wish to report to the server for troubleshooting and updating of the application. When testers come across a glitch, they can trigger a mechanism, such as pressing a panic button, to provide an indication of the glitch. A time-based history of the test environment is recorded for a duration around the time the triggering mechanism was activated. For example, the duration of time can extend from before the triggering mechanism was activated until a period of time after the triggering mechanism was activated. In this way the activity and parameters of the application can be captured for evaluation as to the cause of the glitch.
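By way of illustration only, the tester-side report could be packaged and sent to the server roughly as follows; the endpoint URL, payload fields, and tester identifier are hypothetical:

```python
import json
import time
import urllib.request

SERVER_URL = "http://test-server.example/glitch-report"  # hypothetical endpoint

def report_glitch(history_window, tester_id, note=""):
    """Package the captured time-based history of the test session and send it
    to the server for troubleshooting (fire-and-forget sketch)."""
    payload = {
        "tester": tester_id,
        "trigger_time": time.time(),
        "note": note,               # e.g. "texture flicker on level load"
        "history": history_window,  # events buffered before/after the trigger
    }
    req = urllib.request.Request(
        SERVER_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    try:
        with urllib.request.urlopen(req, timeout=5) as resp:
            return resp.status
    except OSError:
        return None  # the report can be retried or spooled to disk

# Usage (assuming a capture buffer like the earlier sketch):
# report_glitch(history.capture(trigger_time=time.time()), "tester1102A")
```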
[00101] Figure 12A is a flow diagram of an online test environment as illustrated in Figure 11. Flow begins in block 1202 where testers engage in testing of an online environment or application. Flow continues to block 1204 where a tester identifies an instance of interest during testing. For example, they may identify a glitch or some discontinuity in the application which they wish to report. Flow continues to block 1206 where the tester presses the panic button at the time of the point of interest. Flow then continues to block 1208 where a time-based history of the online environment during the testing activity is captured. In one embodiment, the time-based history is a sliding window of memory beginning before the pressing of the panic button through and after pressing of the panic button. Flow then continues to block 1210 where the time-based history is stored for evaluation and troubleshooting of the application.

[00102] Figure 12B is a flow diagram of another embodiment of a test environment as illustrated in Figure 11. Flow begins in block 1212 where testers engage in testing of an application. For example, the application can be a non-online game, an online game, or another application. Flow continues to block 1214 where a tester identifies an instance of interest during testing. For example, the tester may identify a glitch or some discontinuity in the application which they wish to report. Flow continues to block 1216 where the tester activates a triggering mechanism. For example, the tester can press a panic button, or other type of mechanism, to indicate the time of the point of interest. Flow then continues to block 1218 where a time-based history of the online environment during the testing activity is captured. In one embodiment, the time-based history is a sliding window of memory beginning before the activating of the triggering mechanism through and after activating the triggering mechanism. Flow then continues to block 1220 where the time-based history is evaluated. In one embodiment, the time-based history is communicated to a server via a local area network. In another embodiment, the time-based history is communicated to a server via a wide area network, such as the Internet. In one embodiment, the time-based history is used for troubleshooting of the application.

[00103] Figure 13 is a table indicating examples of possible actions that can be taken against a user as a result of a user's inappropriate behavior. The table shown in Figure 13 has a first column 1302 listing different types of inappropriate behavior and a second column 1304 listing different possible actions that can be taken for each type of behavior. For example, a first type of inappropriate behavior 1306 is behavior that falls outside of predetermined community standards. Examples of this type of behavior can be use of profane language, racial or ethnic slurs, types of gestures, and other types of behaviors that the community has identified as unacceptable. Examples of possible actions 1308 that can be taken in response to these types of behaviors include issuing a warning, cutting off voice messaging capability, cutting off a user's subscription to the online activity, increasing the monitoring of an offending user, restricting access to portions of the online activity such as restricting access to portions of the online environment where children tend to visit, and the like.

[00104] A second type of inappropriate behavior 1310 listed in Figure 13 is cheating in an online game.
Examples of possible actions 1312 that can be taken in response to cheating in an online game include issuing a warning, decreasing a player's abilities in the game, penalizing the player, such as by decreasing the player's score, restricting a player's access to game options such as not letting the player use particular game options, cutting off a player's subscription to the online game, increasing the monitoring of the cheater, and the like.

[00105] A third type of behavior 1314 listed in Figure 13 is questionable behavior. This type of behavior includes behavior that may not violate community standards, but many of the members of the community may complain about the behavior. Examples of this type of behavior may include derogatory language, or suspicious or distrustful behavior. Examples of possible actions 1316 that can be taken in response to questionable behavior include issuing a warning, increasing the monitoring of the user, and the like.

[00106] A fourth type of inappropriate behavior 1318 listed in Figure 13 is illegal activity. An example of this type of activity can be displaying pornography to children online. Examples of possible actions 1320 that can be taken in response to illegal activity online can include cutting off a player's subscription to the online game, reporting the activity to the proper authorities, increasing the monitoring of the offending user, and the like.

[00107] Figure 14 is a block diagram illustrating an example network enabled device 1450 that may be used in connection with various embodiments described herein. The network enabled device 1450 may include one or more processors, such as processor 1452. Additional processors may be provided, such as an auxiliary processor to manage input/output, an auxiliary processor to perform floating point mathematical operations, a special-purpose microprocessor having an architecture suitable for fast execution of signal processing algorithms (e.g., digital signal processor), a slave processor subordinate to the main processing system (e.g., back-end processor), an additional microprocessor or controller for dual or multiple processor systems, or a coprocessor, for example if parallel processing is to be implemented. Such auxiliary processors or coprocessors may be discrete processors or may be integrated with the processor 1452.
[00108] The processor 1452 may be connected to a communication bus 1454. The communication bus 1454 may include a data channel for facilitating information transfer between storage and other peripheral components of the computer system 1450. The communication bus 1454 further may provide a set of signals used for communication with the processor 1452, including a data bus, address bus, and control bus (not shown). The communication bus 1454 may comprise any standard or non-standard bus architecture such as, for example, bus architectures compliant with industry standard architecture ("ISA"), extended industry standard architecture ("EISA"), Micro Channel Architecture ("MCA"), peripheral component interconnect ("PCI") local bus, or standards promulgated by the Institute of Electrical and Electronics Engineers ("IEEE") including IEEE 488 general-purpose interface bus ("GPIB"), IEEE 696/S-100, and the like.

[00109] The network enabled device 1450 may also include a main memory 1456 and may also include a secondary memory 1458. The main memory 1456 can provide a buffer to store online activity during an online session. For example, the buffer can provide a sliding window of memory that stores online activity of users in the online session. The duration of the online session that is saved can be predetermined, set by a user, adjusted under program control, or determined by other techniques. The main memory 1456 can also provide storage of instructions and data for programs executing on the processor 1452. The main memory 1456 is typically semiconductor-based memory such as dynamic random access memory ("DRAM") and/or static random access memory ("SRAM"). Other semiconductor-based memory types include, for example, synchronous dynamic random access memory ("SDRAM"), Rambus dynamic random access memory ("RDRAM"), ferroelectric random access memory ("FRAM"), and the like, including read only memory ("ROM").
[00110] The secondary memory 1458 may optionally include a hard disk drive 1460 and/or a removable storage drive 1462, for example a floppy disk drive, a magnetic tape drive, a compact disc ("CD") drive, a digital versatile disc ("DVD") drive, a memory stick, etc. The removable storage drive 1462 reads from and/or writes to a removable storage medium 1464 in a well-known manner. Removable storage medium 1464 may be, for example, a CD, DVD, a flash drive, a memory stick, etc.
[00111] The removable storage medium 1464 is typically a computer readable medium having stored thereon computer executable code (i.e., software) and/or data. The computer software or data stored on the removable storage medium 1464 may be read into the computer system 1450 as electrical communication signals 1478.
[00112] In alternative embodiments, secondary memory 1458 may include other similar means for allowing computer programs or other data or instructions to be loaded into the computer system 1450. Such means may include, for example, an external storage medium 1472 and an interface 1470. Examples of external storage medium 1472 may include an external hard disk drive, an external optical drive, or an external magneto-optical drive.

[00113] Other examples of secondary memory 1458 may include semiconductor-based memory such as programmable read-only memory ("PROM"), erasable programmable read-only memory ("EPROM"), electrically erasable programmable read-only memory ("EEPROM"), or flash memory (block-oriented memory similar to EEPROM). Also included are any other removable storage units 1472 and interfaces 1470, which allow software and data to be transferred from the removable storage unit 1472 to the network enabled device 1450.

[00114] The network enabled device 1450 may also include a communication interface 1474. The communication interface 1474 allows software and data to be transferred between the network enabled device 1450 and external devices, networks, or information sources. For example, computer software or executable code may be transferred to the network enabled device 1450 from a network entity via the communication interface 1474. In addition, the communication interface 1474 can establish and maintain communications, both wired and wireless, to external networks, such as the Internet. Examples of the communication interface 1474 include a modem, a network interface card ("NIC"), a communications port, a PCMCIA slot and card, an infrared interface, an IEEE 1394 FireWire interface, a wireless LAN interface, an IEEE 802.11 interface, an IEEE 802.16 interface, a Bluetooth interface, and a mesh network interface, just to name a few.
[00115] The communication interface 1474 typically can implement industry promulgated protocol standards, such as Ethernet IEEE 802 standards, Fibre Channel, digital subscriber line ("DSL"), asynchronous digital subscriber line ("ADSL"), frame relay, asynchronous transfer mode ("ATM"), integrated services digital network ("ISDN"), personal communications services ("PCS"), transmission control protocol/Internet protocol ("TCP/IP"), serial line Internet protocol/point to point protocol ("SLIP/PPP"), and so on, but may also implement customized or non-standard interface protocols as well.

[00116] Software and data transferred via the communication interface 1474 are generally in the form of electrical communication signals 1478. These signals 1478 may be provided to the communication interface 1474 via a communication channel 1480. The communication channel 1480 carries the signals 1478 and can be implemented using a variety of wired or wireless communication means including wire or cable, fiber optics, a conventional phone line, a cellular phone link, a wireless data communication link, a radio frequency ("RF") link, or an infrared link, just to name a few.

[00117] Computer executable code (i.e., computer programs or software) can be stored in the main memory 1456 and/or the secondary memory 1458. Computer programs can also be received via the communication interface 1474 and stored in the main memory 1456 and/or the secondary memory 1458. Such computer programs, when executed, can enable the computer system 1450 to perform the various functions of the present invention as previously described.
[00118] In this description, the term "computer readable medium" is used to refer to any media used to store data and/or provide computer executable code (e.g., software and computer programs) to the network enabled device 1450. Examples of these media include main memory 1456, secondary memory 1458 (including hard disk drive 1460, removable storage medium 1464, and external storage medium 1472), and any peripheral device communicatively coupled with communication interface 1474 (including other network devices). These computer readable mediums are means for providing executable code, programming instructions, and software, or storing and/or recording data to the network enabled device 1450.
[00119] The network enabled device 1450 also includes a triggering mechanism 1476. The triggering mechanism can be activated by a user to indicate the occurrence of an event. For example, if a user observes inappropriate behavior by another online user the triggering mechanism can be activated. Activation of the triggering mechanism can cause various operations by the network enabled device. For example, if a user activates the triggering mechanism a time-based history of an online session can be stored. In one embodiment, the triggering mechanism is a panic button.
[00120] In one embodiment, Figure 15 is a block diagram illustrating an example game cheat monitoring entity that may be used in connection with various embodiments described herein. As shown in Figure 15, a game cheat monitoring entity 1500 includes a network interface 1502 that receives an indication that there is cheating behavior. For example, a player in an online game can send an indication that another player in an online game is cheating. The game cheat monitoring entity 1500 also includes a processor 1504 that collects game information of the play of at least the suspected cheating player. In another embodiment, the game cheat monitoring entity 1500 collects game information of the play of all of the players in the online game session. The game information can include a time period that extends a desired duration before and after receiving the indication. For example, in one embodiment, the game cheat monitoring entity can be a game server that collects game information as the players play the game. In another embodiment, the game cheat monitoring entity can be a separate network entity, or can be included with another network entity. In still another embodiment, the cheat monitoring entity can receive game information from another network entity, such as a game server, or players in the game, or another source.
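A minimal, non-limiting sketch of such a game cheat monitoring entity follows; the class and method names, the window durations, and the placeholder cheat check are assumptions made for illustration only:

```python
class GameCheatMonitoringEntity:
    """Sketch of the entity of Figure 15: receive an indication, collect a
    window of game information, replay it, and act if cheating is found."""

    def __init__(self, game_info_source, pre_seconds=30.0, post_seconds=10.0):
        self.game_info_source = game_info_source  # e.g. an interface to a game server
        self.pre_seconds = pre_seconds
        self.post_seconds = post_seconds

    def on_cheat_indication(self, suspect_id, indication_time):
        window = self.game_info_source.fetch(
            start=indication_time - self.pre_seconds,
            end=indication_time + self.post_seconds,
        )
        if self._replay_shows_cheating(window, suspect_id):
            return self._take_action(suspect_id)
        return "no_action"

    def _replay_shows_cheating(self, window, suspect_id):
        # Placeholder for recreating the session and applying cheat heuristics.
        return any(event.get("impossible") for event in window)

    def _take_action(self, suspect_id):
        # e.g. restrict access to the session or limit available game options.
        return f"restricted {suspect_id}"

class FakeSource:
    def fetch(self, start, end):
        return [{"impossible": True}]

entity = GameCheatMonitoringEntity(FakeSource())
print(entity.on_cheat_indication("player7", indication_time=100.0))  # -> restricted player7
```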
[00121] The processor 1504 uses the game information to recreate online game activity of players in the game session to determine if there was cheating activity by one or more of the players. If there was cheating by one or more players, the game cheat monitoring entity takes appropriate action. For example, the game cheat monitoring entity can restrict a player that has been identified as a "cheater" from access to the online game session, or other game sessions, or limit game options that are available to a player that has been identified as a cheater, or take other types of actions.

[00122] In another embodiment, Figure 15 is a block diagram of a moderation entity that can allocate resources such as online resources or other network resources. The moderation entity, also referred to as a network allocation moderation entity, can be a separate entity in communication with a network, such as the network 104 illustrated in Figures 1-4, or the operations of the network resource allocation moderation entity can be implemented in another network entity, such as, for example, a moderation entity 108, a server 106, a user 102, or other network entity, as shown in Figure 1. As shown in Figure 15, the network entity 1500 includes a network interface 1502. The network entity 1500 can receive an indication that an online user may be engaged in inappropriate activity.

[00123] The network entity 1500 also includes a processor that can capture a time based history of online activity of users in an online community when the indication is received. The network entity recreates the online activity of the online community and determines if there has been inappropriate online activity by one or more of the online users and, if there is inappropriate activity, allocates online resources to achieve a desired level of monitoring of an offending user.

[00124] In another embodiment, the functions of the network entity 1500 can be implemented in other entities, or across several network entities. For example, a moderation entity 108, or a server 106, or a user 102 can implement the operations of the network entity 1500. For example, a moderation entity can receive an indication of inappropriate activity and capture a time based history of the activity. The moderation entity can then send an indication of a desired level of monitoring to a network entity that adjusts a level of network resources allocated to monitoring the offending user.

[00125] Figure 16 is a flowchart illustrating an embodiment of detecting cheating in an online environment. Flow begins in block 1602 and a player in an online game session detects suspected cheating behavior by another online game player. Flow continues to block 1604 and game information is collected about the game play activity of players in the online game. The game information can include the game activity of the suspected cheating player, or all of the players, or any desired number of players. In one embodiment, the game information includes a period of the game during which the suspected cheating behavior occurred. The game information can include the actions of the game players, for example, where the players move, how fast they move, whether they seem to have more abilities or powers than are typical, and the like.

[00126] Flow then continues to block 1606. In block 1606, the game information is communicated to a game cheat monitoring entity.
Flow continues to block 1608 and the game cheat monitoring entity evaluates the game information to determine if there was cheating activity. If there was cheating activity, the game cheat monitoring entity can take appropriate action. In one embodiment, a reward is provided to a game player that observes cheating behavior and communicates the game information to the game cheat monitoring entity. There can also be a triggering mechanism that a player activates in response to detecting suspected cheating activity.
[00127] In one embodiment, capturing the game information of the online game session includes capturing online game session activity that occurred a predetermined amount of time before detecting the suspected cheating behavior. In an embodiment, capturing the game information includes associating an online game player's identity with the player's online activity.
[00128] Figure 17 is a flowchart illustrating another embodiment of detecting cheating in an online environment. Flow begins in block 1702 where an indication that a player in an online game session suspects that another player in the game session is engaging in cheating behavior is received. For example, a game cheat monitoring entity can receive the indication. Flow continues to block 1704 where the game cheat monitoring entity collects game information of game activity around a time of the suspected cheating behavior. For example, the game cheat monitoring entity can be a game server and collect game information. In another embodiment, the game cheat monitoring entity receives the game information. For example, the game cheat monitoring entity can receive game information from a game server, or from players in the online game, or other network entity, or any combination of entities. Flow continues to block 1706 and the game cheat monitoring entity recreates the game activity from the game information.

[00129] Flow continues to block 1708 and the game cheat monitoring entity evaluates the activities of the players in the game to determine if there was cheating behavior. If there is cheating behavior by one or more of the game players, the game cheat monitoring entity can take appropriate action against the cheating game players. One example of appropriate action is restricting access to the online game by the cheating game player.

[00130] Figure 18 is a block diagram of another embodiment of a moderation entity that can allocate resources such as online resources or other network resources. The moderation entity, also referred to as a network allocation moderation entity, can be a separate entity in communication with a network, such as the network 104 illustrated in Figures 1-4, or the operations of the network resource allocation moderation entity can be implemented in another network entity, such as, for example, a moderation entity 108, a server 106, a user 102, or other network entity, as shown in Figure 1. As shown in Figure 18, the network entity 1800 includes a network interface 1802. The network entity 1800 can receive an indication that an online user may be engaged in inappropriate activity.

[00131] The network entity 1800 also includes a processor 1804 that can capture a time based history of online activity of users in an online community when the indication is received. The network entity recreates the online activity of the online community and determines if there has been inappropriate online activity by one or more of the online users and, if there is inappropriate activity, allocates online resources to achieve a desired level of monitoring of an offending user.

[00132] In another embodiment, the functions of the network entity 1800 can be implemented in other entities, or across several network entities. For example, a moderation entity 108, or a server 106, or a user 102 can implement the operations of the network entity 1800. For example, a moderation entity can receive an indication of inappropriate activity and capture a time based history of the activity. The moderation entity can then send an indication of a desired level of monitoring that is used to adjust a level of network resources allocated to monitoring the offending user.

[00133] Figure 19 is a flow chart illustrating aspects of online or other network resource allocations. Flow begins in block 1902 where an indication that an online user may be engaging in inappropriate behavior is received.
Flow then continues to block 1904 and a time based history of an online session that includes the user's behavior is captured. Flow continues to block 1906 where the online activities of the session are recreated. In block 1906 it is determined if there was inappropriate activity by an offending online user. Flow then continues to block 1908. In block 1908 online resources are allocated for a desired level of monitoring of the offending online user.
[00134] In one embodiment, capturing the time based history of the online session includes capturing online session activity which occurred a predetermined amount of time before receiving the indication that an online user may be engaging in inappropriate behavior. In another embodiment, capturing the time based history comprises associating online user identities with their online activity.
[00135] In one embodiment, allocating online resources for a desired level of monitoring of the offending member includes assigning online resources to track the activities of the offending member. In an embodiment, a network resource allocation entity captures the time based history. In another embodiment, a moderation entity captures the time based history, recreates the online activity, and communicates the desired level of monitoring of the offending user to a network resource allocation entity that allocates network resources. In still another embodiment, the time based history is received from another network entity.

[00136] Figure 20 is a flow chart illustrating additional aspects of allocating online or other network resources. Flow begins in block 2002 where an indication of a triggering mechanism being activated by an online community member is received, indicating suspected inappropriate behavior by another online community member. Flow continues to block 2004 where a time based history of community members' online activity is received. Flow continues to block 2006 where the community activity from the time based history is recreated. Flow continues to block 2008 where activities of the community members are evaluated to determine if there was inappropriate activity and, if there is inappropriate activity by an offending community member, online resources are allocated to monitor community members that have been identified as engaging in inappropriate behavior.
[00137] Various embodiments may also be implemented primarily in hardware using, for example, components such as application specific integrated circuits ("ASICs") or field programmable gate arrays ("FPGAs"). Implementation of a hardware state machine capable of performing the functions described herein will also be apparent to those skilled in the relevant art. Various embodiments may also be implemented using a combination of both hardware and software.

[00138] The term "module" as used herein means, but is not limited to, a software or hardware component, such as an FPGA or an ASIC, which performs certain tasks. A module may advantageously be configured to reside on an addressable storage medium and configured to execute on one or more network enabled devices or processors. Thus, a module may include, by way of example, components, processes, functions, attributes, procedures, subroutines, segments of program code, drivers, firmware, microcode, circuitry, data, databases, data structures, tables, arrays, variables, and the like. The functionality provided for in the components and modules may be combined into fewer components and modules or further separated into additional components and modules. Additionally, the components and modules may advantageously be implemented to execute on one or more network enabled devices or computers.
[00139] Furthermore, those of skill in the art will appreciate that the various illustrative logical blocks, modules, circuits, and method steps described in connection with the above described figures and the embodiments disclosed herein can often be implemented as electronic hardware, computer software, or combinations of both. To clearly illustrate this interchangeability of hardware and software, various illustrative components, blocks, modules, circuits, and steps have been described above generally in terms of their functionality. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the overall system. Skilled persons can implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the invention. In addition, the grouping of functions within a module, block, circuit or step is for ease of description. Specific functions or steps can be moved from one module, block or circuit to another without departing from the invention.
[00140] Moreover, the various illustrative logical blocks, modules, and methods described in connection with the embodiments disclosed herein can be implemented or performed with a general purpose processor, a digital signal processor ("DSP"), an ASIC, an FPGA or other programmable logic device, discrete gate or transistor logic, discrete hardware components, or any combination thereof designed to perform the functions described herein. A general-purpose processor can be a microprocessor, but in the alternative, the processor can be any processor, controller, microcontroller, or state machine. A processor can also be implemented as a combination of computing devices, for example, a combination of a DSP and a microprocessor, a plurality of microprocessors, one or more microprocessors in conjunction with a DSP core, or any other such configuration.

[00141] Additionally, the steps of a method or process described in connection with the embodiments disclosed herein can be embodied directly in hardware, in a software module executed by a processor, or in a combination of the two. A software module can reside in RAM memory, flash memory, ROM memory, EPROM memory, EEPROM memory, registers, a hard disk, a removable disk, a CD-ROM, or any other form of storage medium, including a network storage medium. An exemplary storage medium can be coupled to the processor such that the processor can read information from, and write information to, the storage medium. In the alternative, the storage medium can be integral to the processor. The processor and the storage medium can also reside in an ASIC.
[00142] While the above is a complete description of the preferred embodiment of the present invention, it is possible to use various alternatives, modifications and equivalents. Therefore, the scope of the present invention should be determined not with reference to the above description but should, instead, be determined with reference to the appended claims, along with their full scope of equivalents. Any feature described herein, whether preferred or not, may be combined with any other feature described herein, whether preferred or not. Thus, the invention is not intended to be limited to the embodiment shown herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.

Claims

[00143] WHAT IS CLAIMED IS:
1. A method for community moderation of an online session, the method comprising: observing inappropriate behavior by an online user; activating a triggering mechanism in response to the inappropriate behavior; capturing a time based history of the online session; and transmitting the time based history to a moderation entity.
2. The method of claim 1, wherein the time based history of the online session comprises online session activity that occurred a predetermined amount of time before the triggering mechanism is pressed.
3. The method of claim 1, wherein the time based history comprises associating online user identities with their online activity.
4. The method of claim 1, further comprising providing a reward to a user that observes inappropriate behavior and presses the triggering mechanism.
5. The method of claim 1, wherein activating a triggering mechanism comprises pressing a panic button.
6. A method of moderating activity in an online community, the method comprising: receiving an indication of a triggering mechanism being activated by a community member in response to inappropriate activity by another community member; receiving a time based history of community members' activity around a time of the triggering mechanism being activated; recreating the community activity from the time based history; and evaluating activities of the community members to determine if there was inappropriate activity and, if there is inappropriate activity by an offending community member, taking appropriate action against the offending community member.
7. The method of claim 6, wherein taking appropriate action against the offending community member comprises one or more of issuing a warning to the offending community member, limiting available online options to the offending community member, and restricting access to the online community by the offending community member.
8. An online community comprising: at least two users that communicate in the online community, wherein a first user in the online community observes inappropriate behavior by a second user in the online community and presses a panic button in response to the inappropriate behavior, the pressing of the panic button initiating storing a time based history of online community activity, the time based history covering a period that extends a desired duration before the pressing of the panic button and a desired duration after pressing the panic button; and a moderation entity that receives the time based history and recreates the online activity to determine if there was inappropriate activity by one of the users, and if there was inappropriate activity by one of the users, taking appropriate action against that user.
9. The online community of claim 8, wherein taking appropriate action comprises one or more of issuing a warning to the offending community member, limiting available online options to the offending community member, and restricting access to the online community by the offending community member.
10. A network enabled device comprising: a triggering mechanism; a processor that captures a time based history of online activity of users in an online community when the triggering mechanism is activated; a network interface that transmits the time based history to a moderation entity, the moderation entity determines if there has been inappropriate online activity by one of the online users.
11. A method for moderating cheating in an online game, the method comprising: playing an online game; detecting suspected cheating behavior by an online game player; collecting game information about the activity of all players in the online game, the game information including the suspected cheating behavior; and communicating the game information to a game cheat monitoring entity that evaluates the game information to determine if there was cheating activity and if there was cheating activity, taking appropriate action.
12. The method of claim 11, wherein capturing the game information of the online game session comprises capturing online game session activity that occurred a predetermined amount of time before detecting the suspected cheating behavior.
13. The method of claim 11, wherein capturing the game information comprises associating an online game player's identity with the player's online activity.
14. The method of claim 11, further comprising providing a reward to a game player that observes cheating behavior and communicates the game information to the game cheat monitoring entity.
15. The method of claim 11, further comprising activating a triggering mechanism in response to detecting suspected cheating activity.
16. A method of moderating cheating activity in an online game community, the method comprising: receiving an indication that a player in an online game session suspects that another player in the game session is engaging in cheating behavior; collecting game information of game activity around a time of the suspected cheating behavior; recreating the game activity from the game information; and evaluating activities of the players in the game to determine if there was cheating behavior and if there is cheating behavior by one of the game players, taking appropriate action against the cheating game player.
17. The method of claim 16, wherein taking appropriate action against the cheating game player comprises restricting access to the online game by the cheating game player.
18. An online game session comprising: at least two players that communicate in the online game session, wherein a first player in the online game session detects suspected cheating behavior by a second player in the online game session, the first player communicates an indication to a game cheat monitoring entity that there is suspected cheating behavior; and the game cheat monitoring entity, upon receiving an indication that there is cheating behavior, collects game information of players in the online game session, the game information includes a time period that extends a desired duration before and after receiving the indication, the game cheat monitoring entity uses the game information to recreate online game activity of the players to determine if there was cheating activity by one of the players, and if there was cheating activity by one of the players, the game cheat monitoring entity takes appropriate action.
19. The online game session of claim 18, wherein taking appropriate action comprises restricting access to the online game session by the cheating player.
20. A game cheat monitoring entity comprising: a network interface that receives an indication that there is cheating behavior; a processor that collects game information of players in the online game session, the game information includes a time period that extends a desired duration before and after receiving the indication, the processor uses the game information to recreate online game activity of players in the game session to determine if there was cheating activity by one or more of the players, and if there was cheating activity by one or more of the players, the game cheat monitoring entity takes appropriate action.
21. A method of allocating online resources to monitor online community members that have been identified as engaging in inappropriate behavior, the method comprising: receiving an indication that an online user may be engaging in inappropriate behavior; capturing a time based history of an online session that includes the user's behavior; recreating the online activity and determining if there was inappropriate activity by an offending online user; and allocating online resources for a desired level of monitoring of the offending online user.
22. The method of claim 21, wherein capturing the time based history of the online session comprises capturing online session activity that occurred a predetermined amount of time before receiving the indication that an online user may be engaging in inappropriate behavior.
23. The method of claim 21, wherein capturing the time based history comprises associating online user identities with their online activity.
24. The method of claim 21, wherein allocating online resources for a desired level of monitoring of the offending online user comprises assigning online resources to track the activities of the offending online user.
25. The method of claim 21, further comprising a network resource allocation entity that captures the time based history.
26. The method of claim 21, further comprising a moderation entity that captures the time based history, recreates the online activity, and communicates the desired level of monitoring of the offending user to a network resource allocation entity that allocates network resources.
27. A method of allocating online resources to monitor online community members that have been identified as engaging in inappropriate behavior, the method comprising: receiving an indication of a triggering mechanism being activated by an online community member in response to suspected inappropriate activity by another online community member; receiving a time based history of community members' online activity around a time of the triggering mechanism being activated; recreating the community activity from the time based history; and evaluating activities of the community members to determine if there was inappropriate activity and, if there is inappropriate activity by an offending community member, allocating online resources to monitor community members that have been identified as engaging in inappropriate behavior.
28. An online community with online resources that are allocated to monitor members of the online community, the online community comprising: at least two users that communicate in the online community, wherein a first user in the online community observes suspected inappropriate behavior by one or more other users in the online community, the first user presses a panic button in response to the inappropriate behavior, the pressing of the panic button initiating storing a time based history of online community activity, the time based history covering a period that extends a desired duration before the pressing of the panic button and a desired duration after pressing the panic button; a moderation entity that receives the time based history and recreates the online activity to determine if there was inappropriate activity by one of the users, and if there was inappropriate activity by one of the users, determining a desired level of monitoring to track an offending user's activity; and a network resource allocation entity that allocates online resources to track the activities of the offending user.
29. A network entity comprising: a network interface that receives an indication that an online user may be engaged in inappropriate activity; a processor that captures a time based history of online activity of users in an online community when the indication is received, recreates the online activity of the online community and determines if there has been inappropriate online activity by one or more of the online users and, if there is inappropriate activity, allocates online resources to achieve a desired level of monitoring of an offending user.
30. A method for improving the integrity of an application, the method comprising: interacting with the application; observing unexpected operation of the application; activating a triggering mechanism in response to the unexpected operation; capturing a time-based history of the application session; and communicating the time-based history to a network entity for evaluation.
31. The method of claim 30, wherein the application comprises an online game.
32. The method of claim 31, wherein capturing the time-based history of the application session comprises capturing online game session activity that occurred a predetermined amount of time before the triggering mechanism is pressed.
33. The method of claim 31, further comprising testing the online game.
34. The method of claim 30, wherein activating a triggering mechanism comprises pressing a panic button.
35. The method of claim 30, wherein observing unexpected operation of the application comprises observing a glitch in the operation of the application.
36. The method of claim 30, wherein the network entity comprises a server.
37. The method of claim 30, wherein communicating the time-based history comprises transmitting the time-based history over a local area network.
38. The method of claim 30, wherein communicating the time-based history comprises transmitting the time-based history over a wide area network.
39. The method of claim 38, wherein the wide area network comprises the Internet.
40. A method of testing an online game, the method comprising: receiving an indication of a triggering mechanism being activated in response to unexpected operation of an online game; receiving a time-based history of online game activity around a time of the triggering mechanism being activated; recreating game activity from the time-based history; and evaluating the game activity to determine if there is a malfunction in the operation of the game.
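A minimal sketch of the recreation and evaluation steps in claim 40, assuming each history record carries a tick counter and a health field; the malfunction checks are illustrative invariants, not checks stated in the application.

    def recreate_game_activity(time_based_history):
        # Recreate the session by ordering captured records as they occurred.
        return sorted(time_based_history, key=lambda record: record["tick"])

    def evaluate_for_malfunction(time_based_history):
        # Replay the recreated activity and report records that violate
        # assumed invariants, for later troubleshooting.
        findings = []
        previous_tick = None
        for record in recreate_game_activity(time_based_history):
            if previous_tick is not None and record["tick"] == previous_tick:
                findings.append(("duplicate tick", record))
            if record.get("health", 0) < 0:
                findings.append(("negative health value", record))
            previous_tick = record["tick"]
        return findings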
41. The method of claim 40, further comprising troubleshooting the malfunction in the operation of the game.
42. An online game test unit comprising: a triggering mechanism; a processor that captures a time-based history of game activity when the triggering mechanism is activated; and a network interface that transmits the time-based history to a network entity, wherein the network entity determines if there is a malfunction in the operation of the online game.
EP08843017A 2007-10-26 2008-10-20 On-line monitoring of resources Ceased EP2227301A4 (en)

Applications Claiming Priority (5)

Application Number Priority Date Filing Date Title
US11/925,570 US7865590B2 (en) 2007-10-26 2007-10-26 Community based moderation in on-line sessions
US11/927,357 US8490199B2 (en) 2007-10-29 2007-10-29 Moderation of cheating in on-line gaming sessions
US11/929,617 US8204983B2 (en) 2007-10-30 2007-10-30 Allocation of on-line monitoring resources
US11/932,863 US20090111583A1 (en) 2007-10-31 2007-10-31 Systems and method for improving application integrity
PCT/US2008/080527 WO2009055342A1 (en) 2007-10-26 2008-10-20 On-line monitoring of resources

Publications (2)

Publication Number Publication Date
EP2227301A1 true EP2227301A1 (en) 2010-09-15
EP2227301A4 EP2227301A4 (en) 2012-02-29

Family

ID=40579943

Family Applications (1)

Application Number Title Priority Date Filing Date
EP08843017A Ceased EP2227301A4 (en) 2007-10-26 2008-10-20 On-line monitoring of resources

Country Status (5)

Country Link
EP (1) EP2227301A4 (en)
JP (1) JP5037692B2 (en)
KR (1) KR101390214B1 (en)
CN (1) CN101909711B (en)
WO (1) WO2009055342A1 (en)

Families Citing this family (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7920983B1 (en) 2010-03-04 2011-04-05 TaKaDu Ltd. System and method for monitoring resources in a water utility network
US9245177B2 (en) * 2010-06-02 2016-01-26 Microsoft Technology Licensing, Llc Limiting avatar gesture display
CN101931534A (en) * 2010-08-30 2010-12-29 中兴通讯股份有限公司 Management method and device of operator resource usage license
US8814697B2 (en) * 2011-04-19 2014-08-26 Sony Computer Entertainment America Llc Method and apparatus for use in preserving a game state
EP2525587B1 (en) * 2011-05-17 2017-07-05 Alcatel Lucent Method for streaming video content, node in a network for monitoring video content streaming
JP2013111106A (en) * 2011-11-25 2013-06-10 Nintendo Co Ltd Communication system, communication program, information processing apparatus, server, and communication method
US8341106B1 (en) 2011-12-07 2012-12-25 TaKaDu Ltd. System and method for identifying related events in a resource network monitoring system
US9053519B2 (en) 2012-02-13 2015-06-09 TaKaDu Ltd. System and method for analyzing GIS data to improve operation and monitoring of water distribution networks
US10242414B2 (en) 2012-06-12 2019-03-26 TaKaDu Ltd. Method for locating a leak in a fluid network
CN104836714A (en) * 2014-02-08 2015-08-12 湖北金像无人航空科技服务有限公司 Method of avoiding network chess game cheating through text chatting
JP5936748B1 * 2015-05-20 2016-06-22 Cygames, Inc. Information processing system, server and program, and terminal and program
WO2017191696A1 2016-05-06 2017-11-09 Sony Corporation Information processing system and information processing method
JP6836379B2 * 2016-12-05 2021-03-03 COLOPL, Inc. Information processing method, device, and program for causing a computer to execute the information processing method
US10994209B2 (en) * 2017-11-27 2021-05-04 Sony Interactive Entertainment America Llc Shadow banning in social VR setting
JP6721727B1 * 2019-01-08 2020-07-15 SoftBank Corp. Information processing apparatus control program, information processing apparatus control method, and information processing apparatus
JP2022541019A * 2019-07-15 2022-09-21 Lonza Walkersville, Inc. Process control system for automated cell engineering systems
KR20210115442A 2020-03-13 2021-09-27 Hyperconnect Inc. Report evaluation device and operating method thereof
US10817961B1 (en) * 2020-06-10 2020-10-27 Coupang Corp. Computerized systems and methods for tracking dynamic communities

Family Cites Families (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7881944B2 (en) * 2002-05-20 2011-02-01 Microsoft Corporation Automatic feedback and player denial
JP2004021549A (en) * 2002-06-14 2004-01-22 Hitachi Information Systems Ltd Network monitoring system and program
US7169050B1 (en) * 2002-08-28 2007-01-30 Matthew George Tyler Online gaming cheating prevention system and method
US7287052B2 (en) * 2002-11-09 2007-10-23 Microsoft Corporation Challenge and response interaction between client and server computing devices
KR100932483B1 2002-11-20 2009-12-17 LG Electronics Inc. Mobile communication terminal and avatar remote control method using the same
CN1558574A (en) * 2004-02-05 2004-12-29 浙江大学 Method and system for realizing wireless industrial monitoring by means of cell phones
JP4385863B2 (en) * 2004-06-23 2009-12-16 株式会社セガ Online game fraud detection method
US7165050B2 (en) 2004-09-20 2007-01-16 Aaron Marking Media on demand via peering
CN1783068B (en) * 2005-09-09 2010-04-28 浙江大学 Method for implementing fault diagnosis and monitoring database service
US7753795B2 (en) 2006-03-20 2010-07-13 Sony Computer Entertainment America Llc Maintaining community integrity
US8771061B2 (en) * 2006-03-20 2014-07-08 Sony Computer Entertainment America Llc Invalidating network devices with illicit peripherals

Patent Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6256663B1 (en) * 1999-01-22 2001-07-03 Greenfield Online, Inc. System and method for conducting focus groups using remotely loaded participants over a computer network
US20040053675A1 (en) * 2002-09-13 2004-03-18 Nguyen Binh T. Method and apparatus for independently verifying game outcome
WO2004071601A2 (en) * 2003-02-11 2004-08-26 Waterleaf Limited Collusion detection
US20040242321A1 (en) * 2003-05-28 2004-12-02 Microsoft Corporation Cheater detection in a multi-player gaming environment
US20050137016A1 (en) * 2003-12-17 2005-06-23 Multimedia Games, Inc. Method, apparatus, and program product for detecting money laundering activities in gaming systems
US20060205489A1 (en) * 2004-12-30 2006-09-14 Jerome Carpenter Methods for game player identification
US20060235966A1 (en) * 2005-04-15 2006-10-19 Imoderate Research Technologies Predefined live chat session
US20070168511A1 (en) * 2006-01-17 2007-07-19 Brochu Jason M Method and apparatus for user moderation of online chat rooms
FR2896648A1 (en) * 2006-01-23 2007-07-27 France Telecom Multimedia conversation system e.g. television, has multi-point control unit establishing audio and video communication between computer and transmission unit, where computer is provided with teleconferencing bridge and moderation interface
US20070232398A1 (en) * 2006-03-31 2007-10-04 Aikin Jeffrey C System and method for detecting collusion in online gaming via conditional behavior

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of WO2009055342A1 *

Also Published As

Publication number Publication date
WO2009055342A1 (en) 2009-04-30
JP2011502307A (en) 2011-01-20
JP5037692B2 (en) 2012-10-03
KR101390214B1 (en) 2014-06-26
CN101909711B (en) 2014-12-24
KR20100076046A (en) 2010-07-05
EP2227301A4 (en) 2012-02-29
CN101909711A (en) 2010-12-08

Similar Documents

Publication Publication Date Title
US8490199B2 (en) Moderation of cheating in on-line gaming sessions
US7865590B2 (en) Community based moderation in on-line sessions
EP2227301A1 (en) On-line monitoring of resources
US20090111583A1 (en) Systems and method for improving application integrity
US8204983B2 (en) Allocation of on-line monitoring resources
US10092845B2 (en) Detecting lag switch cheating in game
US8359632B2 (en) Centralized account reputation
CN100362805C (en) Multifunctional management system for detecting erotic images and unhealthy information in network
US8553867B2 (en) User-defined system-enforced session termination in a unified telephony environment
CN105323221B Method and system for anti-cheating in online games
CN111465016A (en) Control method and device
KR100813680B1 (en) Method, Server and System for Providing Game Management Right to User in Game Service
CN111478860A (en) Network control method, device, equipment and machine readable storage medium
CN112887105A (en) Conference security monitoring method and device, electronic equipment and storage medium
CN108014499A Application parameter adjustment method and device
US20220184501A1 (en) Video game center for a controlled environment facility
KR20050076003A (en) Sysem and its method for internet game usage analysis and access optional control
KR100635552B1 (en) Online game broadcasting system
Aime et al. A wireless distributed intrusion detection system and a new attack model
CN111932290A (en) Request processing method, device, equipment and storage medium
TWI336631B (en) Online game managing method and online game managing device using the same
CN111338502B Method and device for detecting touch abnormality, storage medium and electronic device
KR100443557B1 (en) Internet game service monitoring system and method thereof
CN115037646A (en) Method for managing network users and edge computing node
CN114225423A (en) Game report processing method, device, equipment and storage medium

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

17P Request for examination filed

Effective date: 20100526

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MT NL NO PL PT RO SE SI SK TR

AX Request for extension of the european patent

Extension state: AL BA MK RS

DAX Request for extension of the european patent (deleted)
RAP1 Party data changed (applicant data changed or rights of an application transferred)

Owner name: SONY COMPUTER ENTERTAINMENT AMERICA LLC

A4 Supplementary search report drawn up and despatched

Effective date: 20120127

RIC1 Information provided on ipc code assigned before grant

Ipc: A63F 9/24 20060101AFI20120123BHEP

17Q First examination report despatched

Effective date: 20121106

REG Reference to a national code

Ref country code: DE

Ref legal event code: R003

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION HAS BEEN REFUSED

18R Application refused

Effective date: 20130927