WO2016149929A1 - Method, apparatus and computer program product for identifying a vulnerable friend for privacy protection in a social network - Google Patents


Info

Publication number
WO2016149929A1
WO2016149929A1 (PCT/CN2015/075102)
Authority
WO
WIPO (PCT)
Prior art keywords
user
probability
friend
privacy
object user
Application number
PCT/CN2015/075102
Other languages
French (fr)
Inventor
Ye Tian
Yunjuan YANG
Wendong Wang
Original Assignee
Nokia Technologies Oy
Navteq (Shanghai) Trading Co., Ltd.
Application filed by Nokia Technologies Oy and Navteq (Shanghai) Trading Co., Ltd.
Priority to PCT/CN2015/075102
Publication of WO2016149929A1

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06Q: INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q50/00: Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
    • G06Q50/01: Social networking
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00: Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/60: Protecting data
    • G06F21/62: Protecting access to data via a platform, e.g. using keys or access control rules
    • G06F21/6218: Protecting access to data via a platform, e.g. using keys or access control rules, to a system of files or objects, e.g. local or distributed file system or database
    • G06F21/6245: Protecting personal data, e.g. for financial or medical purposes
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06Q: INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00: Administration; Management
    • G06Q10/10: Office automation; Time management

Definitions

  • Embodiments of the disclosure generally relate to data processing and, more particularly, to privacy protection in online social networks.
  • the private information of a user may include, for example, information about travel arrangements, luxury consumption, illness records or drunk-driving records.
  • the user generally does not wish such private information to be disseminated to other people. Accordingly, privacy protection is one of the major concerns in the face of the fast information propagation in social networks.
  • In general, a user's privacy protection can be configured through the privacy settings in a social network application.
  • However, many users configure their privacy settings based on their experience. This may lead to inadequate privacy protection. For example, a user may not precisely know whom he can trust, or which friends are more vulnerable or contribute more toward the dissemination of his privacy information in his social network.
  • Moreover, the existing social network applications or platforms usually do not provide suggestions or recommendations regarding privacy settings for a user. As a result, privacy information is posted every day in social networks at the risk of privacy leakage, while users are unaware that their privacy information is divulged. Therefore, it is desirable to provide an improved technical solution for privacy protection.
  • a method is provided for identifying a vulnerable friend for privacy protection of an object user in the object user's online social network, wherein the object user has a plurality of friends in the social network.
  • Said method comprises: for at least some of the object user's friends, estimating their contributions towards dissemination of the object user's privacy information based on their behaviors in the social network; and identifying the vulnerable friend based on the estimated contributions towards dissemination of the privacy information.
  • an apparatus comprising means configured to carry out the above-described method.
  • a computer program product embodied on a distribution medium readable by a computer and comprising program instructions which, when loaded into a computer, execute the above-described method.
  • a non-transitory computer readable medium having encoded thereon statements and instructions to cause a processor to execute the above-described method.
  • an apparatus for identifying a vulnerable friend for privacy protection of an object user in the object user’s online social network wherein the object user has a plurality of friends in the social network.
  • Said apparatus comprises: an estimator configured to, for at least some of the object user’s friends, estimate their contributions towards dissemination of the object user’s privacy information based on their behaviors in the social network; and an identifying element configured to identify the vulnerable friend based on the estimated contributions towards dissemination of the privacy information.
  • Figure 1 shows a schematic operating environment that can utilize some embodiments of the present disclosure
  • Figure 2 shows a schematic diagram of a directed graph
  • Figure 3 is a schematic diagram depicting a process of privacy information dissemination within privacy-receiving-disseminating (PRD) model according to an embodiment of the present disclosure
  • Figure 4 is a simplified block diagram illustrating an apparatus according to an embodiment of the present disclosure
  • Figure 5 is a simplified block diagram illustrating an apparatus according to another embodiment of the present disclosure.
  • Figure 6 is a flow chart of a process for identifying a vulnerable friend of an object user in a social network according to an embodiment of the present disclosure.
  • an aspect of the disclosure includes providing a technical solution for identifying one or more vulnerable friends of an object user in a social network.
  • Figure 1 shows a schematic operating environment 100 in which some embodiments of the present disclosure can be implemented.
  • the operating environment 100 may comprise one or more social network platforms or applications 111-11n each operably connected to an identifying apparatus 110 through one or more networks.
  • the social network platforms or applications 111-11n can be any kind of social network platform or application capable of running on any type of computing device, such as cloud computers, distributed computing systems, virtual computers, smartphones, tablets, laptops, servers, thin clients, set-top boxes and PCs.
  • the social network platforms or applications 111-11n may include, but are not limited to, LinkedIn, Facebook, Twitter, YouTube, WeChat, QQ space and WEIBO.
  • the social network platforms or applications 111-11n can adopt a client-server architecture, a distributed architecture, a peer-to-peer architecture or another appropriate architecture.
  • the social network platforms or applications 111-11n may maintain a social network graph containing all the users in the social network platforms or applications, and/or respective social network graphs centered on each user in the social network, or other suitable social network graphs.
  • the social network platforms or applications 111-11n may also store users' interaction information and other useful information, such as user profiles and privacy settings or the like.
  • the operating environment 100 may also comprise an identifying apparatus 110, which can be implemented in the form of hardware, software or a combination thereof on any computing platform, including, but not limited to, cloud computers, distributed computing systems, virtual computers, smartphones, tablets, laptops, servers, thin clients, set-top boxes and PCs.
  • the identifying apparatus 110 may run any kind of operating system including, but not limited to, Windows, Linux, UNIX, Android, iOS and their variants. It is noted that although one identifying apparatus is shown in Figure 1, the operating environment 100 may comprise components that are physically separated but operably work together. For example, the identifying apparatus may be implemented as a distributed system.
  • the operating environment 100 may comprise a network 108, such as any wired or wireless network or combination thereof, including, but not limited to, a wireless cellular telephone network (such as a global system for mobile communications (GSM) network, 3rd generation (3G) network, 3.5th generation (3.5G) network, 4th generation (4G) network, universal mobile telecommunications system (UMTS) or code division multiple access (CDMA) network), a wireless local area network (WLAN) such as defined by any of the Institute of Electrical and Electronics Engineers (IEEE) 802.x standards, an Ethernet local area network, a token ring local area network, a wide area network, and the Internet.
  • the network 108 may include one or more communication devices for relaying or routing the information to be exchanged among the identifying apparatus 110 and the one or more social network platforms or applications 111-11n.
  • the identifying apparatus 110 and the one or more social network platforms or applications 111-11n may exchange information directly through communication media such as wireline media or wireless media.
  • the identifying apparatus 110 may be integrated with each of the one or more social network platforms or applications 111-11n, or implemented as a separate apparatus serving them, or any combination thereof.
  • both the identifying apparatus 110 and the social network platforms or applications 111-11n may be capable of operating a connectivity program.
  • the connectivity program may allow the identifying apparatus 110 and the social network platforms or applications 111-11n to transmit and receive web content, such as user privacy information, according to a protocol such as Wireless Application Protocol (WAP), Hyper Text Transfer Protocol (HTTP), Hyper Text Transfer Protocol over Secure Socket Layer (HTTPS), Transmission Control Protocol/Internet Protocol (TCP/IP) and/or User Datagram Protocol (UDP) and/or the like.
  • users in a social network can form one or more social network graphs depending on their relationships.
  • a user may be represented as a node in a social network graph.
  • the terms "user" and "node" are often used interchangeably in the present disclosure.
  • the social relationship or social tie between a user and one of his followers or friends may be represented as a link in the graph.
  • a schematic graph 200 of a social network centered with the object user “o” is shown in Figure 2.
  • E_o is the set of directed edges e⟨u, v⟩ ∈ E_o, each pointing from a user u to his follower v.
  • U_o is the set of nodes contained in the friend-network.
  • An arrow line represents a following relationship between two users. For example, users ν_1, ν_2, ν_3 and ν_4 are four friends of the object user o.
  • the friends of a user are those users with whom the user has a direct connection in the social network graph 200, namely they can directly exchange information without an intermediate user. For example, user f's friends are users j and e in the graph 200.
  • the route originating from one user, such as user o, to another user, such as user h, may be any of a number of paths, such as o-ν_1-h, o-ν_1-v-h, or o-ν_1-v-i-h in the graph 200. Therefore, information from user o may reach user h through different paths. Additionally, there may be loops in the social network graph, such as the loops h-i-v-h and h-i-v-ν_1-h in the graph 200. Thus, the route originating from one user, such as user o, to another user, such as user i, may be a loop route, such as the path o-ν_1-h-v-ν_1-h-i.
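The loop-free routes discussed above can be enumerated with a plain depth-first search. The following is a minimal sketch; the graph is a hypothetical fragment loosely modeled on Figure 2 (the node names v1, h, v, i and the edge set are assumptions for illustration, not the patent's actual graph).

```python
def simple_paths(graph, source, target, path=None):
    """Yield every loop-free (simple) path from source to target."""
    path = (path or []) + [source]
    if source == target:
        yield path
        return
    for succ in graph.get(source, []):
        if succ not in path:  # revisiting a node on the path would create a loop
            yield from simple_paths(graph, succ, target, path)

# Hypothetical follower graph: an edge u -> v means v follows u.
graph = {
    "o":  ["v1"],
    "v1": ["h", "v"],
    "v":  ["h", "i", "v1"],
    "h":  ["i"],
    "i":  ["h", "v"],
}

routes = [tuple(p) for p in simple_paths(graph, "o", "h")]
# Yields the three loop-free paths o-v1-h, o-v1-v-h and o-v1-v-i-h,
# while loops such as h-i-v-h are never traversed.
```

Skipping any node already on the current path is what excludes loop routes such as o-v1-h-v-v1-h-i from the enumeration.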
  • the social network graph 200 and other information used by the identifying apparatus 110, such as the interaction information and the UGC, can be stored in a centralized or distributed database, such as an RDBMS, SQL or NoSQL database, or as one or more files on any storage medium, such as RAM, HDD, diskette, CD, DVD, Blu-ray Disc, EEPROM or SSD.
  • the information posted by a user may be classified into two kinds, namely public information and privacy information. While public information may be shared with every follower, privacy information should be accessible only to the user himself or to certain particular followers.
  • information dissemination in a social network often proceeds as follows. An object user posts content, such as privacy information; then a follower (friend) of the object user, if allowed, may obtain this content by accessing the object user's posts or through an information push service. The friend may then forward or post this information to his own followers (friends), and so on.
  • FIG. 3 is a schematic diagram 300 depicting a process of privacy information dissemination within privacy-receiving-disseminating (PRD) model according to an embodiment of the present disclosure.
  • the PRD model is a discrete-time dissemination model based on the classical cascade model. Its parameters have definite practical significance, so it can closely imitate the real process of information propagation in a social network.
  • each node is associated with two correlative probabilities, namely a receiving probability (i.e., the second probability) and a disseminating or diffusing probability (i.e., the first probability), which impact the dissemination process of privacy information m_o posted by an object user o.
  • the receiving probability of a user may be calculated based on at least the disseminating probability of an upper-stream user in the disseminating path and that user’s concern frequency towards the upper-stream user.
  • the disseminating path may be a loop-free path originating from the object user and passing through the user.
  • the concern frequency λ_uv of user v towards his friend u may denote the frequency with which user v expresses interest towards u or views u's personal page, etc. It can readily be derived from the frequency of interactive behaviors between them.
  • the concern frequency λ_uv is illustrated in the following equation:
  • the receiving probability that user v obtains m_o from his friend u is illustrated in the following equation:
  • the disseminating probability D_u(t_n-1) of u indicates the possibility that u has forwarded m_o before, which will be described in detail in the following.
  • t_n-1 is a discrete time.
  • the receiving probability R_uv(t_n) can also be calculated by using any other suitable method, and the present embodiment imposes no limitation on it.
  • the receiving probability of user v may be asynchronously updated during a certain period. Let user v learn m_o from each of his friends u at different times, and let t_n be the latest of these times. The total receiving probability of user v may be illustrated in the following equation:
  • DF u is the set of the friends of u who have a directed edge with u
  • the min operator is intended to prevent the probability value from exceeding 1. It is noted that the total receiving probability can also be calculated by using any other suitable method, and the present embodiment imposes no limitation on it.
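A minimal numeric sketch of the capped total described above. Since the equations themselves are not reproduced in this excerpt, the per-friend form R_uv = D_u × λ_uv and the additive accumulation are assumptions made only for illustration; the min operator's clipping role is as the text states.

```python
def receiving_probability(d_upstream, concern):
    """R_uv: chance that v obtains m_o from friend u (assumed product form:
    upper-stream disseminating probability times concern frequency)."""
    return d_upstream * concern

def total_receiving_probability(per_friend_probs):
    """Accumulate per-friend probabilities and cap at 1 with the min operator."""
    return min(1.0, sum(per_friend_probs))

contribs = [receiving_probability(0.6, 0.5),   # from friend a: 0.30
            receiving_probability(0.9, 0.8)]   # from friend b: 0.72
total = total_receiving_probability(contribs)  # min(1, 1.02) = 1.0
```

Without the min operator the accumulated value here would be 1.02, which is not a valid probability; the cap is what keeps the total well-formed.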
  • the disseminating probability of a user may be calculated based at least partly on the user’s privacy-protection consciousness, privacy leaking tendency and attitude of worship towards the object user.
  • a user who does not possess enough consciousness to protect himself is far less likely to safeguard the privacy of other users.
  • the privacy-protection consciousness of user v, denoted I_v, may be quantified by the relative non-accessibility of personal profile items in v's privacy settings. It may be illustrated in the following equation:
  • n is the total number of personal profile items provided by the privacy settings of a social network application.
  • w_i is the weight of the i-th personal profile item, which reflects its relative sensitivity and is defined as the percentage of users who set the i-th personal profile item non-accessible among all users.
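A small numeric sketch of this quantification. The normalisation by the total weight is an assumption (the equation itself is not reproduced in this excerpt), and the profile fields and weights below are hypothetical.

```python
def privacy_consciousness(weights, hidden):
    """I_v: weighted share of profile items user v keeps non-accessible,
    normalised by the total weight (assumed form)."""
    covered = sum(w for w, h in zip(weights, hidden) if h)
    return covered / sum(weights)

# w_i: fraction of all users who hide field i, a proxy for its sensitivity.
weights = [0.9, 0.5, 0.1]        # hypothetical fields: phone, email, birthday
hidden  = [True, False, True]    # user v hides phone and birthday
i_v = privacy_consciousness(weights, hidden)   # (0.9 + 0.1) / 1.5
```

A user who hides only insensitive fields thus scores lower than one who hides the fields most users consider sensitive.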
  • the degree of privacy leaking tendency determines the extent to which a user disseminates certain privacy information. Since behavior tendency is an inherent personality trait, it can be assessed from abundant historical records of online behaviors. In an example, it is supposed that the leaking tendency towards privacy information imposes the same influence on both posting and forwarding behaviors.
  • the privacy leaking tendency L_v of user v may be estimated as the average leakage probability of privacy information that user v has posted himself or forwarded from others. It may be illustrated in the following equation:
  • the attitude of worship towards the object user, such as user o, can be reflected by o's authority A_o.
  • the authority A_o may be illustrated in the following equation:
  • the disseminating probability that user v forwards privacy information m_o of an object user o can be illustrated in the following equation:
  • d_ov is the diameter (i.e., the topological distance of the shortest route) between the object user o and the user v; the total receiving probability of user v is also involved; t_n is the latest of a series of discrete times. It is noted that the disseminating probability can also be calculated by using any other suitable method, and the present embodiment imposes no limitation on it.
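The factors named above can be combined in code. Equation (2) itself is not reproduced in this excerpt, so the multiplicative combination and the 1/d_ov distance decay below are illustrative assumptions only; all input values are hypothetical.

```python
def disseminating_probability(i_v, l_v, a_o, total_recv, d_ov):
    """Assumed-form D_v: chance that v forwards the privacy information m_o.
    Higher privacy-protection consciousness i_v lowers the result (the
    (1 - i_v) factor), while leaking tendency l_v, the object user's
    authority a_o and the total receiving probability raise it; the
    shortest-route distance d_ov damps it."""
    return (1.0 - i_v) * l_v * a_o * total_recv / d_ov

d_v = disseminating_probability(i_v=0.4, l_v=0.5, a_o=0.8,
                                total_recv=0.9, d_ov=2)
# (1 - 0.4) * 0.5 * 0.8 * 0.9 / 2 = 0.108
```

The sign of each factor's influence follows the text: a user with little privacy-protection consciousness, a strong leaking tendency, and strong worship of the object user is the most likely to forward m_o.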
  • Time t_0 is the initial time when an object user o posts the privacy information m_o. t_1, t_2, ... are a series of discrete times that represent the successive rounds of dissemination from node to node.
  • each of the object user o's friends ν_i ∈ DF_o may learn m_o, e.g., from o's posts, with a receiving probability, where DF_o is the set of the object user o's friends who have a directed edge with o.
  • the receiving probabilities and disseminating probabilities of nodes may be asynchronously updated as m_o arrives via disparate routes at different times.
  • the dissemination process continues until the receiving probability of each terminal node of the different routes, denoted ν_i, falls below a first threshold ε at its respective time, i.e., each terminal node ν_i separately satisfies the threshold condition.
  • the first threshold ε is small enough that each terminal node ν_i is scarcely likely to learn m_o at that time.
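The round-by-round process can be sketched as a discrete-time cascade. This is a simplified sketch only: treating a node's disseminating probability as equal to its current receiving probability, and using edge weights in the role of the concern frequency, are assumptions; the graph and weights are hypothetical.

```python
def simulate_dissemination(graph, concern, source, epsilon):
    """PRD-style cascade sketch: propagation along a branch stops once the
    probability computed for the next node drops below the first threshold."""
    recv = {source: 1.0}          # the object user has posted m_o
    frontier = [source]
    while frontier:               # one loop iteration per discrete round t_n
        nxt = []
        for u in frontier:
            for v in graph.get(u, []):
                p = recv[u] * concern[(u, v)]
                if p >= epsilon and p > recv.get(v, 0.0):
                    recv[v] = p   # keep the best probability seen so far
                    nxt.append(v)
        frontier = nxt
    return recv

graph = {"o": ["a"], "a": ["b"], "b": []}
concern = {("o", "a"): 0.5, ("a", "b"): 0.5}
reach = simulate_dissemination(graph, concern, "o", epsilon=0.3)
# b's probability would be 0.25 < 0.3, so the cascade terminates at a.
```

Because each hop multiplies by a factor no greater than 1, every branch eventually falls below ε and the loop terminates, mirroring the termination condition described above.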
  • Figure 4 is a simplified block diagram 400 illustrating an apparatus for identifying a vulnerable friend for privacy protection of an object user in the object user’s online social network according to an embodiment of the present disclosure, wherein the object user has a plurality of friends in the social network.
  • the apparatus 400 comprises an estimator 402 and an identifying element 404.
  • the estimator 402 is configured to, for at least some of the object user’s friends, estimate their contributions towards dissemination of the object user’s privacy information based on their behaviors in the social network.
  • the behaviors may include privacy-protection consciousness, privacy leaking tendency, attitude of worship towards the object user, user’s concern frequency towards the object user, etc.
  • the at least some of the object user's friends may include those friends whose contributions towards dissemination of the object user's privacy information the object user wishes to estimate.
  • the at least some of the object user’s friends may include all the object user’s friends.
  • the estimator 402 is further configured to determine an ultimate circle of disseminating (UCD) that includes those users who are involved in the dissemination of the privacy information.
  • the ultimate circle of disseminating may include those users who have disseminated the privacy information.
  • the estimator 402 can generate the UCD from the social network centered with the object user by removing those users who have not disseminated the privacy information.
  • the ultimate circle of disseminating may include those users whose respective receiving probabilities of the privacy information originating from the object user are below a threshold.
  • the estimator 402 can generate the UCD from the social network centered with the object user by removing those users whose respective receiving probabilities are below the threshold.
  • the ultimate circle of disseminating may include those users whose respective disseminating probabilities of the privacy information originating from the object user are below a threshold.
  • the estimator 402 can generate the UCD from the social network centered with the object user by removing those users whose respective disseminating probabilities are below the threshold.
  • the estimator 402 may perform the following actions to generate the UCD.
  • the estimator 402 may obtain from the directed graph a set of constrained routes each of which takes the object user as a source node, wherein each constrained route of the set of constrained routes contains a particular order of nodes without repetition.
  • a constrained route from the object user to a destination user may be a route without loop.
  • the constrained routes from the object user o to a destination user i in the graph 200 are o-ν_1-h-i, o-ν_1-v-i, o-ν_1-h-v-i and o-ν_1-v-h-i.
  • the estimator 402 can use any route-calculating algorithm to compute the constrained routes, and the present embodiment imposes no limitation on it.
  • the estimator 402 may calculate a receiving probability that a user will access the privacy information.
  • the estimator 402 may calculate a disseminating probability that a user will disseminate the privacy information if that user receives it.
  • the receiving probability may be calculated based on at least the disseminating probability (i.e., the first probability) of an upper-stream user in the disseminating path and the user's concern frequency towards the upper-stream user.
  • the disseminating probability may be calculated based on the user’s privacy-protection consciousness, privacy leaking tendency and attitude of worship towards the object user.
  • the receiving probability and disseminating probability may be calculated by using equations (1) and (2) respectively, considering the length of the longest route among the set of constrained routes.
  • the estimator 402 can use any other suitable approach to calculate them, and the present embodiment has no limitation on it.
  • the estimator 402 may repeatedly calculate a receiving probability and a disseminating probability for the users in a possible disseminating path until the calculated receiving probability is less than a first threshold.
  • the estimator 402 may generate an updated set of constrained routes by removing the constrained route from the set of constrained routes, and recalculate the receiving probability and the disseminating probability by using equations (1) and (2) and considering the length of the longest route among the updated set of constrained routes.
  • the first threshold can be defined differently in different contexts. For example, if the number of constrained routes in the set is very large and the apparatus 400 is implemented in a mobile phone, then the first threshold may be defined relatively large to speed up the calculation process and avoid overloading the mobile phone. By contrast, if the apparatus 400 is implemented in a server farm or cloud computing platform, then the first threshold can be defined relatively small to include as many users as possible and improve accuracy. In another embodiment, the first threshold can be determined through machine learning based on training or historical data. Further, the first threshold can be modified or updated after a period of time or when one or more predefined conditions are satisfied. In general, the first threshold is configured to balance computation efficiency against accuracy.
  • the estimator 402 may check a node in a constrained route in the set of constrained routes to determine whether the node’s receiving probability is below the first threshold. If yes, then the estimator 402 removes this constrained route from the set of constrained routes and checks a node in another constrained route in the set of constrained routes. If no, then the estimator 402 checks another unchecked node in this constrained route. The same process is done on all the constrained routes in the set of constrained routes.
  • After checking all the nodes in all the constrained routes in the set, if the estimator 402 has removed at least one route from the set of constrained routes, then it will generate an updated set of constrained routes and recalculate the receiving probability and the disseminating probability. The estimator 402 can iteratively perform the above actions until the receiving probability of every node in all constrained routes is equal to or above the first threshold.
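The pruning loop described above can be sketched as follows. Recomputing the probabilities between passes is stubbed out with a comment, and the static probability map and route values are hypothetical placeholders.

```python
def prune_routes(routes, recv_prob, epsilon):
    """Iteratively drop every constrained route containing a node whose
    receiving probability is below the first threshold, until all
    surviving nodes meet the threshold."""
    routes = list(routes)
    while True:
        kept = [r for r in routes
                if all(recv_prob[n] >= epsilon for n in r[1:])]  # skip source
        if len(kept) == len(routes):
            return routes        # steady state: nothing left to remove
        routes = kept
        # A full implementation would recompute recv_prob here from the
        # surviving routes before the next pass.

routes = [("o", "a", "b"), ("o", "a", "c")]
recv_prob = {"a": 0.5, "b": 0.05, "c": 0.4}   # hypothetical values
survivors = prune_routes(routes, recv_prob, epsilon=0.1)
# Only o-a-c survives: node b falls below the threshold.
```

The outer `while` models the re-execution of the iteration until every remaining node has a steady receiving probability at or above the threshold.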
  • the estimator 402 may construct the UCD from the set of constrained routes or the updated set of constrained routes.
  • the estimator 402 may use any of the graph-generating algorithms known in the art to construct the UCD, and the present embodiment imposes no limitation on it.
  • At steps 1-2, it extracts all constrained routes from the directed graph G_o centered on the object user o.
  • At steps 3-21, an iterative computation is carried out with an inner iteration. For the corresponding routes of each round, the receiving probability and disseminating probability are successively updated at steps 5-10. It then finds the nodes whose receiving probabilities are smaller than the first threshold ε and removes the involved routes from the route set. The outer iteration is re-executed until all involved nodes have a steady receiving probability no smaller than ε.
  • At step 22, it builds the subgraph G′_o (i.e., the UCD) from the remaining routes.
  • At step 23, it returns G′_o.
  • the estimator 402 may assess the impact of a direct friend of the object user on the overall dissemination of the privacy information in the UCD.
  • the direct friend is a friend having a link with the object user in the UCD.
  • the estimator 402 can assess the impact of the direct friend on the disseminating intensity of the UCD from at least the disseminating probability and media capacity of each user in the UCD.
  • the disseminating intensity may measure the intensity with which information originating from an object user propagates in the UCD.
  • the disseminating intensity may involve all the nodes' contributions to the propagation in the UCD.
  • the disseminating probability can be determined by using equation (2) .
  • the media capacity may determine how widely one user makes certain privacy information visible to others.
  • the media capacity may relate to the user’s topological status within the UCD he belongs to, which may be generally quantified by network centrality or any other appropriate measure.
  • the disseminating intensity of the UCD may be used to measure how widely and how deeply privacy information originating from the object user is propagated within the UCD.
  • the media capacity S_v of a user v may be based on his topological status in the UCD. It may be illustrated in the following equation:
  • h is the total number of nodes within the UCD containing user v, in which users a, b and v satisfy a ≠ v, b ≠ v and a ≠ b.
  • r_ab(v) is the routing ratio of the number of routes through user v to the total number of routes between users a and b. It is noted that the media capacity can also be calculated by using any other suitable method, and the present embodiment imposes no limitation on it.
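A small sketch of this routing-ratio measure, which resembles betweenness centrality. Counting all simple routes rather than only shortest ones is an assumption made for this sketch (the equation itself is not reproduced in this excerpt), and the chain graph is hypothetical.

```python
from itertools import permutations

def simple_paths(graph, s, t, path=None):
    """All loop-free paths from s to t in a directed graph."""
    path = (path or []) + [s]
    if s == t:
        yield path
        return
    for n in graph.get(s, []):
        if n not in path:
            yield from simple_paths(graph, n, t, path)

def media_capacity(graph, v):
    """S_v: sum of routing ratios r_ab(v) over ordered pairs (a, b) with
    a != v, b != v and a != b, as described above."""
    others = [n for n in graph if n != v]
    s_v = 0.0
    for a, b in permutations(others, 2):
        routes = list(simple_paths(graph, a, b))
        if routes:
            s_v += sum(1 for r in routes if v in r) / len(routes)
    return s_v

# Hypothetical chain a -> v -> b: every a-to-b route passes through v.
chain = {"a": ["v"], "v": ["b"], "b": []}
s_v = media_capacity(chain, "v")   # r_ab(v) = 1/1, so S_v = 1.0
```

A node that sits on every route between other users thus has maximal media capacity, matching the intuition that it controls how widely information becomes visible.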
  • the disseminating intensity of the UCD may be calculated by considering both the disseminating probability D_νi and the media capacity S_νi of each node ν_i in the UCD.
  • the disseminating intensity of the UCD centered on an object user o may be illustrated in the following equation:
  • the term (S_νi + 1) is intended to avoid a zero disseminating intensity when, for example, the nodes have zero media capacities. It is noted that the disseminating intensity can also be calculated by using any other suitable method, and the present embodiment imposes no limitation on it.
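A minimal sketch of the intensity computation. Summing the per-node products is an assumed form, since equation (4) is not reproduced in this excerpt; the (S_i + 1) factor plays the non-zero-guarding role the text describes, and the node values are hypothetical.

```python
def disseminating_intensity(nodes):
    """Combine each UCD node's disseminating probability D_i and media
    capacity S_i; (S_i + 1) keeps the intensity non-zero even when all
    media capacities are zero.  nodes: iterable of (D_i, S_i) pairs."""
    return sum(d * (s + 1.0) for d, s in nodes)

ucd_nodes = [(0.3, 2.0), (0.1, 0.0)]            # hypothetical (D_i, S_i)
intensity = disseminating_intensity(ucd_nodes)  # 0.3*3 + 0.1*1 = 1.0
```

Note that the second node still contributes despite its zero media capacity, which is exactly what the (S_i + 1) term is for.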
  • the estimator 402 may construct a second UCD by removing only one direct friend, such as ν_i, of the object user o from the UCD. Then the estimator 402 may calculate a second disseminating intensity of the second UCD by using equation (4).
  • the estimator 402 may assess the impact of the direct friend ν_i of the object user o on the overall dissemination of the privacy information in the UCD by the following equation:
  • in this way, the estimator 402 may assess the impact or disseminating contribution of each direct friend ν_i of the object user o on the overall dissemination of the privacy information in the UCD.
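The removal-based assessment above can be sketched directly. Normalising the intensity drop by the full intensity is an assumption (the equation referenced above is not reproduced in this excerpt), and in a full implementation the second UCD's probabilities and capacities would be recomputed after the removal; the static values here are hypothetical.

```python
def disseminating_intensity(nodes):
    """Assumed form of equation (4): sum of D_i * (S_i + 1) over UCD nodes."""
    return sum(d * (s + 1.0) for d, s in nodes)

def friend_impact(full_nodes, reduced_nodes):
    """Relative drop in disseminating intensity when one direct friend is
    removed from the UCD (assumed normalisation)."""
    full = disseminating_intensity(full_nodes)
    reduced = disseminating_intensity(reduced_nodes)
    return (full - reduced) / full

full_ucd   = [(0.3, 2.0), (0.1, 0.0), (0.2, 1.0)]  # hypothetical (D_i, S_i)
without_v1 = [(0.1, 0.0), (0.2, 1.0)]              # second UCD: v1 removed
impact = friend_impact(full_ucd, without_v1)
# full = 0.9 + 0.1 + 0.4 = 1.4; reduced = 0.5
```

Repeating this for each direct friend in turn yields one disseminating-contribution score per friend, which the identifying element then compares or ranks.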
  • the estimator 402 can provide it to the identifying element 404.
  • the identifying element 404 may be configured to identify the vulnerable friend based on the estimated contributions towards dissemination of the privacy information.
  • the identifying element 404 can compare the disseminating contribution of each direct friend of the object user with a second threshold to identify the vulnerable friend of the object user. If the disseminating contribution of a direct friend of the object user is above the second threshold, then this direct friend will be identified as a vulnerable friend. In another example, the identifying element 404 can rank the disseminating contributions of the direct friends of the object user in descending order, and then identify the top n direct friends as vulnerable friends.
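Both identification strategies above fit in a few lines; the friend names and contribution values below are hypothetical.

```python
def vulnerable_by_threshold(contributions, second_threshold):
    """Direct friends whose disseminating contribution exceeds the threshold."""
    return [f for f, c in contributions.items() if c > second_threshold]

def vulnerable_top_n(contributions, n):
    """Top-n direct friends ranked by contribution, highest first."""
    return sorted(contributions, key=contributions.get, reverse=True)[:n]

contribs = {"alice": 0.42, "bob": 0.07, "carol": 0.31}
flagged = vulnerable_by_threshold(contribs, second_threshold=0.2)
top_two = vulnerable_top_n(contribs, n=2)   # ["alice", "carol"]
```

The two strategies differ in guarantee: the threshold variant may flag zero or all friends depending on the scores, while the top-n variant always returns exactly n candidates.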
  • Figure 5 is a simplified block diagram illustrating an apparatus 500 according to another embodiment of the present disclosure. Components similar to those in Figure 4 are denoted with similar numbers. For brevity, the description of similar components is omitted here.
  • the apparatus 500 further comprises a storage device 512 configured to store the information regarding the vulnerable friend of the object user.
  • the information can contain the identifier of the vulnerable friend, his disseminating contribution or other suitable information.
  • the storage device 512 can be any kind of computer readable storage, such as a hard disk, CDROM, DVD, SSD, a phase change memory (PCM) , a random access memory (RAM) , a read-only memory (ROM) , an erasable programmable read-only memory (EPROM) or Flash memory.
  • the storage device 512 can store this information from the identifying element 404, for example if the object user is not currently online or if delivery of this information to the object user can be delayed. In this way, the apparatus 500 can perform the identifying process at any suitable time.
  • the apparatus 500 further comprises a provider 508 configured to provide the information regarding the vulnerable friend of the object user to the object user and/or one or more applications.
  • the information can contain the identifier of the vulnerable friend, his disseminating contribution or other suitable information.
  • the provider 508 can obtain this information from the identifying element 404 or from the storage device 512.
  • the providing may be a result of a pull or push action. For example, if the object user wishes to evaluate the vulnerability of his friends, he can request this information from the apparatus 500.
  • the provider 508 can provide this information by retrieving it from the storage device 512 if the identifying element 404 has stored it there, or can wait for the identifying element 404 to provide it if the identifying element 404 calculates it in real time.
  • the provider 508 may also push this information to the object user, for example when the information has been updated or when a predefined time is reached, etc.
  • the object user and/or one or more applications can for example set the privacy setting of the object user based on the information regarding the vulnerable friend of the object user.
  • the apparatus 500 further comprises a configurator 510 configured to configure the privacy setting of the object user based on the vulnerable friend of the object user.
  • the configurator 510 can unfriend the friend of the object user with the highest disseminating contribution, or set a stricter privacy setting for this friend, such as hiding some private information from this friend. It will be appreciated by those of ordinary skill in the art that there are other ways to configure the privacy setting of the object user based on the vulnerable friend of the object user.
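  • A minimal sketch of the configurator's two options described above; the returned action tags are hypothetical placeholders for platform-specific calls (unfriending, or hiding private posts from a friend):

```python
def configure_privacy(contributions, action="restrict"):
    """Pick the direct friend with the highest disseminating contribution
    and either unfriend him or tighten the per-friend privacy setting.
    The string tags returned here are illustrative placeholders."""
    worst = max(contributions, key=contributions.get)
    if action == "unfriend":
        return ("unfriend", worst)
    return ("hide_private_posts_from", worst)
```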
  • the apparatus 500 further comprises an updater (not shown) configured to update the information regarding the vulnerable friend of the object user when a predefined condition is satisfied.
  • the predefined condition may include reaching a predetermined time, expiry of a certain interval, a trigger by an object user or an application, a time when resources are abundant, a time when power cost is low, a large change in the social network, etc.
  • the updater may update the information regarding the vulnerable friend of the object user at midnight, since the power cost may be low and the computing resources may be abundant at that time. It will be appreciated by those of ordinary skill in the art that there are other ways to update the information regarding the vulnerable friend of the object user.
  • Figure 6 is a flow chart of a process 600 for identifying a vulnerable friend for privacy protection of an object user in the object user’s online social network according to an embodiment of the present disclosure, wherein the object user has a plurality of friends in the social network.
  • the process 600 can be performed by the apparatus 400 shown in Figure 4.
  • the process 600 starts at step 602.
  • at step 602, for at least some of the object user’s friends, the apparatus 400 may estimate their contributions towards dissemination of the object user’s privacy information based on their behaviors in the social network.
  • the behaviors may include privacy-protection consciousness, privacy leaking tendency, attitude of worship towards the object user, user’s concern frequency towards the object user, etc.
  • the at least some of the object user’s friends may include those friends whose contributions towards dissemination of the object user’s privacy information the object user wishes to estimate.
  • the at least some of the object user’s friends may include all the object user’s friends.
  • the step 602 may include a step of determining an ultimate circle of disseminating (UCD) that includes those users who are involved in the dissemination of the privacy information.
  • the step of determining the UCD may include a step of calculating a first probability that a user will disseminate the privacy information if that user receives the information; and a step of calculating a second probability that the user will access the privacy information.
  • the step of calculating a first probability comprises calculating the first probability based on the user’s privacy-protection consciousness, privacy leaking tendency and attitude of worship towards the object user.
  • the step of calculating a second probability comprises: calculating the second probability of the user based on at least the first probability of an upper-stream user in the disseminating path and that user’s concern frequency towards the upper-stream user.
  • the step of determining the UCD comprises: repeating the steps of calculating a second probability and calculating a first probability for users in a possible disseminating path until the calculated second probability is less than a first threshold.
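  • The iterative UCD determination described in the steps above can be sketched as a breadth-first expansion. Here `receive_prob` and `disseminate_prob` are assumed callables standing in for the second- and first-probability calculations (equations (1)-(3)); the stopping rule prunes a path once the second probability falls below the first threshold, and a visited set keeps paths loop-free:

```python
from collections import deque

def determine_ucd(object_user, followers, receive_prob, disseminate_prob,
                  first_threshold):
    """Expand outward from the object user: a follower v of a disseminating
    user u joins the UCD if its receiving probability (second probability)
    stays at or above the first threshold; expansion along a path stops
    once that probability drops below the threshold."""
    ucd = {object_user}
    queue = deque([(object_user, 1.0)])  # (user, disseminating probability)
    while queue:
        u, d_u = queue.popleft()
        for v in followers.get(u, []):
            if v in ucd:
                continue  # keep disseminating paths loop-free
            r_v = receive_prob(u, v, d_u)  # second probability of v
            if r_v < first_threshold:
                continue  # path pruned: v is unlikely to access the info
            ucd.add(v)
            queue.append((v, disseminate_prob(v) * r_v))
    return ucd
```

With a toy network where each hop halves the probability and a threshold of 0.3, only the first ring of followers ends up in the UCD.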
  • the step 602 may include a step of assessing impact of a direct friend of the object user on the overall dissemination of the privacy information in the UCD.
  • This aspect has been described above with other embodiments. For brevity, the description of this aspect is omitted here.
  • the step of assessing impact comprises: assessing impact of the direct friend on disseminating intensity of the UCD, which is determined from at least the first probability and media capacity of each user in the UCD.
  • at step 604, the apparatus 400 may identify the vulnerable friend based on the estimated contributions towards dissemination of the privacy information.
  • the apparatus 400 can compare the disseminating contribution of each direct friend of the object user with a second threshold to identify the vulnerable friend of the object user. If the disseminating contribution of a direct friend of the object user is above the second threshold, then this direct friend will be identified as a vulnerable friend. In another example, the apparatus 400 can rank the disseminating contributions of the direct friends of the object user in descending order, and then identify the top n direct friends as vulnerable friends.
  • the process 600 can include a storing step for storing the information regarding the vulnerable friend of the object user.
  • the information can contain the identifier of the vulnerable friend, his disseminating contribution or other suitable information.
  • the storing step may store the information regarding the vulnerable friend of the object user, for example if the object user is not currently online or if delivery of this information to the object user can be delayed. In this way, the process 600 can be performed at any suitable time.
  • the process 600 can include a providing step for providing the information regarding the vulnerable friend of the object user to the object user and/or one or more applications.
  • the information can contain the identifier of the vulnerable friend, his disseminating contribution or other suitable information.
  • this information can be obtained from the step 604 or from a storage such as the storage device 512 shown in Figure 5.
  • the providing may be a result of a pull or push action. For example, if the object user wishes to evaluate the vulnerability of his friends, he can request this information.
  • this information can be provided by retrieving it from the storage if it has been stored in the storage device 512, or by receiving it from step 604 if this information is calculated in real time.
  • this information may also be pushed to the object user, for example when the information has been updated or when a predefined time is reached, etc.
  • the object user and/or one or more applications can for example set the privacy setting of the object user based on the information regarding the vulnerable friend of the object user.
  • the process 600 can include a configuring step for configuring the privacy setting of the object user based on the vulnerable friend of the object user.
  • the apparatus 500 can unfriend the friend of the object user with the highest disseminating contribution, or set a stricter privacy setting for this friend, such as hiding some private information from this friend. It will be appreciated by those of ordinary skill in the art that there are other ways to configure the privacy setting of the object user based on the one or more vulnerable friends of the object user.
  • the process 600 can include an updating step for updating the information regarding the vulnerable friend of the object user in the social network when a predefined condition is satisfied.
  • the predefined condition may include reaching a predetermined time, expiry of a certain interval, a trigger by an object user or an application, a time when resources are abundant, a time when power cost is low, a large change in the social network, etc.
  • the apparatus 500 may update the information regarding the vulnerable friend of the object user in the social network at midnight, since the power cost may be low and the computing resources may be abundant at that time. It will be appreciated by those of ordinary skill in the art that there are other ways to update the one or more vulnerable friends of the object user in the social network.
  • an apparatus for identifying a vulnerable friend for privacy protection of an object user in the object user’s online social network, wherein the object user has a plurality of friends in the social network.
  • Said apparatus comprises means configured to carry out the methods or processes described above.
  • the apparatus comprises means configured to, for at least some of the object user’s friends, estimate their contributions towards dissemination of the object user’s privacy information based on their behaviors in the social network and means configured to identify the vulnerable friend based on the estimated contributions towards dissemination of the privacy information.
  • the apparatus can further comprise means configured to determine an ultimate circle of disseminating (UCD) that includes those users who are involved in the dissemination of the privacy information; and means configured to assess the impact of a direct friend of the object user on the overall dissemination of the privacy information in the UCD.
  • the apparatus further comprises means configured to calculate a first probability that a user will disseminate the privacy information if that user receives the information; and means configured to calculate a second probability that the user will access the privacy information.
  • the apparatus further comprises means configured to calculate the first probability based on the user’s privacy-protection consciousness, privacy leaking tendency and attitude of worship towards the object user.
  • the apparatus further comprises means configured to calculate the second probability of the user based on at least the first probability of an upper-stream user in the disseminating path and that user’s concern frequency towards the upper-stream user.
  • the apparatus further comprises means configured to repeat the steps of calculating a second probability and calculating a first probability for users in a possible disseminating path until the calculated second probability is less than a first threshold.
  • the apparatus further comprises means configured to assess the impact of the direct friend on the disseminating intensity of the UCD, which is determined from at least the first probability and media capacity of each user in the UCD.
  • the apparatus further comprises means configured to store the information regarding the vulnerable friend.
  • the apparatus further comprises means configured to update the information regarding the vulnerable friend when a predefined condition is satisfied.
  • the apparatus further comprises means configured to provide the information regarding the vulnerable friend to the object user and/or one or more applications.
  • the apparatus further comprises means configured to configure the privacy setting of the object user based on the vulnerable friend.
  • any of the components of the apparatus 400, 500 depicted in Figures 4-5 can be implemented as hardware or software modules.
  • in the case of software modules, they can be embodied on a tangible computer-readable recordable storage medium. All of the software modules (or any subset thereof) can be on the same medium, or each can be on a different medium, for example.
  • the software modules can run, for example, on a hardware processor. The method steps can then be carried out using the distinct software modules, as described above, executing on a hardware processor.
  • an aspect of the disclosure can make use of software running on a general purpose computer or workstation.
  • such an implementation might employ, for example, a processor, a memory, and an input/output interface formed, for example, by a display and a keyboard.
  • the term “processor” as used herein is intended to include any processing device, such as, for example, one that includes a CPU (central processing unit) and/or other forms of processing circuitry. Further, the term “processor” may refer to more than one individual processor.
  • the term “memory” as used herein is intended to include memory associated with a processor or CPU, such as, for example, RAM (random access memory), ROM (read only memory), a fixed memory device (for example, a hard drive), a removable memory device (for example, a diskette), a flash memory and the like.
  • the processor, memory, and input/output interface such as display and keyboard can be interconnected, for example, via a bus as part of a data processing unit. Suitable interconnections, for example via a bus, can also be provided to a network interface, such as a network card, which can be provided to interface with a computer network, and to a media interface, such as a diskette or CD-ROM drive, which can be provided to interface with media.
  • computer software including instructions or code for performing the methodologies of the disclosure, as described herein, may be stored in associated memory devices (for example, ROM, fixed or removable memory) and, when ready to be utilized, loaded in part or in whole (for example, into RAM) and implemented by a CPU.
  • Such software could include, but is not limited to, firmware, resident software, microcode, and the like.
  • aspects of the disclosure may take the form of a computer program product embodied in a computer readable medium having computer readable program code embodied thereon.
  • the computer readable medium may be a computer readable signal medium or a computer readable storage medium.
  • a computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing.
  • a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
  • Computer program code for carrying out operations for aspects of the disclosure may be written in any combination of at least one programming language, including an object oriented programming language such as Java, Smalltalk, C++ or the like and conventional procedural programming languages, such as the "C" programming language or similar programming languages.
  • the program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer or entirely on the remote computer or server.
  • each block in the flowchart or block diagrams may represent a module, component, segment, or portion of code, which comprises at least one executable instruction for implementing the specified logical function (s) .
  • the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved.

Abstract

Method, apparatus, system, computer program product and computer readable medium are disclosed for identifying a vulnerable friend for privacy protection of an object user in the object user's online social network, wherein the object user has a plurality of friends in the social network. The method comprises: for at least some of the object user's friends, estimating their contributions towards dissemination of the object user's privacy information based on their behaviors in the social network; and identifying the vulnerable friend based on the estimated contributions towards dissemination of the privacy information.

Description

METHOD, APPARATUS AND COMPUTER PROGRAM PRODUCT FOR IDENTIFYING A VULNERABLE FRIEND FOR PRIVACY PROTECTION IN A SOCIAL NETWORK
Field of the Invention
Embodiments of the disclosure generally relate to data processing, and, more particularly, to privacy protection in online social network.
Background
The fast growth of online social network applications has dramatically changed people's daily life. Users of the social networks are immersed in their roles as information producers and/or propagation pushers. Every user can be deemed as a potential propagation pusher in a social network. Therefore, a lot of user generated content (UGC) , such as public information and/or private information, may be fast propagated in the social network.
The private information of a user may include information about travel arrangement, luxury consumption, illness record or drunk driving record. The user generally does not wish the private information to be disseminated to other people. Accordingly, privacy protection is one of the major concerns in face of the fast information propagation in social networks.
In general, a user’s privacy can be protected through the privacy settings in a social network application. However, many users configure their privacy settings based only on their experience. This may lead to inadequate privacy protection. For example, a user may not know precisely whom he can trust and which friends are more vulnerable or contribute more toward dissemination of privacy information in his social network. Additionally, the existing social network applications or platforms usually do not provide suggestions or recommendations regarding privacy settings for a user. As a result, privacy information is posted every day in social networks with the risk of privacy leakage, while users are unaware that their privacy information is divulged unconsciously. Therefore, it is desirable to provide an improved technical solution for privacy protection.
Summary
This summary is provided to introduce a selection of concepts in a simplified form that are further described below in the detailed description. This summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used to limit the scope of the claimed subject matter.
According to one aspect of the disclosure, it is provided a method for identifying a vulnerable friend for privacy protection of an object user in the object user’s online social network, wherein the object user has a plurality of friends in the social network. Said method comprises: for at least some of the object user’s friends, estimating their contributions towards dissemination of the object user’s privacy information based on their behaviors in the social network; and identifying the vulnerable friend based on the estimated contributions towards dissemination of the privacy information.
According to another aspect of the present disclosure, it is provided an apparatus comprising means configured to carry out the above-described method.
According to another aspect of the present disclosure, it is provided a computer program product embodied on a distribution medium readable by a computer and comprising program instructions which, when loaded into a computer, execute the above-described method.
According to still another aspect of the present disclosure, it is provided a non-transitory computer readable medium having encoded thereon statements and instructions to cause a processor to execute the above-described method.
According to still another aspect of the present disclosure, it is provided an apparatus for identifying a vulnerable friend for privacy protection of an object user in the object user’s online social network, wherein the object user has a plurality of friends in the social network. Said apparatus comprises: an estimator configured to, for at least some of the object user’s friends, estimate their contributions towards dissemination of the object user’s privacy information based on their behaviors in the social network; and an identifying element configured to identify the vulnerable friend based on the estimated contributions towards dissemination of the privacy information.
These and other objects, features and advantages of the disclosure will become apparent from the following detailed description of illustrative embodiments thereof, which is to be read in connection with the accompanying drawings.
Brief Description of the Drawings
Figure 1 shows a schematic operating environment that can utilize some embodiments of the present disclosure;
Figure 2 shows a schematic diagram of a directed graph;
Figure 3 is a schematic diagram depicting a process of privacy information dissemination within privacy-receiving-disseminating (PRD) model according to an embodiment of the present disclosure;
Figure 4 is a simplified block diagram illustrating an apparatus according to an embodiment of the present disclosure;
Figure 5 is a simplified block diagram illustrating an apparatus according to another embodiment of the present disclosure; and
Figure 6 is a flow chart of a process for identifying a vulnerable friend of an object user in a social network according to an embodiment of the present disclosure.
Detailed Description
For the purpose of explanation, details are set forth in the following description in order to provide a thorough understanding of the embodiments disclosed. It is apparent, however, to those skilled in the art that the embodiments may be implemented without these specific details or with an equivalent arrangement.
As described herein, an aspect of the disclosure includes providing a technical solution for identifying one or more vulnerable friends of an object user in a social network. Figure 1 shows a schematic operating environment 100 in which some embodiments of the present disclosure can be implemented.
As shown in Figure 1, the operating environment 100 may comprise one or more social network platforms or applications 111-11n, each operably connected to an identifying apparatus 110 through one or more networks. The social network platforms or applications 111-11n can be any kind of social network platform or application capable of running on any type of computing device, such as a cloud computer, distributed computing system, virtual computer, smart phone, tablet, laptop, server, thin client, set-top box or PC. The social network platforms or applications 111-11n may include, but are not limited to, LinkedIn, Facebook, Twitter, YouTube, WeChat, QQ space and WEIBO. The social network platforms or applications 111-11n can have a server-client architecture, a distributed architecture, a peer-to-peer architecture or another appropriate architecture. In general, the social network platforms or applications 111-11n may maintain a social network graph containing all the users in the social network platforms or applications, and/or respective social network graphs centered on each user in the social network, or other suitable social network graphs. The social network platforms or applications 111-11n may also store users’ interaction information and other useful information, such as a user profile and his privacy settings or the like.
The operating environment 100 may also comprise an identifying apparatus 110, which can be implemented in the form of hardware, software or their combination, including but not limited to, a cloud computer, distributed computing system, virtual computer, smart phone, tablet, laptop, server, thin client, set-top box or PC. The identifying apparatus 110 may run any kind of operating system including, but not limited to, Windows, Linux, UNIX, Android, iOS and their variants. It is noted that although one identifying apparatus is shown in Figure 1, the operating environment 100 may comprise components that are physically separated but operably working together. For example, the identifying apparatus may be implemented as a distributed system.
The operating environment 100 may comprise a network 108, such as any wired or wireless network or their combination, including, but not limited to, a wireless cellular telephone network (such as the global system for mobile communications (GSM) network, 3rd generation (3G) network, 3.5th generation (3.5G) network, 4th generation (4G) network, universal mobile telecommunications system (UMTS), code division multiple access (CDMA) network, etc.), a wireless local area network (WLAN) such as defined by any of the Institute of Electrical and Electronic Engineers (IEEE) 802.x standards, an Ethernet local area network, a token ring local area network, a wide area network, and the Internet.
It is noted that the network 108 may include one or more communication devices for relaying or routing the information to be exchanged among the identifying apparatus 110 and the one or more social network platforms or applications 111-11n. Alternatively, the identifying apparatus 110 and the one or more social network platforms or applications 111-11n may exchange information directly through communication media such as wireline media or wireless media.
It is noted that the identifying apparatus 110 may be integrated with each of the one or more social network platforms or applications 111-11n or as a separated apparatus serving the one or more social network platforms or applications 111-11n or any combination thereof.
Further, both the identifying apparatus 110 and the social network platforms or applications 111-11n may be capable of operating a connectivity program. The connectivity program may allow the identifying apparatus 110 and the social network platforms or applications 111-11n to transmit and receive web content, such as user privacy information, according to a protocol, such as Wireless Application Protocol (WAP) , Hyper Text Transfer Protocol (HTTP) , Hyper Text Transfer Protocol over Secure Socket Layer (HTTPS) , Transmission Control Protocol/Internet Protocol (TCP/IP) and/or User Datagram Protocol (UDP) and/or the like.
In general, users in a social network can form one or more social network graphs depending on their relationships. A user may be represented as a node in a social network graph. The terms “user” and “node” are often used interchangeably in the present disclosure. The social relationship or social tie between a user and one of his followers or friends may be represented as a link in the graph. A schematic graph 200 of a social network centered on the object user “o” is shown in Figure 2.
As shown in Figure 2, a friend-network centered on an object user o can be abstracted as a directed graph Go = (Uo, Eo). Eo is a set of directed edges e<u, v> ∈ Eo from u to his follower v. Uo is the set of nodes contained in the friend-network. An arrow line represents a following relationship between two users. For example, users η1, η2, η3 and η4 are four friends of the object user o. The friends of a user mean that the user and his friends have a direct connection in the social network graph 200, namely they can directly exchange information without an intermediate user. For example, user f’s friends are users j and e in the graph 200. One or more routes can be found between any two users in a social network graph. For example, the route originating from a user such as user o to another user such as user h may be one of a number of paths, such as o-η1-h, o-η1-v-h, or o-η1-v-i-h in the graph 200. Therefore, the information from the user o may reach the user h through different paths. Additionally, there may be some loops in the social network graph, such as the loops h-i-v-h and h-i-v-η1-h in the graph 200. Thus, the route originating from a user such as user o to another user such as user i may be a loop route, such as the path o-η1-h-v-η1-h-i.
The social network graph 200 and other information, such as the interaction information and the UGC, used by the identifying apparatus 110 can be stored in a centralized or distributed database, such as, RDBMS, SQL, NoSQL, or as one or more files on any storage medium, such as, RAM, HDD, diskette, CD, DVD, Blue-ray Disc, EEPROM, SSD.
In general, the information posted by a user may be classified into two kinds, namely public information and privacy information. While public information may be shared with every follower, privacy information should be accessible only to the user himself or to certain particular followers. Information dissemination in a social network often proceeds as follows. An object user posts content such as privacy information; then a follower (friend) of the object user, if allowed, may obtain this content by accessing the object user’s posts or by using an information push service. Then the friend may forward or post this information to his followers (friends), and so on.
Figure 3 is a schematic diagram 300 depicting a process of privacy information dissemination within the privacy-receiving-disseminating (PRD) model according to an embodiment of the present disclosure. The PRD model is a discrete-time dissemination model based on the classical cascade model. Its parameters are of definite practical significance, so that it can well imitate the real process of information propagation in a social network. In the PRD model, each node is associated with two correlative probabilities, namely the receiving probability (i.e., the second probability) and the disseminating or diffusing probability (i.e., the first probability), which impact the dissemination process of privacy information mo posted by an object user “o”.
In an example embodiment, the receiving probability of a user may be calculated based on at least the disseminating probability of an upper-stream user in the disseminating path and that user’s concern frequency towards the upper-stream user. The disseminating path may be a loop-free path originating from the object user and passing through the user.
The concern frequency αuv of user v towards his friend u reflects how often user v comments on u’s posts, views u’s personal page, and so on. It can be readily derived from the frequency of interactive behaviors between them. In an example, the concern frequency αuv is given by the following equation:
αuv = |muv| / |mu|
where |mu| is the total number of u’s posts, and |muv| is the number of u’s posts which user v has commented on, forwarded or the like. It is noted that the concern frequency αuv may also be calculated by using any other suitable method, and the present embodiment has no limitation on it.
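As a concrete illustration, the ratio above can be computed directly from interaction counts. The following Python sketch is illustrative only; the function and argument names are not part of the disclosure:

```python
def concern_frequency(num_posts_u, num_posts_u_engaged_by_v):
    """alpha_uv: fraction of friend u's posts that user v has
    commented on, forwarded, or otherwise engaged with."""
    if num_posts_u == 0:
        return 0.0  # no posts to engage with
    return num_posts_u_engaged_by_v / num_posts_u
```

For example, a user who interacted with 3 of a friend’s 10 posts has a concern frequency of 0.3.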
In an example embodiment, the receiving probability that user v obtains mo from his friend u is given by the following equation:

Ruv (tn) = Du (tn-1) · αuv     (1)

where the disseminating probability Du (tn-1) of u indicates the possibility that u has forwarded mo before, which will be described in detail below, and tn-1 is a discrete time instant. It is noted that the receiving probability Ruv (tn) can also be calculated by using any other suitable method, and the present embodiment has no limitation on it.
Since user v can obtain mo from more than one of his friends, the more users v follows, the more likely he is to obtain mo. The receiving probability of user v may be asynchronously updated during a certain period. Let tvu be the time at which user v learns mo from his friend u, and tn be the latest among these times. The total receiving probability of user v may be given by the following equation:

Rv (tn) = min { 1, Σu Ruv (tvu) }

where the summation runs over every friend u whom v follows (i.e., every u with a directed edge e<u, v>∈Eo) , and the min operator prevents the probability value from exceeding 1. It is noted that the total receiving probability Rv (tn) can also be calculated by using any other suitable method, and the present embodiment has no limitation on it.
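A minimal sketch of these two quantities, assuming the per-friend receiving probability is the product of the upstream disseminating probability and the concern frequency, and that per-friend contributions are summed and capped at 1 (names are illustrative):

```python
def receiving_probability(d_u_prev, alpha_uv):
    """R_uv(t_n): probability that v obtains m_o from friend u,
    assumed to be the product of u's disseminating probability at
    the previous time step and v's concern frequency towards u."""
    return d_u_prev * alpha_uv

def total_receiving_probability(per_friend_probabilities):
    """R_v(t_n): aggregate over all friends v follows, capped at 1."""
    return min(1.0, sum(per_friend_probabilities))
```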
In an example embodiment, the disseminating probability of a user may be calculated based at least partly on the user’s privacy-protection consciousness, privacy leaking tendency and attitude of worship towards the object user.
(1) Privacy-Protection Consciousness
The accessibility of personal profiles, particularly those items of privacy information hidden by most users, can incur varying degrees of risk for privacy protection. A user who does not possess enough consciousness to protect himself is far less likely to safeguard the privacy of other users. The privacy-protection consciousness of user v, denoted Iv, may be quantified by the relative non-accessibility of personal profiles in v’s privacy setting. It may be given by the following equation:
Iv = 1 − ( Σi=1..n wi av, i ) / ( Σi=1..n wi )
where n is the total number of personal profile items provided by the privacy setting of a social network application. wi is the weight of the i-th personal profile item, which reflects its relative sensitivity and is defined as the percentage of users who set the i-th personal profile item non-accessible among all users. av, i is the indicator of the i-th personal profile item for v: av, i=1 if v makes the i-th personal profile item accessible to any other users, otherwise av, i=0. Iv∈ [0, 1] , where Iv=1 indicates that v has cautiously adjusted his privacy setting so that nothing is accessible at all, and Iv=0 indicates that v exposes all his personal profiles to the public. It is noted that the privacy-protection consciousness Iv of user v can also be calculated by using any other suitable method, and the present embodiment has no limitation on it.
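A sketch of Iv under the assumption that the exposed weight is normalized by the total weight (an assumption, but one consistent with the stated boundary values Iv=1 and Iv=0):

```python
def privacy_consciousness(weights, accessible):
    """I_v in [0, 1]: 1 when no profile field is publicly accessible,
    0 when all are.  Assumes exposed weight normalized by total weight.
    weights[i]    -- sensitivity weight w_i of the i-th profile field
    accessible[i] -- 1 if v makes the i-th field accessible, else 0"""
    total = sum(weights)
    if total == 0:
        return 1.0  # nothing to expose
    exposed = sum(w for w, a in zip(weights, accessible) if a)
    return 1.0 - exposed / total
```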
(2) Privacy Leaking Tendency
The degree of privacy leaking tendency determines the extent to which a user disseminates certain privacy information. Since behavior tendency is an inherent personality trait, it can be assessed from the abundant historical records of online behaviors. In an example, the leaking tendency towards privacy information is supposed to impose the same influence on both posting and forwarding behaviors. Thus, the privacy leaking tendency Lv of user v may be estimated as the average leakage probability of privacy information that user v has ever posted himself or forwarded from others. It may be given by the following equation:
Lv = |mv, p| / |mv|
where |mv| is the total number of user v’s posts, and |mv, p| is the number of user v’s posts which contain privacy information. It is noted that the privacy leaking tendency Lv of user v can also be calculated by using any other suitable method, and the present embodiment has no limitation on it.
(3) Attitude of worship towards the object user
The attitude of worship towards the object user, such as user o, can be reflected by o's authority Ao. The authority Ao may be determined from |DFo|, the number of o's followers, |Fo|, the number of users whom o follows, and |mo|, the total number of o’s posts. It is noted that the authority Ao can also be calculated by using any other suitable method, and the present embodiment has no limitation on it.
In an example, the disseminating probability Dv (tn) that user v forwards privacy information mo of an object user o is given by equation (2) . It may be determined from the total receiving probability Rv (tn) of user v, v’s privacy-protection consciousness Iv and privacy leaking tendency Lv, the authority Ao of the object user, and dov, the diameter (i.e. the topological distance of the shortest route) between the object user o and user v; tn is the latest of a series of discrete time instants. It is noted that the disseminating probability can also be calculated by using any other suitable method, and the present embodiment has no limitation on it.
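One plausible composition of these factors, purely as an illustrative sketch: forwarding requires receiving (factor Rv) , is damped by privacy-protection consciousness, grows with leaking tendency and authority, and the authority effect is assumed here to decay with the distance dov. The filing’s exact equation (2) may differ:

```python
def disseminating_probability(r_v, consciousness, leak_tendency,
                              authority, distance):
    """D_v(t_n) sketch: a user forwards m_o only if he has received it
    (factor r_v), is less likely to forward the more privacy-conscious
    he is (1 - consciousness), more likely the higher his leaking
    tendency and the object user's authority; the authority effect is
    assumed to decay with the shortest-route distance d_ov."""
    return r_v * (1.0 - consciousness) * leak_tendency * authority / max(distance, 1)
```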
Now we describe the process of privacy information dissemination within the privacy-receiving-disseminating (PRD) model with reference to Figure 3. Time t0 is the initial time when an object user o posts the privacy information mo. t1, t2, ... are a series of discrete times that represent the successive rounds of dissemination from node to node. At time t1, each of the object user o's friends ηi∈DFo may learn mo from, for example, o's posts with a receiving probability, where DFo is the set of the object user o's friends who have a directed edge with o. If he has received mo, he will forward or post it with a certain disseminating probability, propelling the propagation of mo. The same holds for each of user ηi's friends v∈DFηi at time t2, and so on. The receiving probabilities and disseminating probabilities of nodes may be asynchronously updated as mo arrives along disparate routes at different times. The dissemination process continues until the receiving probability of each terminal node θi of the different routes falls below a first threshold ε at its respective time tθi, i.e., each terminal node θi satisfies Rθi (tθi) < ε. In an example, the first threshold ε is small enough that each terminal node θi is scarcely able to learn mo at time tθi.
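The round-by-round process can be sketched as a small simulation. Everything below is illustrative: it assumes the product form for per-friend receiving probabilities, asynchronous accumulation capped at 1, and a caller-supplied rule for turning a receiving probability into a disseminating one:

```python
def simulate_prd(followers, alpha, source, forward_prob, eps=0.01, max_rounds=20):
    """Discrete-time PRD rounds (sketch; names are illustrative).
    followers: dict user -> list of followers (directed edges u -> v)
    alpha:     dict (u, v) -> concern frequency of v towards u
    forward_prob(v, r): turns v's receiving probability into a
                        disseminating probability (model-dependent)
    Propagation stops once no node's receiving probability reaches eps."""
    D = {source: 1.0}   # the object user posts m_o at t0
    R = {}
    frontier = [source]
    for _ in range(max_rounds):
        nxt = []
        for u in frontier:
            for v in followers.get(u, []):
                # asynchronous update: accumulate contributions, capped at 1
                r = min(1.0, R.get(v, 0.0) + D[u] * alpha.get((u, v), 0.0))
                if r >= eps and r > R.get(v, 0.0):
                    R[v] = r
                    D[v] = forward_prob(v, r)
                    nxt.append(v)
        if not nxt:
            break
        frontier = nxt
    return R
```

On the chain o → a → b with concern frequencies 0.5 and 0.4 and a forwarding rule of half the receiving probability, a receives mo with probability 0.5 and b with probability 0.25 · 0.4 = 0.1.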
Figure 4 is a simplified block diagram 400 illustrating an apparatus for identifying a vulnerable friend for privacy protection of an object user in the object user’s online social network according to an embodiment of the present disclosure, wherein the object user has a plurality of friends in the social network. The apparatus 400 comprises an estimator 402 and an identifying element 404.
According to the embodiment, the estimator 402 is configured to, for at least some of the object user’s friends, estimate their contributions towards dissemination of the object user’s privacy information based on their behaviors in the social network. The behaviors may include privacy-protection consciousness, privacy leaking tendency, attitude of worship towards the object user, concern frequency towards the object user, etc. The at least some of the object user’s friends may include those friends whose contributions towards dissemination of the object user’s privacy information the object user wishes to estimate. In an example, the at least some of the object user’s friends may include all of the object user’s friends.
In this embodiment, the estimator 402 is further configured to determine an ultimate circle of disseminating (UCD) that includes those users who are involved in dissemination of the privacy information.
In an example, the ultimate circle of disseminating (UCD) may include those users who have disseminated the privacy information. In this case, the estimator 402 can generate the UCD from the social network centered with the object user by removing those users who have not disseminated the privacy information.
In another example, the ultimate circle of disseminating (UCD) may include those users whose respective receiving probabilities of the privacy information originating from the object user are at or above a threshold. In this case, the estimator 402 can generate the UCD from the social network centered on the object user by removing those users whose respective receiving probabilities are below the threshold.
In another example, the ultimate circle of disseminating (UCD) may include those users whose respective disseminating probabilities of the privacy information originating from the object user are at or above a threshold. In this case, the estimator 402 can generate the UCD from the social network centered on the object user by removing those users whose respective disseminating probabilities are below the threshold.
In an example where the social network is represented by a directed graph, in which a node represents a user and an edge represents a following relationship  between two users, the estimator 402 may perform the following actions to generate the UCD.
According to this example, the estimator 402 may obtain from the directed graph a set of constrained routes, each of which takes the object user as its source node, wherein each constrained route of the set contains a particular order of nodes without repetition. A constrained route from the object user to a destination user is thus a route without a loop. For example, with reference to Figure 2, the constrained routes from the object user o to a destination user i are o-η1-h-i, o-η1-v-i, o-η1-h-v-i and o-η1-v-h-i. By contrast, routes such as o-η1-h-v-η1-h-i and o-η1-v-h-η1-v-i are not constrained routes because they contain loops. More generally, the constrained routes can be represented as Γ′o= {γ: γ∈Γo, vi∈γ, vj∈γ, i≠j, vi≠vj} , where Γo= {γ: γ= (o, b1, b2, …) , e<bi, bi+1>∈Eo, b0=o} , i.e., the disparate routes of graph Go that take the object user o as a source node. The estimator 402 can use any route-calculating algorithm to compute the constrained routes, and the present embodiment has no limitation on it.
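The constrained routes can be enumerated with a straightforward depth-first search that forbids node repetition. The following sketch (illustrative names) reproduces the Figure 2 example:

```python
def constrained_routes(edges, source):
    """All loop-free directed routes starting from `source`.
    edges: dict node -> list of followers (edge e<u, v> from u to v)."""
    routes = []
    def dfs(path):
        for nxt in edges.get(path[-1], []):
            if nxt not in path:          # no repeated node: rules out loops
                routes.append(path + [nxt])
                dfs(path + [nxt])
    dfs([source])
    return routes
```

With the Figure 2 fragment edges = {'o': ['n1'], 'n1': ['h', 'v'], 'h': ['i', 'v'], 'v': ['i', 'h'], 'i': ['v']}, exactly the four constrained routes to user i listed above are found.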
After obtaining the set of constrained routes, the estimator 402 may calculate a receiving probability that a user will access the privacy information, and a disseminating probability that the user will disseminate the privacy information if that user receives it. The receiving probability may be calculated based on at least the disseminating probability of an upper-stream user in the disseminating path and that user’s concern frequency towards the upper-stream user. The disseminating probability may be calculated based on the user’s privacy-protection consciousness, privacy leaking tendency and attitude of worship towards the object user. In an example, the receiving probability and the disseminating probability may be calculated by using equations (1) and (2) respectively, considering the length of the longest route among the set of constrained routes. In other embodiments, the estimator 402 can use any other suitable approach to calculate them, and the present embodiment has no limitation on it.
In an example, the estimator 402 may repeatedly calculate a receiving probability and a disseminating probability for the users in a possible disseminating path until the calculated receiving probability is less than a first threshold.
For example, after calculating the receiving probability and the disseminating probability, in response to a constrained route containing at least one node, other than its last node, whose receiving probability is below a first threshold, the estimator 402 may generate an updated set of constrained routes by removing that constrained route from the set of constrained routes, and recalculate the receiving probability and the disseminating probability by using equations (1) and (2) and considering the length of the longest route among the updated set of constrained routes.
The first threshold can be defined differently in different contexts. For example, if the number of constrained routes in the set is very large and the apparatus 400 is implemented in a mobile phone, the first threshold may be set relatively high to speed up the calculation process and avoid overloading the mobile phone. By contrast, if the apparatus 400 is implemented in a server farm or cloud computing platform, the first threshold can be set relatively low to include as many users as possible and improve accuracy. In another embodiment, the first threshold can be determined through machine learning based on training or historical data. Further, the first threshold can be modified or updated after a period of time or when one or more predefined conditions are satisfied. In general, the first threshold is configured to balance computation efficiency against accuracy.
In an example, the estimator 402 may check a node in a constrained route in the set of constrained routes to determine whether the node’s receiving probability is below the first threshold. If so, the estimator 402 removes this constrained route from the set and checks a node in another constrained route. If not, the estimator 402 checks another unchecked node in this constrained route. The same process is applied to all the constrained routes in the set. After checking all the nodes in all the constrained routes, if the estimator 402 has removed at least one route from the set, it will generate an updated set of constrained routes and recalculate the receiving probability and the disseminating probability. The estimator 402 can iteratively perform the above actions until the receiving probability of every node in every constrained route is equal to or above the first threshold.
If none of the constrained routes in the set of constrained routes, or in the updated set of constrained routes, contains a node other than its last node whose receiving probability is below the first threshold, the estimator 402 may construct the UCD from the set of constrained routes or the updated set of constrained routes. The estimator 402 may use any of the graph-generating algorithms known in the art to construct the UCD, and the present embodiment has no limitation on it.
The detailed process of constructing the UCD according to an embodiment is described in algorithm 1.
Algorithm 1
In Algorithm 1, at steps 1-2, all constrained routes Γ′o are abstracted from the directed graph Go centered on the object user o. At steps 3-21, an iterative computation with an inner iteration is carried out. For the corresponding routes of each round, the receiving probability and disseminating probability are successively updated at steps 5-10. The algorithm then finds the nodes whose receiving probabilities are smaller than the first threshold ε, and removes the involved routes from Γ′o. The outer iteration is re-executed until all involved nodes have a steady receiving probability no smaller than ε. At step 22, the subgraph G′o (i.e., the UCD) is built from Γ′o, and at step 23 it is returned.
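The outer pruning loop of Algorithm 1 can be sketched as follows. This is a simplified rendering, not the filing’s exact pseudocode: the per-round probability update (steps 5-10) is abstracted into a caller-supplied function:

```python
def build_ucd(routes, receiving_probs, eps):
    """Iteratively prune constrained routes whose interior nodes have a
    receiving probability below eps, recomputing probabilities over the
    surviving routes, until all remaining nodes are at or above eps.
    routes: list of node lists, each starting at the object user
    receiving_probs(routes) -> dict node -> receiving probability"""
    while True:
        R = receiving_probs(routes)
        # the last node of a route is allowed to fall below eps
        kept = [r for r in routes if all(R[n] >= eps for n in r[1:-1])]
        if len(kept) == len(routes):
            break
        routes = kept
    nodes = {n for r in routes for n in r}
    edges = {(r[i], r[i + 1]) for r in routes for i in range(len(r) - 1)}
    return nodes, edges   # the subgraph G'_o, i.e. the UCD
```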
After obtaining the UCD, the estimator 402 may assess impact of a direct friend of the object user on the overall dissemination of the privacy information in the UCD. The direct friend is a friend having a link with the object user in the UCD.
In an example, the estimator 402 can assess the impact of the direct friend on the disseminating intensity of the UCD from at least the disseminating probability and media capacity of each user in the UCD. The disseminating intensity measures the intensity with which information originating from an object user propagates in the UCD. For example, the disseminating intensity may involve all the nodes’ contributions to the propagation in the UCD. The disseminating probability can be determined by using equation (2) . The media capacity determines how widely one user makes certain privacy information visible to others. The media capacity may relate to the user’s topological status within the UCD he belongs to, which may generally be quantified by network centrality or any other appropriate measure. The disseminating intensity of the UCD may be used to measure how widely and how deeply privacy information originating from the object user is propagated within the UCD.
In an example, media capacity Sv of a user v may be based on his topological status in the UCD. It may be illustrated in the following equation:
Sv = ( Σa<b, a≠v, b≠v rab (v) ) / ( (h−1) (h−2) / 2 )
where h is the total number of nodes within the UCD containing user v, and users a, b and v satisfy a≠b≠v and a<b. rab (v) is the routing ratio, i.e., the ratio of the number of routes through user v to the total number of routes between users a and b. It is noted that the media capacity can also be calculated by using any other suitable method, and the present embodiment has no limitation on it.
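Reading Sv as a normalized betweenness-style centrality (one interpretation of the routing-ratio definition above), it can be computed by brute force on a small graph. The names and the loop-free-route enumeration below are illustrative:

```python
from itertools import combinations

def simple_routes(edges, a, b):
    """All loop-free directed routes from a to b."""
    out, stack = [], [[a]]
    while stack:
        path = stack.pop()
        if path[-1] == b:
            out.append(path)
            continue
        for nxt in edges.get(path[-1], []):
            if nxt not in path:
                stack.append(path + [nxt])
    return out

def media_capacity(edges, nodes, v):
    """S_v: average over unordered pairs (a, b), a != b != v, of the
    routing ratio r_ab(v) -- the share of a->b routes passing through
    v -- normalized by the (h-1)(h-2)/2 pairs."""
    h = len(nodes)
    if h < 3:
        return 0.0
    total = 0.0
    for a, b in combinations(sorted(n for n in nodes if n != v), 2):
        routes = simple_routes(edges, a, b)
        if routes:
            total += sum(v in r for r in routes) / len(routes)
    return total / ((h - 1) * (h - 2) / 2)
```

For instance, in a three-node graph where one of the two a→b routes passes through v, Sv = 0.5.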
In an example, the disseminating intensity of the UCD may be calculated by considering both the disseminating probability Dφi and the media capacity Sφi of each node φi in the UCD. The disseminating intensity Ωo of the UCD centered on an object user o may be given by the following equation:

Ωo = Σφi∈UCD Dφi · (Sφi + 1)     (4)

where the term (Sφi + 1) is intended to avoid a zero value of the disseminating intensity when, for example, the nodes have zero media capacities. It is noted that the disseminating intensity Ωo can also be calculated by using any other suitable method, and the present embodiment has no limitation on it.
After calculating the disseminating intensity Ωo of the UCD, the estimator 402 may construct a second UCD by removing only one direct friend, such as ηi, of the object user o from the UCD. The estimator 402 may then calculate a second disseminating intensity of the second UCD by using equation (4) . The estimator 402 may assess the impact of the direct friend ηi of the object user o on the overall dissemination of the privacy information in the UCD by the following equation:

Cont (ηi) = Ωo − Ωo\ηi

where Ωo\ηi represents the second disseminating intensity of the second UCD generated by removing only the direct friend ηi of the object user o from the UCD. In this way, the estimator 402 may assess the impact, or disseminating contribution, of each direct friend ηi of the object user o on the overall dissemination of the privacy information in the UCD.
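A sketch of the intensity and the leave-one-out contribution, assuming node contributions to the intensity (call it Ωo) are summed — an assumption consistent with the role of the (Sφi + 1) term:

```python
def disseminating_intensity(d_probs, media_caps):
    """Omega_o: summed node contributions D_phi * (S_phi + 1); the +1
    keeps the intensity non-zero when media capacities are all zero."""
    return sum(d * (s + 1.0) for d, s in zip(d_probs, media_caps))

def friend_contribution(omega_full, omega_without_friend):
    """Drop in the UCD's disseminating intensity when a single direct
    friend is removed: that friend's disseminating contribution."""
    return omega_full - omega_without_friend
```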
After estimating the disseminating contribution of each direct friend of the object user, the estimator 402 can provide it to the identifying element 404.
The identifying element 404 may be configured to identify the vulnerable friend based on the estimated contributions towards dissemination of the privacy information.
For example, the identifying element 404 can compare the disseminating contribution of each direct friend of the object user with a second threshold to identify the vulnerable friend of the object user. If the disseminating contribution of a direct friend of the object user is above the second threshold, this direct friend will be identified as a vulnerable friend. In another example, the identifying element 404 can rank the disseminating contributions of the direct friends of the object user in descending order, and then identify the top n direct friends as vulnerable friends.
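Both identification strategies can be sketched in a few lines (illustrative names; the contribution values are those estimated above):

```python
def identify_vulnerable_friends(contributions, threshold=None, top_n=None):
    """contributions: dict direct friend -> disseminating contribution.
    Either flag every friend above a second threshold, or rank the
    friends by contribution and return the top n."""
    if threshold is not None:
        return [f for f, c in contributions.items() if c > threshold]
    ranked = sorted(contributions, key=contributions.get, reverse=True)
    return ranked[:top_n]
```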
Figure 5 is a simplified block diagram illustrating an apparatus 500 according to another embodiment of the present disclosure. Similar components are denoted with similar numbers in Figure 4. For brevity, the description of similar components is omitted here.
As shown in Figure 5, the apparatus 500 further comprises a storage device 512 configured to store the information regarding the vulnerable friend of the object user. The information can contain the identifier of the vulnerable friend, his disseminating contribution or other suitable information. The storage device 512 can be any kind of computer-readable storage, such as a hard disk, CDROM, DVD, SSD, a phase change memory (PCM) , a random access memory (RAM) , a read-only memory (ROM) , an erasable programmable read-only memory (EPROM) or Flash memory. For example, the storage device 512 can store this information from the identifying element 404 if the object user is not currently online, or the provision of this information to the object user can be delayed. In this way, the apparatus 500 can perform the identifying process at any suitable time.
The apparatus 500 further comprises a provider 508 configured to provide the information regarding the vulnerable friend of the object user to the object user and/or one or more applications. The information can contain the identifier of the vulnerable friend, his disseminating contribution or other suitable information. According to an embodiment, the provider 508 can obtain this information from the identifying element 404 or from the storage device 512. The providing may be the result of a pull or push action. For example, if the object user wishes to evaluate the vulnerability of his friends, he can request this information from the apparatus 500. In this case, the provider 508 can provide this information by retrieving it from the storage device 512 if the identifying element 404 has stored it there, or can wait for the identifying element 404 to provide it if the identifying element 404 calculates it in real time. In another example, the provider 508 may also push this information to the object user, for example when the information has been updated or at a predefined time, etc. After obtaining this information, the object user and/or one or more applications can, for example, set the privacy setting of the object user based on the information regarding the vulnerable friend of the object user.
The apparatus 500 further comprises a configurator 510 configured to configure the privacy setting of the object user based on the vulnerable friend of the object user. For example, the configurator 510 can unfriend the friend of the object user with the highest disseminating contribution, or apply a stricter privacy setting to this friend, such as hiding some private information from him. It will be appreciated by those of ordinary skill in the art that there are other ways to configure the privacy setting of the object user based on the vulnerable friend of the object user.
The apparatus 500 further comprises an updater (not shown) configured to update the information regarding the vulnerable friend of the object user when a predefined condition is satisfied. The predefined condition may include a predetermined time, a certain interval, a trigger by an object user or an application, a time when resources are abundant, a time when the power cost is low, a large change in the social network, etc. For example, the updater (not shown) may update the information regarding the vulnerable friend of the object user at midnight, since the power cost may then be low and the computing resources abundant. It will be appreciated by those of ordinary skill in the art that there are other ways to update the information regarding the vulnerable friend of the object user.
Figure 6 is a flow chart of a process 600 for identifying a vulnerable friend for privacy protection of an object user in the object user’s online social network according to an embodiment of the present disclosure, wherein the object user has a plurality of friends in the social network. The process 600 can be performed by the apparatus 400 shown in Figure 4. As shown in Figure 6, the process 600 starts at step 602. At step 602, for at least some of the object user’s friends, the apparatus 400 may estimate their contributions towards dissemination of the object user’s privacy information based on their behaviors in the social network. The behaviors may include privacy-protection consciousness, privacy leaking tendency, attitude of worship towards the object user, concern frequency towards the object user, etc. The at least some of the object user’s friends may include those friends whose contributions towards dissemination of the object user’s privacy information the object user wishes to estimate. In an example, the at least some of the object user’s friends may include all of the object user’s friends.
In this embodiment, step 602 may include a step of determining an ultimate circle of disseminating (UCD) that includes those users who are involved in dissemination of the privacy information. This aspect has been described above with other embodiments. For brevity, the description of this aspect is omitted here.
In an example where the social network is represented by a directed graph, in which a node represents a user and an edge represents a following relationship between two users, the step of determining the UCD may include a step of calculating a first probability that a user will disseminate the privacy information if that user receives the information; and a step of calculating a second probability that the user will access the privacy information. This aspect has been described above with other embodiments. For brevity, the description of this aspect is omitted here.
In an example, the step of calculating a first probability comprises calculating the first probability based on the user’s privacy-protection consciousness, privacy leaking tendency and attitude of worship towards the object user. This aspect has been  described above with other embodiments. For brevity, the description of this aspect is omitted here.
In an example, the step of calculating a second probability comprises: calculating the second probability of the user based on at least the first probability of an upper-stream user in the disseminating path and that user’s concern frequency towards the upper-stream user. This aspect has been described above with other embodiments. For brevity, the description of this aspect is omitted here.
In an example, the step of determining UCD comprises: repeating the steps of calculating a second probability and calculating a first probability for users in a possible disseminating path until the calculated second probability is less than a first threshold. This aspect has been described above with other embodiments. For brevity, the description of this aspect is omitted here.
According to an example, the step 602 may include a step of assessing impact of a direct friend of the object user on the overall dissemination of the privacy information in the UCD. This aspect has been described above with other embodiments. For brevity, the description of this aspect is omitted here.
In an example, the step of assessing impact comprises: assessing impact of the direct friend on disseminating intensity of the UCD, which is determined from at least the first probability and media capacity of each user in the UCD. This aspect has been described above with other embodiments. For brevity, the description of this aspect is omitted here.
After estimating the contribution of each direct friend of the object user, at step 604, the apparatus 400 may identify the vulnerable friend based on the estimated contributions towards dissemination of the privacy information.
For example, at step 604, the apparatus 400 can compare the disseminating contribution of each direct friend of the object user with a second threshold to identify the vulnerable friend of the object user. If the disseminating contribution of a direct friend of the object user is above the second threshold, this direct friend will be identified as a vulnerable friend. In another example, the apparatus 400 can rank the disseminating contributions of the direct friends of the object user in descending order, and then identify the top n direct friends as vulnerable friends.
In an embodiment, the process 600 can include a storing step for storing the information regarding the vulnerable friend of the object user. The information can contain the identifier of the vulnerable friend, his disseminating contribution or other suitable information. For example, the storing step may store the information regarding the vulnerable friend of the object user if the object user is not currently online, or if the provision of this information to the object user is to be delayed. In this way, the process 600 can be performed at any suitable time.
In an embodiment, the process 600 can include a providing step for providing the information regarding the vulnerable friend of the object user to the object user and/or one or more applications. The information can contain the identifier of the vulnerable friend, his disseminating contribution or other suitable information. According to an embodiment, at the providing step, this information can be obtained from step 604 or from a storage such as the storage device 512 shown in Figure 5. The providing may be the result of a pull or push action. For example, if an object user wishes to evaluate the vulnerability of his friends, he can request this information. In this case, at the providing step, this information can be provided by retrieving it from the storage if it has been stored in the storage device 512, or by receiving it from step 604 if this information is calculated in real time. In another example, at the providing step, this information may also be pushed to the object user, for example when the information has been updated or at a predefined time, etc. After obtaining this information, the object user and/or one or more applications can, for example, set the privacy setting of the object user based on the information regarding the vulnerable friend of the object user.
In an embodiment, the process 600 can include a configuring step for configuring the privacy setting of the object user based on the vulnerable friend of the object user. For example, at the configuring step, the apparatus 500 can unfriend the friend of the object user with the highest disseminating contribution, or apply a stricter privacy setting to this friend, such as hiding some private information from him. It will be appreciated by those of ordinary skill in the art that there are other ways to configure the privacy setting of the object user based on the one or more vulnerable friends of the object user.
In an embodiment, the process 600 can include an updating step for updating the information regarding the vulnerable friend of the object user in the social network when a predefined condition is satisfied. The predefined condition may include a predetermined time, a certain interval, a trigger by an object user or an application, a time when resources are abundant, a time when the power cost is low, a large change in the social network, etc. For example, at the updating step, the apparatus 500 may update the information regarding the vulnerable friend of the object user in the social network at midnight, since the power cost may then be low and the computing resources abundant. It will be appreciated by those of ordinary skill in the art that there are other ways to update the one or more vulnerable friends of the object user in the social network.
According to an aspect of the disclosure, there is provided an apparatus for identifying a vulnerable friend for privacy protection of an object user in the object user’s online social network, wherein the object user has a plurality of friends in the social network. Said apparatus comprises means configured to carry out the methods or processes described above. In an embodiment, the apparatus comprises means configured to, for at least some of the object user’s friends, estimate their contributions towards dissemination of the object user’s privacy information based on their behaviors in the social network, and means configured to identify the vulnerable friend based on the estimated contributions towards dissemination of the privacy information.
According to an embodiment, the apparatus can further comprise means configured to determine an ultimate circle of disseminating (UCD) that includes those users who are involved in the dissemination of the privacy information; and means configured to assess the impact of a direct friend of the object user on the overall dissemination of the privacy information in the UCD.
According to an embodiment, wherein the social network is represented by a directed graph, in which a node represents a user and an edge represents a following relationship between two users, the apparatus further comprises means configured to calculate a first probability that a user will disseminate the privacy information if that user receives the information; and means configured to calculate a second probability that the user will access the privacy information.
According to an embodiment, the apparatus further comprises means configured to calculate the first probability based on the user’s privacy-protection consciousness, privacy-leaking tendency and attitude of worship towards the object user.
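As a minimal sketch of this calculation: the disclosure names the three behavioral factors but does not fix a formula, so the combination below (averaging the two dissemination-favoring factors and damping by the privacy-protection consciousness) is purely an assumed, illustrative choice, as is the function name.

```python
def first_probability(consciousness, leaking_tendency, worship):
    """Illustrative first probability: likelihood that a user, upon
    receiving the object user's privacy information, disseminates it.
    Inputs are assumed normalized to [0, 1]; the combining formula is
    an assumption, not part of the disclosure."""
    # Average of the dissemination-favoring factors, damped by the
    # user's privacy-protection consciousness; clamped to [0, 1].
    p = (leaking_tendency + worship) / 2.0 * (1.0 - consciousness)
    return min(max(p, 0.0), 1.0)
```

Under this sketch, a highly privacy-conscious user yields a small first probability even if his leaking tendency and worship of the object user are high.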
According to an embodiment, the apparatus further comprises means configured to calculate the second probability of the user based on at least the first probability of an upper-stream user in the disseminating path and that user’s concern frequency towards the upper-stream user.
According to an embodiment, the apparatus further comprises means configured to repeat the steps of calculating a second probability and calculating a first probability for users in a possible disseminating path until the calculated second probability is less than a first threshold.
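The repeated calculation along disseminating paths can be sketched as a breadth-first expansion of the follower graph that stops a path once the second probability falls below the threshold. The data layout (`followers[u]` listing u's followers, `concern[(v, u)]` as v's concern frequency towards upper-stream user u) and the multiplicative combination are assumptions for illustration only.

```python
from collections import deque

def determine_ucd(followers, first_prob, concern, object_user, threshold):
    """Sketch of UCD determination. The second probability of a user v
    is modeled as: (access probability of the upper-stream user u) x
    (u's first probability) x (v's concern frequency towards u) - an
    illustrative combination, not mandated by the disclosure."""
    second = {object_user: 1.0}   # the object user trivially accesses his own information
    ucd = {object_user}
    queue = deque([object_user])
    while queue:
        u = queue.popleft()
        for v in followers.get(u, []):
            p2 = second[u] * first_prob.get(u, 0.0) * concern.get((v, u), 0.0)
            # Expand only while the second probability stays at or above
            # the first threshold, keeping the best path to each user.
            if p2 >= threshold and p2 > second.get(v, 0.0):
                second[v] = p2
                ucd.add(v)
                queue.append(v)
    return ucd, second
```

Raising the threshold shrinks the UCD: users reachable only through long or weakly-concerned paths drop out first.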
According to an embodiment, the apparatus further comprises means configured to assess the impact of the direct friend on the disseminating intensity of the UCD, which is determined from at least the first probability and the media capacity of each user in the UCD.
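A hedged sketch of this assessment: the additive intensity formula (first probability weighted by media capacity, e.g. follower count) and the marginal-contribution comparison below are assumptions; `ucd_without[f]`, the UCD recomputed with friend f excluded, is a hypothetical helper input, not an element of the disclosure.

```python
def disseminating_intensity(ucd, first_prob, media_capacity):
    """Illustrative disseminating intensity of a UCD: each member
    contributes his first probability weighted by his media capacity.
    The additive form is an assumption."""
    return sum(first_prob.get(u, 0.0) * media_capacity.get(u, 0.0) for u in ucd)

def disseminating_contributions(friends, full_ucd, ucd_without, first_prob, media_capacity):
    """Score each direct friend by the drop in UCD intensity when that
    friend is excluded from the dissemination (marginal impact)."""
    base = disseminating_intensity(full_ucd, first_prob, media_capacity)
    return {f: base - disseminating_intensity(ucd_without[f], first_prob, media_capacity)
            for f in friends}
```

The vulnerable friend can then be identified as the friend maximizing this contribution, e.g. `max(contrib, key=contrib.get)`.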
According to an embodiment, the apparatus further comprises means configured to store the information regarding the vulnerable friend.
According to an embodiment, the apparatus further comprises means configured to update the information regarding the vulnerable friend when a predefined condition is satisfied.
In an embodiment, the apparatus further comprises means configured to provide the information regarding the vulnerable friend to the object user and/or one or more applications.
In an embodiment, the apparatus further comprises means configured to configure the privacy setting of the object user based on the vulnerable friend.
It is noted that any of the components of the apparatus 400, 500 depicted in Figures 4 and 5 can be implemented as hardware or software modules. In the case of software modules, they can be embodied on a tangible computer-readable recordable storage medium. All of the software modules (or any subset thereof) can be on the same medium, or each can be on a different medium, for example. The software modules can run, for example, on a hardware processor. The method steps can then be carried out using the distinct software modules, as described above, executing on a hardware processor.
Additionally, an aspect of the disclosure can make use of software running on a general purpose computer or workstation. Such an implementation might employ, for example, a processor, a memory, and an input/output interface formed, for example, by a display and a keyboard. The term “processor” as used herein is intended to include any processing device, such as, for example, one that includes a CPU (central processing unit) and/or other forms of processing circuitry. Further, the term “processor” may refer to more than one individual processor. The term “memory” is intended to include memory associated with a processor or CPU, such as, for example, RAM (random access memory) , ROM (read only memory) , a fixed memory device (for example, hard drive) , a removable memory device (for example, diskette) , a flash memory and the like. The processor, memory, and input/output  interface such as display and keyboard can be interconnected, for example, via bus as part of a data processing unit. Suitable interconnections, for example via bus, can also be provided to a network interface, such as a network card, which can be provided to interface with a computer network, and to a media interface, such as a diskette or CD-ROM drive, which can be provided to interface with media.
Accordingly, computer software including instructions or code for performing the methodologies of the disclosure, as described herein, may be stored in associated memory devices (for example, ROM, fixed or removable memory) and, when ready to be utilized, loaded in part or in whole (for example, into RAM) and implemented by a CPU. Such software could include, but is not limited to, firmware, resident software, microcode, and the like.
As noted, aspects of the disclosure may take the form of a computer program product embodied in a computer readable medium having computer readable program code embodied thereon. Also, any combination of computer readable media may be utilized. The computer readable medium may be a computer readable signal medium or a computer readable storage medium. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples (a non-exhaustive list) of the computer readable storage medium would include the following: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a random access memory (RAM) , a read-only memory (ROM) , an erasable programmable read-only memory (EPROM or Flash memory) , an optical fiber, a portable compact disc read-only memory (CD-ROM) , an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this document, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
Computer program code for carrying out operations for aspects of the disclosure may be written in any combination of at least one programming language, including an object oriented programming language such as Java, Smalltalk, C++ or the like and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer or entirely on the remote computer or server.
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, component, segment, or portion of code, which comprises at least one executable instruction for implementing the specified logical function (s) . It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems that perform the specified  functions or acts, or combinations of special purpose hardware and computer instructions.
In any case, it should be understood that the components illustrated in this disclosure may be implemented in various forms of hardware, software, or combinations thereof, for example, application specific integrated circuit (s) (ASICS) , functional circuitry, an appropriately programmed general purpose digital computer with associated memory, and the like. Given the teachings of the disclosure provided herein, one of ordinary skill in the related art will be able to contemplate other implementations of the components of the disclosure.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the disclosure. As used herein, the singular forms “a, ” “an” and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms “comprises” , “containing” and/or “comprising, ” when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of another feature, integer, step, operation, element, component, and/or group thereof.
The descriptions of the various embodiments have been presented for purposes of illustration, but are not intended to be exhaustive or limited to the embodiments disclosed. Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the described embodiments.

Claims (25)

  1. A method for identifying a vulnerable friend for privacy protection of an object user in the object user’s online social network, wherein the object user has a plurality of friends in the social network, the method comprising:
    for at least some of the object user’s friends, estimating their contributions towards dissemination of the object user’s privacy information based on their behaviors in the social network; and
    identifying the vulnerable friend based on the estimated contributions towards dissemination of the privacy information.
  2. The method according to claim 1, wherein the step of estimating comprises:
    determining an ultimate circle of disseminating (UCD) that includes those users who are involved in dissemination of the privacy information; and
    assessing impact of a direct friend of the object user on the overall dissemination of the privacy information in the UCD.
  3. The method according to claim 2, wherein the social network is represented by a directed graph, in which a node represents a user and an edge represents a following relationship between two users; the step of determining UCD comprises:
    calculating a first probability that a user will disseminate the privacy information if that user receives the information; and
    calculating a second probability that the user will access the privacy information.
  4. The method according to claim 3, wherein the step of calculating a first probability comprises:
    calculating the first probability based on the user’s privacy-protection consciousness, privacy leaking tendency and attitude of worship towards the object user.
  5. The method according to claim 3 or 4, wherein the step of calculating a second probability comprises:
    calculating the second probability of the user based on at least the first probability of an upper-stream user in the disseminating path and that user’s concern frequency towards the upper-stream user.
  6. The method according to any one of claims 3 to 5, wherein the step of determining UCD comprises:
    repeating the steps of calculating a second probability and calculating a first probability for users in a possible disseminating path until the calculated second probability is less than a first threshold.
  7. The method according to any one of claims 2 to 6, wherein the step of assessing impact comprises:
    assessing impact of the direct friend on disseminating intensity of the UCD, which is determined from at least the first probability and media capacity of each user in the UCD.
  8. The method according to any one of claims 1 to 7, further comprising:
    storing the information regarding the vulnerable friend.
  9. The method according to any one of claims 1 to 8, further comprising:
    updating the information regarding the vulnerable friend when satisfying a predefined condition.
  10. The method according to any one of claims 1 to 9, further comprising:
    providing the information regarding the vulnerable friend to the object user and/or one or more applications.
  11. The method according to any one of claims 1 to 10, further comprising:
    configuring the privacy setting of the object user based on the vulnerable friend.
  12. An apparatus, comprising means configured to carry out the method according to any one of claims 1 to 11.
  13. A computer program product embodied on a distribution medium readable by a computer and comprising program instructions which, when loaded into a computer, execute the method according to any one of claims 1 to 11.
  14. A non-transitory computer readable medium having encoded thereon statements and instructions to cause a processor to execute a method according to any one of claims 1 to 11.
  15. An apparatus for identifying a vulnerable friend for privacy protection of an object user in the object user’s online social network, wherein the object user has a plurality of friends in the social network, said apparatus comprising:
    an estimator configured to, for at least some of the object user’s friends, estimate their contributions towards dissemination of the object user’s privacy information based on their behaviors in the social network; and
    an identifying element configured to identify the vulnerable friend based on the estimated contributions towards dissemination of the privacy information.
  16. The apparatus according to claim 15, wherein the estimator is further configured to:
    determine an ultimate circle of disseminating (UCD) that includes those users who are involved in dissemination of the privacy information; and
    assess impact of a direct friend of the object user on the overall dissemination of the privacy information in the UCD.
  17. The apparatus according to claim 16, wherein the social network is represented by a directed graph, in which a node represents a user and an edge represents a following relationship between two users; said determining UCD comprises the estimator configured to:
    calculate a first probability that a user will disseminate the privacy information if that user receives the information; and
    calculate a second probability that the user will access the privacy information.
  18. The apparatus according to claim 17, wherein said calculating a first probability comprises the estimator configured to:
    calculate the first probability based on the user’s privacy-protection consciousness, privacy leaking tendency and attitude of worship towards the object user.
  19. The apparatus according to claim 17 or 18, wherein said calculating a second probability comprises the estimator configured to:
    calculate the second probability of the user based on at least the first probability of an upper-stream user in the disseminating path and that user’s concern frequency towards the upper-stream user.
  20. The apparatus according to any one of claims 17 to 19, wherein said determining UCD comprises the estimator configured to:
    repeat the calculation of a second probability and of a first probability for users in a possible disseminating path until the calculated second probability is less than a first threshold.
  21. The apparatus according to any one of claims 16 to 20, wherein said assessing impact comprises the estimator configured to:
    assess impact of the direct friend on disseminating intensity of the UCD, which is determined from at least the first probability and media capacity of each user in the UCD.
  22. The apparatus according to any one of claims 15 to 21, further comprising:
    a storage device configured to store the information regarding the vulnerable friend.
  23. The apparatus according to any one of claims 15 to 22, further comprising:
    an updater configured to update the information regarding the vulnerable friend when satisfying a predefined condition.
  24. The apparatus according to any one of claims 15 to 23, further comprising:
    a provider configured to provide the information regarding the vulnerable friend to the object user and/or one or more applications.
  25. The apparatus according to any one of claims 15 to 24, further comprising:
    a configurator configured to configure the privacy setting of the object user based on the vulnerable friend.
PCT/CN2015/075102 2015-03-26 2015-03-26 Method, apparatus and computer program product for identifying a vulnerable friend for privacy protection in a social network WO2016149929A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/CN2015/075102 WO2016149929A1 (en) 2015-03-26 2015-03-26 Method, apparatus and computer program product for identifying a vulnerable friend for privacy protection in a social network


Publications (1)

Publication Number Publication Date
WO2016149929A1 (en) 2016-09-29

Family

ID=56977698

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2015/075102 WO2016149929A1 (en) 2015-03-26 2015-03-26 Method, apparatus and computer program product for identifying a vulnerable friend for privacy protection in a social network

Country Status (1)

Country Link
WO (1) WO2016149929A1 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107886441A (en) * 2017-10-18 2018-04-06 中国科学院计算技术研究所 A kind of social networks vulnerability assessment method and system
CN108390865A (en) * 2018-01-30 2018-08-10 南京航空航天大学 A kind of fine-grained access control mechanisms and system based on privacy driving

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103562929A (en) * 2011-04-05 2014-02-05 阿尔卡特朗讯 Method of parameterizing rules for broadcasting personal data
CN104156388A (en) * 2014-06-26 2014-11-19 西安邮电大学 Collaborative filtering recommendation method based on trustful privacy maintenance in personalized search




Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 15885868; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: DE)
122 Ep: pct application non-entry in european phase (Ref document number: 15885868; Country of ref document: EP; Kind code of ref document: A1)