US20140007255A1 - Privacy Control in a Social Network - Google Patents

Privacy Control in a Social Network Download PDF

Info

Publication number
US20140007255A1
Authority
US
United States
Prior art keywords
user
private data
post
data
information handling
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US13/535,920
Other languages
English (en)
Inventor
Faheem Altaf
Steven Duane Clay
Eduardo N. Spring
Shunguo Yan
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
International Business Machines Corp
Original Assignee
International Business Machines Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by International Business Machines Corp filed Critical International Business Machines Corp
Priority to US13/535,920 priority Critical patent/US20140007255A1/en
Assigned to INTERNATIONAL BUSINESS MACHINES CORPORATION. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: SPRING, EDUARDO N.; ALTAF, FAHEEM; CLAY, STEVEN DUANE; YAN, SHUNGUO
Priority to US13/746,213 priority patent/US8955153B2/en
Priority to PCT/IB2013/055272 priority patent/WO2014002041A2/en
Priority to EP13810012.8A priority patent/EP2867812A4/de
Priority to CN201380027237.2A priority patent/CN104350505B/zh
Publication of US20140007255A1 publication Critical patent/US20140007255A1/en
Abandoned legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/60Protecting data
    • G06F21/62Protecting access to data via a platform, e.g. using keys or access control rules
    • G06F21/6218Protecting access to data via a platform, e.g. using keys or access control rules to a system of files or objects, e.g. local or distributed file system or database
    • G06F21/6245Protecting personal data, e.g. for financial or medical purposes
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L63/00Network architectures or network communication protocols for network security
    • H04L63/04Network architectures or network communication protocols for network security for providing a confidential data exchange among entities communicating through data packet networks
    • H04L63/0407Network architectures or network communication protocols for network security for providing a confidential data exchange among entities communicating through data packet networks wherein the identity of one or more communicating identities is hidden
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L63/00Network architectures or network communication protocols for network security
    • H04L63/10Network architectures or network communication protocols for network security for controlling access to devices or network resources
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04WWIRELESS COMMUNICATION NETWORKS
    • H04W12/00Security arrangements; Authentication; Protecting privacy or anonymity
    • H04W12/02Protecting privacy or anonymity, e.g. protecting personally identifiable information [PII]
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L63/00Network architectures or network communication protocols for network security
    • H04L63/02Network architectures or network communication protocols for network security for separating internal from external traffic, e.g. firewalls
    • H04L63/0227Filtering policies
    • H04L63/0245Filtering by information in the payload

Definitions

  • the present disclosure relates to an approach that assists in protecting private data in a social network.
  • Privacy issues in a social network usually involve system breaches or exploitation of a user's private information.
  • a privacy violation can also occur during normal social interactions when sensitive data is relayed to a person who is outside of the originally-targeted group.
  • One example is that after a user shares his personal information (e.g., a picture, phone number) with a close circle of friends, a member of that group may accidentally or intentionally expose that sensitive information to users outside of the group.
  • An approach is provided for privacy control in a social network.
  • a first post is posted from a first user to a second user in the social network with the first post including private data belonging to the first user.
  • Subsequent postings are monitored for the first user's private data.
  • privacy controls are performed.
  • the privacy controls mask the first user's private data from the third user so that the first user's private data is inaccessible (not visible) to the third user.
  • FIG. 2 provides an extension of the information handling system environment shown in FIG. 1 to illustrate that the methods described herein can be performed on a wide variety of information handling systems which operate in a networked environment;
  • FIG. 3 is a component diagram showing various components involved in protecting users' private data
  • FIG. 4 is a flowchart showing steps taken to protect private data during the posting of content to a social network
  • FIG. 5 is a continuation of the flowchart shown in FIG. 4 and shows continued scanning steps taken to protect private data during the posting of content to a social network;
  • FIG. 6 is a flowchart showing steps taken to counteract a suspected privacy breach when such breach is detected
  • FIG. 7 is a flowchart showing steps taken to handle a private data release request received by an owner of the private data.
  • FIG. 8 is a flowchart showing processing by a recipient of a content post to a social network to selectively show other users' private data based on privacy control settings.
  • aspects of the present disclosure may be embodied as a system, method or computer program product. Accordingly, aspects of the present disclosure may take the form of an entirely hardware embodiment, an entirely software embodiment (including firmware, resident software, micro-code, etc.) or an embodiment combining software and hardware aspects that may all generally be referred to herein as a “circuit,” “module” or “system.” Furthermore, aspects of the present disclosure may take the form of a computer program product embodied in one or more computer readable medium(s) having computer readable program code embodied thereon.
  • the computer readable medium may be a computer readable signal medium or a computer readable storage medium.
  • a computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing.
  • a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
  • a computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof.
  • a computer readable signal medium may be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
  • Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
  • Computer program code for carrying out operations for aspects of the present disclosure may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, Smalltalk, C++ or the like and conventional procedural programming languages, such as the “C” programming language or similar programming languages.
  • the program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer or entirely on the remote computer or server.
  • the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet Service Provider).
  • These computer program instructions may also be stored in a computer readable medium that can direct a computer, other programmable data processing apparatus, or other devices to function in a particular manner, such that the instructions stored in the computer readable medium produce an article of manufacture including instructions which implement the function/act specified in the flowchart and/or block diagram block or blocks.
  • the computer program instructions may also be loaded onto a computer, other programmable data processing apparatus, or other devices to cause a series of operational steps to be performed on the computer, other programmable apparatus or other devices to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide processes for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.
  • A computing environment suitable for implementing the software and/or hardware techniques associated with the invention is shown in FIG. 1 .
  • A networked environment is illustrated in FIG. 2 as an extension of the basic computing environment, to emphasize that modern computing techniques can be performed across multiple discrete devices.
  • FIG. 1 illustrates information handling system 100 , which is a simplified example of a computer system capable of performing the computing operations described herein.
  • Information handling system 100 includes one or more processors 110 coupled to processor interface bus 112 .
  • Processor interface bus 112 connects processors 110 to Northbridge 115 , which is also known as the Memory Controller Hub (MCH).
  • Northbridge 115 connects to system memory 120 and provides a means for processor(s) 110 to access the system memory.
  • Graphics controller 125 also connects to Northbridge 115 .
  • PCI Express bus 118 connects Northbridge 115 to graphics controller 125 .
  • Graphics controller 125 connects to display device 130 , such as a computer monitor.
  • Northbridge 115 and Southbridge 135 connect to each other using bus 119 .
  • the bus is a Direct Media Interface (DMI) bus that transfers data at high speeds in each direction between Northbridge 115 and Southbridge 135 .
  • a Peripheral Component Interconnect (PCI) bus connects the Northbridge and the Southbridge.
  • Southbridge 135 also known as the I/O Controller Hub (ICH) is a chip that generally implements capabilities that operate at slower speeds than the capabilities provided by the Northbridge.
  • Southbridge 135 typically provides various busses used to connect various components. These busses include, for example, PCI and PCI Express busses, an ISA bus, a System Management Bus (SMBus or SMB), and/or a Low Pin Count (LPC) bus.
  • the LPC bus often connects low-bandwidth devices, such as boot ROM 196 and “legacy” I/O devices (using a “super I/O” chip).
  • the “legacy” I/O devices ( 198 ) can include, for example, serial and parallel ports, keyboard, mouse, and/or a floppy disk controller.
  • the LPC bus also connects Southbridge 135 to Trusted Platform Module (TPM) 195 .
  • Other components often included in Southbridge 135 include a Direct Memory Access (DMA) controller, a Programmable Interrupt Controller (PIC), and a storage device controller, which connects Southbridge 135 to nonvolatile storage device 185 , such as a hard disk drive, using bus 184 .
  • ExpressCard 155 is a slot that connects hot-pluggable devices to the information handling system.
  • ExpressCard 155 supports both PCI Express and USB connectivity as it connects to Southbridge 135 using both the Universal Serial Bus (USB) and the PCI Express bus.
  • Southbridge 135 includes USB Controller 140 that provides USB connectivity to devices that connect to the USB. These devices include webcam (camera) 150 , infrared (IR) receiver 148 , keyboard and trackpad 144 , and Bluetooth device 146 , which provides for wireless personal area networks (PANs).
  • USB Controller 140 also provides USB connectivity to other miscellaneous USB connected devices 142 , such as a mouse, removable nonvolatile storage device 145 , modems, network cards, ISDN connectors, fax, printers, USB hubs, and many other types of USB connected devices. While removable nonvolatile storage device 145 is shown as a USB-connected device, removable nonvolatile storage device 145 could be connected using a different interface, such as a Firewire interface, etcetera.
  • Wireless Local Area Network (LAN) device 175 connects to Southbridge 135 via the PCI or PCI Express bus 172 .
  • LAN device 175 typically implements one of the IEEE 802.11 standards of over-the-air modulation techniques that all use the same protocol to wirelessly communicate between information handling system 100 and another computer system or device.
  • Optical storage device 190 connects to Southbridge 135 using Serial ATA (SATA) bus 188 .
  • Serial ATA adapters and devices communicate over a high-speed serial link.
  • the Serial ATA bus also connects Southbridge 135 to other forms of storage devices, such as hard disk drives.
  • Audio circuitry 160 such as a sound card, connects to Southbridge 135 via bus 158 .
  • Audio circuitry 160 also provides functionality such as audio line-in and optical digital audio in port 162 , optical digital output and headphone jack 164 , internal speakers 166 , and internal microphone 168 .
  • Ethernet controller 170 connects to Southbridge 135 using a bus, such as the PCI or PCI Express bus. Ethernet controller 170 connects information handling system 100 to a computer network, such as a Local Area Network (LAN), the Internet, and other public and private computer networks.
  • an information handling system may take many forms.
  • an information handling system may take the form of a desktop, server, portable, laptop, notebook, or other form factor computer or data processing system.
  • an information handling system may take other form factors such as a personal digital assistant (PDA), a gaming device, ATM machine, a portable telephone device, a communication device or other devices that include a processor and memory.
  • the Trusted Platform Module (TPM 195 ) shown in FIG. 1 and described herein to provide security functions is but one example of a hardware security module (HSM). Therefore, the TPM described and claimed herein includes any type of HSM including, but not limited to, hardware security devices that conform to the Trusted Computing Group (TCG) standard entitled “Trusted Platform Module (TPM) Specification Version 1.2.”
  • the TPM is a hardware security subsystem that may be incorporated into any number of information handling systems, such as those outlined in FIG. 2 .
  • FIG. 2 provides an extension of the information handling system environment shown in FIG. 1 to illustrate that the methods described herein can be performed on a wide variety of information handling systems that operate in a networked environment.
  • Types of information handling systems range from small handheld devices, such as handheld computer/mobile telephone 210 to large mainframe systems, such as mainframe computer 270 .
  • Examples of handheld computer 210 include personal digital assistants (PDAs), personal entertainment devices, such as MP3 players, portable televisions, and compact disc players.
  • Other examples of information handling systems include pen, or tablet, computer 220 , laptop, or notebook, computer 230 , workstation 240 , personal computer system 250 , and server 260 .
  • Other types of information handling systems that are not individually shown in FIG. 2 are represented by information handling system 280 .
  • the various information handling systems can be networked together using computer network 200 .
  • Types of computer network that can be used to interconnect the various information handling systems include Local Area Networks (LANs), Wireless Local Area Networks (WLANs), the Internet, the Public Switched Telephone Network (PSTN), other wireless networks, and any other network topology that can be used to interconnect the information handling systems.
  • Many of the information handling systems include nonvolatile data stores, such as hard drives and/or nonvolatile memory.
  • Some of the information handling systems shown in FIG. 2 depict separate nonvolatile data stores (server 260 utilizes nonvolatile data store 265 , mainframe computer 270 utilizes nonvolatile data store 275 , and information handling system 280 utilizes nonvolatile data store 285 ).
  • the nonvolatile data store can be a component that is external to the various information handling systems or can be internal to one of the information handling systems.
  • removable nonvolatile storage device 145 can be shared among two or more information handling systems using various techniques, such as connecting the removable nonvolatile storage device 145 to a USB port or other connector of the information handling systems.
  • FIGS. 3-8 depict an approach that can be executed on an information handling system and computer network as shown in FIGS. 1-2 .
  • a social network privacy control approach identifies and monitors sensitive data to prevent the sensitive data from being relayed beyond the originally-targeted user group.
  • the approach provides for identification of sensitive (private) data from a post.
  • the private data is determined from a user's privacy configuration that defines sensitive data types, patterns and matching instructions.
  • the system monitors subsequent posts (e.g., from any receivers of the original post, etc.) for inclusion of the private data.
  • the subsequent posts are displayed with the private data masked (not visible) using a generated pattern.
  • the original user (the owner of the private data) is sent a notification with the message containing the private data and targeted receivers. If the original user approves a selected set of the targeted receivers for viewing the user's private data, then the private data is unmasked allowing the selected set of targeted receivers to view the original user's private data.
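
The identification and masking steps described above reduce, at their core, to matching a user's configured sensitive-data patterns against post text and substituting a mask for each match. The following is a minimal Python sketch of that idea; the pattern set, mask strings, and function names are illustrative assumptions rather than the patent's actual implementation.

```python
import re

# Illustrative privacy configuration: data types the user deems private,
# each with a detection pattern and a mask used in place of the real value.
PRIVACY_PATTERNS = {
    "phone_number": {
        "pattern": re.compile(r"\b\d{3}-\d{3}-\d{4}\b"),
        "mask": "###-###-####",
    },
    "email": {
        "pattern": re.compile(r"\b[\w.+-]+@[\w-]+\.\w+\b"),
        "mask": "[email withheld]",
    },
}

def detect_private_data(post_text):
    """Return (data_type, matched_value) pairs found in the post text."""
    found = []
    for data_type, cfg in PRIVACY_PATTERNS.items():
        for value in cfg["pattern"].findall(post_text):
            found.append((data_type, value))
    return found

def mask_private_data(post_text, detected):
    """Replace each detected private value with the mask for its data type."""
    for data_type, value in detected:
        post_text = post_text.replace(value, PRIVACY_PATTERNS[data_type]["mask"])
    return post_text

if __name__ == "__main__":
    post = "Call me at 555-123-4567 or write to alice@example.com"
    hits = detect_private_data(post)
    print(mask_private_data(post, hits))
    # Call me at ###-###-#### or write to [email withheld]
```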
  • FIG. 3 is a component diagram showing various components involved in protecting users' private data.
  • Social network 300 includes components that are accessible by individual users ( 310 ) as well as components that, while used to process user requests, provide system ( 350 ) functionality to all of the users of social network 300 .
  • User 310 selects social network features, including privacy features, using process 320 .
  • Privacy features are used as an input to update privacy configuration process 325 .
  • the user's resulting privacy configuration data is stored in privacy profile 330 .
  • privacy profile 330 includes both the user's private data 340 as well as private data markup (masks) 345 which provide masks for the private data.
  • Social network system 350 receives feature selections from the user as inputs to update the user's profile (process 360 ).
  • the updates to the user's profile are used as an input to the system's process used to process users' private data and policies (process 370 ).
  • the system's process used to process users' private data and policies (process 370 ) receives inputs from both the user's privacy configuration (process 325 ) as well as the updates to the user profile (process 360 ).
  • the private data and policies resulting from process 370 drive the social network's privacy engine 380 that controls access to private data owned by the various users of the social network.
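
As a rough mental model of the FIG. 3 components, the user-side privacy profile ( 330 ), private data ( 340 ), and private data markup/masks ( 345 ) can be pictured as a per-user record, while the system-side privacy engine ( 380 ) consults shared stores keyed by owner and recipient. The sketch below is an assumed data layout for illustration only; the patent does not specify a storage format.

```python
from dataclasses import dataclass, field

@dataclass
class PrivacyProfile:
    """Assumed user-side record combining privacy profile 330, private data 340, and masks 345."""
    user_id: str
    sensitive_data_types: list = field(default_factory=list)  # e.g. ["photograph", "phone_number"]
    patterns: dict = field(default_factory=dict)               # data type -> detection pattern
    private_data: list = field(default_factory=list)           # specific private items (data store 340)
    masks: dict = field(default_factory=dict)                  # data type -> mask string (markup 345)

# Assumed system-side stores consulted by the privacy engine (380):
transmitted_private_data = {}     # owner id -> records of private data sent, masks, authorized recipients
recipient_private_data_logs = {}  # recipient id -> private data from others this user may view

profile = PrivacyProfile(
    user_id="user_a",
    sensitive_data_types=["phone_number"],
    patterns={"phone_number": r"\b\d{3}-\d{3}-\d{4}\b"},
    private_data=["555-123-4567"],
    masks={"phone_number": "###-###-####"},
)
```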
  • FIG. 4 is a flowchart showing steps taken to protect private data during the posting of content to a social network. Processing commences at 400 whereupon, at step 405 , a first user (the owner of the private data) requests to post message contents 410 to one or more second users of the social network. At step 415 , the social network's posting process retrieves the user's privacy profile from user's privacy profile data store 330 . At step 420 , message contents 410 being posted by the first user are scanned for the user's private data using user's privacy profile 330 and any predefined user privacy data 340 .
  • User's privacy profile data store 330 describes types of content that the user deems private. These data descriptions are set forth as data types (e.g., image files (photographs), etc.), predetermined patterns (e.g., ###-###-#### as a pattern for phone numbers, etc.), matching instructions, and analytic algorithms.
  • the privacy profile may indicate that photographs included in the user's posts to other social network users are private.
  • the user's private data may include specific data items, such as a particular telephone number, that the user does not want disseminated to unknown parties.
  • Process 420 is a sensitive content analyzer that compares message contents 410 with both the user's privacy profile (data store 330 ) as well as the user's private data (data store 340 ). A copy of any private data detected in message contents 410 is stored in memory area 425 .
  • A decision is made as to whether private data was detected by the sensitive content analyzer process performed at step 420 (decision 430 ). If no private data was found, decision 430 branches to the “no” branch whereupon, at step 432 , the message is posted to the recipients selected by the user and no further privacy processing is performed as the post was not found to contain private data. On the other hand, if private data was identified in the post, then decision 430 branches to the “yes” branch to perform further privacy processing in order to protect the user's private data.
  • a decision is made as to whether the user edited the contents of the post (e.g., to remove or edit the user's private data in the post) at decision 445 . If the user edited the contents of the post, then decision 445 branches to the “yes” branch whereupon processing loops back to re-scan the post contents using the sensitive content analyzer as described above.
  • a mask pattern is generated for each private data identified in the user's post. For example, a blank image stating “photo not available” may be a mask when the user's private data is a photograph, and a mask such as “###-###-####” may be used when the user's private data is a telephone number.
  • Such masks often indicate the type of data (e.g., a photograph, phone number, etc.) that is being withheld from the recipient without actually divulging the data.
  • the user's private data being transmitted (e.g., a phone number, photograph, etc.) is stored in transmitted private data store 460 .
  • the process updates the private data logs associated with each of the authorized recipients of the user's private data along with the private data conveyed and the mask pattern used to conceal the private data from unauthorized recipients.
  • the social network maintains recipients' private data logs 480 , which include a data log for each user that has received private data from another user of the social network.
  • recipients' private data logs are maintained by the social network and are inaccessible to the individual users in order to prevent any user from tampering with the privacy control data or from viewing other users' private data without authorization.
  • four separate data logs are shown corresponding to User A, User B, User C, and User N (data stores 481 , 482 , 483 , and 485 , respectively).
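
The posting-side bookkeeping in FIG. 4 (roughly steps 450 through 470) can be sketched as follows; the helper name, the shape of the `detected` list, and the dictionary-based stores are assumptions made for illustration, standing in for the transmitted private data store ( 460 ) and the recipients' private data logs ( 480 ).

```python
def record_outgoing_private_data(owner, recipients, detected,
                                 transmitted_store, recipient_logs):
    """Posting-side bookkeeping sketch.

    `detected` is a list of (private value, mask pattern) pairs produced by the
    sensitive content analyzer (step 420). Each value is recorded with its mask
    and authorized recipients (transmitted private data store 460), and each
    recipient's private data log (480) is updated so that later reposts by that
    recipient can be checked.
    """
    for value, mask in detected:
        transmitted_store.setdefault(owner, []).append(
            {"value": value, "mask": mask, "authorized": set(recipients)})
        for recipient in recipients:
            recipient_logs.setdefault(recipient, []).append(
                {"owner": owner, "value": value, "mask": mask})

# Example usage with in-memory dictionaries standing in for the data stores:
transmitted, logs = {}, {}
record_outgoing_private_data("user_a", ["user_b"],
                             [("555-123-4567", "###-###-####")], transmitted, logs)
```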
  • FIG. 5 is a continuation of FIG. 4 , with the emphasis of FIG. 5 being to protect the private data belonging to other users.
  • FIG. 5 is a continuation of the flowchart shown in FIG. 4 and shows continued scanning steps taken to protect private data during the posting of content to a social network.
  • the user's post (message contents 410 ) is scanned during a monitoring process to identify whether the user's post contains private data that this user received from other users.
  • private data that this user received (as a recipient, or “second user”) from other users is retrieved from private data from others data store 515 which is updated when the user previously received private data (as the recipient, or “second user”) from other users (as the original poster, or “first user”).
  • A decision is made as to whether other users' private data has been detected in the post that the user is currently attempting to send (decision 525 , with this post then being a second post of the first user's private data that the current (second) user may be attempting to send to a third user). If no private data belonging to other users is detected in message contents 410 , then decision 525 branches to the “no” branch whereupon, at step 530 , the post is transmitted without further privacy control processing taking place and processing ends at 535 .
  • decision 525 branches to the “yes” branch for further processing aimed at protecting the other users' private data.
  • the first recipient of the post is selected, and at step 545 , the first private data detected in the post is also selected.
  • the privacy process checks the private data from others data store 480 in order to determine whether the selected recipient is authorized to view the selected private data (e.g., the selected recipient is the original poster (owner) of the private data or the selected recipient either already previously received this private data from the owner or has been authorized to view the private data by the owner of the private data, etc.).
  • the result of predefined process 570 is masked posts 575 which are posts with private data masked (not visible) to particular recipients. In the example shown, masked posts are created for Recipients A, B, and N (posts 576 , 577 , and 578 , respectively).
  • decision 560 branches to the “yes” branch bypassing predefined process 570 so that the selected recipient will be able to view the private data in the post.
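
The repost-side check described for FIG. 5 amounts to producing, for each recipient, a version of the post in which any private data the recipient is not authorized to view is replaced by its mask. A hedged sketch, with assumed data shapes, follows.

```python
def build_masked_posts(post_text, detected_from_others, recipients, recipient_logs):
    """Produce one version of the post per recipient (roughly predefined process 570).

    `detected_from_others` lists other users' private data found in the post,
    as dicts with "owner", "value", and "mask" keys (assumed shape). A value is
    left visible only if the recipient owns it or the value already appears in
    the recipient's private data log.
    """
    masked_posts = {}
    for recipient in recipients:
        allowed = {entry["value"] for entry in recipient_logs.get(recipient, [])}
        text = post_text
        for item in detected_from_others:
            if recipient != item["owner"] and item["value"] not in allowed:
                text = text.replace(item["value"], item["mask"])
        masked_posts[recipient] = text  # masked posts 575 (e.g., posts 576-578)
    return masked_posts
```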
  • FIG. 6 is a flowchart showing steps taken to counteract a suspected privacy breach when such breach is detected. Processing commences at 600 whereupon, at step 610 , the detected private data is masked using a generated mask from data store 345 and the masked message is stored in memory area 570 . A decision is made as to whether the user (the “second user”) has requested (e.g., in the user's profile, etc.) to be warned when sending other users' private data to unauthorized recipients (“third users”) at decision 620 .
  • decision 620 branches to the “yes” branch whereupon, at step 625 , a notification is displayed to the current user (the “second user”) that notifies the user that the post (the “second post”) includes private data belonging to another user (the “first user”, owner of the private data).
  • the user is provided an opportunity to edit the contents of the post.
  • a decision is made as to whether the user edited the contents of the post (decision 630 ). If the user edited the contents of the post, then the entire post is re-processed at predefined process 640 (see FIG. 4 and corresponding text for processing details) and processing ends at 650 .
  • a private data request is prepared and stored in memory area 670 .
  • the private data request includes the requestor (the current user, also known as the “second user”), the intended recipient (the “third user”) and the private data owned by the “first user” that was included in the post (and subsequently masked so that it currently is not visible to the recipient (“third user”)).
  • the private data request ( 670 ) is transmitted to the “first user” who is the owner of the private data.
  • the first user will receive the private data request and decide whether to allow the third user authorization to view the first user's private data. If authorization is provided, then the private data included in the post from the second user to the third user will be unmasked so that the third user is able to view the first user's private data that was included in the post by the second user.
  • Private data owner 680 performs predefined process 685 to handle the private data request (see FIG. 7 and corresponding text for processing details). Processing then returns to the calling routine (see FIG. 5 ) at 695 .
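
The private data request assembled in FIG. 6 (memory area 670 ) essentially bundles the requestor, the intended recipient, the owner, and the masked private data. A minimal sketch of building that request is shown below; the field names are assumptions.

```python
def prepare_private_data_request(second_user, third_user, owner, private_value, mask):
    """Build the release request stored in memory area 670 (field names assumed).

    The private data has already been masked for the unauthorized recipient;
    this request asks the owner (the first user) to authorize the third user.
    """
    return {
        "requestor": second_user,   # the user attempting to repost the data
        "recipient": third_user,    # the user who would see the data if approved
        "owner": owner,             # the first user, who owns the private data
        "private_data": private_value,
        "mask": mask,
    }
```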
  • FIG. 7 is a flowchart showing steps taken to handle a private data release request received by an owner of the private data. Processing commences at 700 whereupon, at step 710 , the owner (the “first user”) receives private data request 670 which is requesting authorization for a “third user” to be able to view the first user's private data that was previously authorized for viewing by a “second user” that subsequently attempted to transmit the first user's private data to the third user.
  • the owner the “first user” receives private data request 670 which is requesting authorization for a “third user” to be able to view the first user's private data that was previously authorized for viewing by a “second user” that subsequently attempted to transmit the first user's private data to the third user.
  • private data request 670 includes an identifier of the requestor (e.g., the “second user” that is trying to disseminate the first user's private data), an identifier of the recipient (e.g., the “third user” that is receiving the post from the second user and wishes to view the first user's private data), and the first user's private data that is the subject of the request.
  • the owner of the private data (the “first user”) is prompted as to whether he/she wishes to approve the private data request.
  • a decision is made as to whether the owner (first user) has approved the release of the first user's private data to the recipient (third user) at decision 730 . If the owner (first user) approved the release, then decision 730 branches to the “yes” branch to perform approval processing.
  • Steps 740 , 750 , and 770 are performed as part of approval processing.
  • the owner's private data and mask pattern are stored in transmitted private data store 460 along with the recipient (third user) that is being authorized to view the first user's private data.
  • the private data log of the recipient (the third user) is updated in data store 480 , thus indicating the authorization of the third user to view the first user's private data.
  • the owner (the first user) sends a notification to the recipient (the third user) that authorization has been granted for the third user to view the first user's private data. Now, when the recipient (third user) views the post received from the second user, the private data owned by the first user will be unmasked (visible) to the third user.
  • decision 730 if the owner of the private data does not approve the release of the private data to the recipient (the third user), then decision 730 branches to the “no” branch bypassing steps 740 , 750 , and 770 . Owner's processing thereafter ends at 795 .
  • Recipient 760 receives the authorization to view the first user's private data with the update to the recipient's private data from others data store 780 reflecting the private data owned by other users that this recipient (third user) is allowed to view.
  • the recipient views posts with private data being masked or unmasked based upon whether authorization has been granted to view such private data (see FIG. 8 and corresponding text for processing details).
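
Owner-side approval handling in FIG. 7 (steps 740 , 750 , and 770 ) can be sketched as updating the transmitted private data store and the recipient's private data log when the owner approves, and doing nothing otherwise. The following is an assumed, simplified rendering of that flow.

```python
def handle_release_request(request, approved, transmitted_store, recipient_logs):
    """Owner-side release handling (assumed rendering of steps 740, 750, and 770).

    If the owner approves, the recipient is recorded as authorized in the
    transmitted private data store and the recipient's private data log is
    updated; if not, both stores are left unchanged (decision 730, "no" branch).
    Returns True when a grant notification should be sent to the recipient.
    """
    if not approved:
        return False
    owner, recipient = request["owner"], request["recipient"]
    transmitted_store.setdefault(owner, []).append(
        {"value": request["private_data"], "mask": request["mask"],
         "authorized": {recipient}})
    recipient_logs.setdefault(recipient, []).append(
        {"owner": owner, "value": request["private_data"], "mask": request["mask"]})
    return True
```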
  • FIG. 8 is a flowchart showing processing by a recipient (a third user) of a content post to a social network to selectively show other users' private data based on privacy control settings.
  • Recipient (third user) viewing processing commences at 800 whereupon, at step 810 , the recipient receives a post for viewing.
  • these posts can include unmasked posts 815 that do not contain other users' private data that this recipient is not authorized to view as well as masked posts 575 which are posts with other users' private data masked (not visible) to the recipient.
  • the first masked area in the post is selected with this selected masked area corresponding to the first occurrence of another user's private data in the post.
  • the viewing process checks this recipient's private data from others data store 480 in order to identify whether the owner of this private data (the first user) has authorized the recipient to view the first user's private data. A decision is made as to whether the recipient is authorized to view the private data corresponding to the selected mask (decision 850 ).
  • decision 850 branches to the “yes” branch whereupon, at step 860 , the process unmasks the selected area by inserting the actual contents of the owner's private data from data store 480 thereby making the owner's private data visible to the recipient and the post is updated in memory area 870 .
  • decision 850 branches to the “no” branch bypassing step 860 .
  • A decision is made as to whether there are additional masked areas in the post corresponding to additional references of private data owned by other users (decision 880 ). If there are additional masked areas to process, then decision 880 branches to the “yes” branch which loops back to select and process the next masked area as described above. This looping continues until all of the masked areas in the post have been processed, at which point decision 880 branches to the “no” branch whereupon, at step 890 , the recipient views the masked post 870 with all masks corresponding to private data that has been approved for viewing by the recipient filled in (visible) and those masked areas that have not been authorized masked out so that such unauthorized private data continues to be masked (not visible) to the recipient. Processing thereafter ends at 895 .
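
Recipient-side viewing in FIG. 8 reduces to walking the masked areas of a post and unmasking only those whose values appear in the recipient's private data log. A short sketch under those assumptions, with a usage example, is given below.

```python
def render_post_for_recipient(masked_text, recipient, recipient_logs):
    """Unmask only the areas the recipient is authorized to view (assumed logic).

    Each entry in the recipient's private data log supplies a mask and the
    underlying value; authorized masks are replaced with the real value and
    all other masks are left in place.
    """
    for entry in recipient_logs.get(recipient, []):
        masked_text = masked_text.replace(entry["mask"], entry["value"], 1)
    return masked_text

# Example: the recipient was granted access to a phone number that had been masked.
logs = {"user_c": [{"owner": "user_a", "value": "555-123-4567", "mask": "###-###-####"}]}
print(render_post_for_recipient("Reach Alice at ###-###-####", "user_c", logs))
# Reach Alice at 555-123-4567
```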
  • One of the preferred implementations of the invention is a client application, namely, a set of instructions (program code) or other functional descriptive material in a code module that may, for example, be resident in the random access memory of the computer.
  • the set of instructions may be stored in another computer memory, for example, in a hard disk drive, or in a removable memory such as an optical disk (for eventual use in a CD ROM) or floppy disk (for eventual use in a floppy disk drive).
  • the present invention may be implemented as a computer program product for use in a computer.
  • Functional descriptive material is information that imparts functionality to a machine.
  • Functional descriptive material includes, but is not limited to, computer programs, instructions, rules, facts, definitions of computable functions, objects, and data structures.

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Security & Cryptography (AREA)
  • Computer Hardware Design (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Bioethics (AREA)
  • General Health & Medical Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Signal Processing (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Computing Systems (AREA)
  • Medical Informatics (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Software Systems (AREA)
  • Databases & Information Systems (AREA)
  • Information Transfer Between Computers (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)
  • Storage Device Security (AREA)
US13/535,920 2012-06-28 2012-06-28 Privacy Control in a Social Network Abandoned US20140007255A1 (en)

Priority Applications (5)

Application Number Priority Date Filing Date Title
US13/535,920 US20140007255A1 (en) 2012-06-28 2012-06-28 Privacy Control in a Social Network
US13/746,213 US8955153B2 (en) 2012-06-28 2013-01-21 Privacy control in a social network
PCT/IB2013/055272 WO2014002041A2 (en) 2012-06-28 2013-06-27 Privacy control in a social network
EP13810012.8A EP2867812A4 (de) 2012-06-28 2013-06-27 Steuerung der privatsphäre in einem sozialen netzwerk
CN201380027237.2A CN104350505B (zh) 2012-06-28 2013-06-27 用于社交网络中的隐私控制的方法和系统

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
US13/535,920 US20140007255A1 (en) 2012-06-28 2012-06-28 Privacy Control in a Social Network

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US13/746,213 Continuation US8955153B2 (en) 2012-06-28 2013-01-21 Privacy control in a social network

Publications (1)

Publication Number Publication Date
US20140007255A1 true US20140007255A1 (en) 2014-01-02

Family

ID=49779775

Family Applications (2)

Application Number Title Priority Date Filing Date
US13/535,920 Abandoned US20140007255A1 (en) 2012-06-28 2012-06-28 Privacy Control in a Social Network
US13/746,213 Expired - Fee Related US8955153B2 (en) 2012-06-28 2013-01-21 Privacy control in a social network

Family Applications After (1)

Application Number Title Priority Date Filing Date
US13/746,213 Expired - Fee Related US8955153B2 (en) 2012-06-28 2013-01-21 Privacy control in a social network

Country Status (4)

Country Link
US (2) US20140007255A1 (de)
EP (1) EP2867812A4 (de)
CN (1) CN104350505B (de)
WO (1) WO2014002041A2 (de)

Cited By (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120203765A1 (en) * 2011-02-04 2012-08-09 Microsoft Corporation Online catalog with integrated content
US20150220741A1 (en) * 2014-01-31 2015-08-06 International Business Machines Corporation Processing information based on policy information of a target user
US20150249628A1 (en) * 2014-02-28 2015-09-03 Linkedin Corporation Associating private annotations with public profiles
US20150350255A1 (en) * 2012-11-30 2015-12-03 Intel Corporation Verified Sensor Data Processing
US9311504B2 (en) 2014-06-23 2016-04-12 Ivo Welch Anti-identity-theft method and hardware database device
US9354782B2 (en) * 2013-05-15 2016-05-31 Alex Gorod Social exposure management system and method
US20170048174A1 (en) * 2015-08-10 2017-02-16 Facebook, Inc. Dynamic Communication Participant Identification
US9906484B2 (en) 2015-02-24 2018-02-27 International Business Machines Corporation Dynamic analytics controlled information dissemination in social media
GB2563620A (en) * 2017-06-20 2018-12-26 Inlinx Ltd Social network
US20200110895A1 (en) * 2018-10-03 2020-04-09 International Business Machines Corporation Social post management based on security considerations
US11194925B1 (en) * 2018-12-18 2021-12-07 NortonLifeLock Inc. User-based cyber risk communications using personalized notifications
US11861036B1 (en) * 2018-09-18 2024-01-02 United Services Automobile Association (Usaa) Systems and methods for managing private information

Families Citing this family (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101729633B1 (ko) * 2011-03-03 2017-04-24 삼성전자주식회사 통신 시스템에서 소셜 네트워크 서비스의 컨텐츠를 공유하기 위한 장치 및 방법
US10185776B2 (en) 2013-10-06 2019-01-22 Shocase, Inc. System and method for dynamically controlled rankings and social network privacy settings
WO2015130875A1 (en) * 2014-02-27 2015-09-03 Keyless Systems Ltd. Improved data entry systems
CN106339396B (zh) * 2015-07-10 2019-08-13 上海诺基亚贝尔股份有限公司 用于对用户生成的内容进行隐私风险评估的方法和设备
US9792457B2 (en) 2015-09-14 2017-10-17 Facebook, Inc. Systems and methods for trigger-based modification of privacy settings associated with posts
US20180176173A1 (en) * 2016-12-15 2018-06-21 Google Inc. Detecting extraneous social media messages
CN107742083B (zh) * 2017-10-31 2019-10-25 华中科技大学 一种面向大规模图数据发布的隐私保护方法及系统
US10902527B2 (en) * 2017-11-15 2021-01-26 International Business Machines Corporation Collaborative multiuser publishing of social media posts
CN109274582B (zh) * 2018-09-20 2021-12-10 腾讯科技(武汉)有限公司 即时通讯消息的展示方法、装置、设备及存储介质
US20220198035A1 (en) * 2019-02-25 2022-06-23 Mark Aleksandrovich NECHAEV Method for controlling the confidentiality of communications with users
US11868503B2 (en) 2020-11-24 2024-01-09 International Business Machines Corporation Recommending post modifications to reduce sensitive data exposure

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050055379A1 (en) * 2003-09-09 2005-03-10 Hitachi, Ltd. Information processing apparatus, method of processing information and server
US20110113084A1 (en) * 2008-08-19 2011-05-12 Manoj Ramnani Automatic profile update in a mobile device
US20120159649A1 (en) * 2007-11-15 2012-06-21 Target Brands, Inc. Sensitive Information Handling on a Collaboration System
US20120303659A1 (en) * 2011-05-24 2012-11-29 Avaya Inc. Social media identity discovery and mapping
US20130006882A1 (en) * 2011-06-20 2013-01-03 Giulio Galliani Promotion via social currency
US20130290716A1 (en) * 2012-04-30 2013-10-31 Anchorfree, Inc. System and method for securing user information on social networks

Family Cites Families (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7685238B2 (en) * 2005-12-12 2010-03-23 Nokia Corporation Privacy protection on application sharing and data projector connectivity
WO2008052068A2 (en) * 2006-10-24 2008-05-02 Careflash, Llc A system and method for secure, anonymous, and pertinent reposting of private blog posting, etc.
WO2008112805A1 (en) * 2007-03-12 2008-09-18 Crackle, Inc. System and method for making a content item, resident or accessible on one resource, available through another
US20080282324A1 (en) * 2007-05-10 2008-11-13 Mary Kay Hoal Secure Social Networking System with Anti-Predator Monitoring
US20090214034A1 (en) * 2008-02-26 2009-08-27 Rohit Mehrotra Systems and methods for enabling electronic messaging with recipient-specific content
US20090319623A1 (en) * 2008-06-24 2009-12-24 Oracle International Corporation Recipient-dependent presentation of electronic messages
US8832201B2 (en) * 2008-08-18 2014-09-09 International Business Machines Corporation Method, system and program product for providing selective enhanced privacy and control features to one or more portions of an electronic message
US20100132049A1 (en) * 2008-11-26 2010-05-27 Facebook, Inc. Leveraging a social graph from a social network for social context in other systems
US8209266B2 (en) * 2009-02-25 2012-06-26 Research In Motion Limited System and method for blocking objectionable communications in a social network
US20100250643A1 (en) 2009-03-26 2010-09-30 Microsoft Corporation Platform for Societal Networking
US20100318571A1 (en) 2009-06-16 2010-12-16 Leah Pearlman Selective Content Accessibility in a Social Network
US20110046981A1 (en) * 2009-07-06 2011-02-24 Onerecovery, Inc. Goals and progress tracking for recovery based social networking
US8782144B2 (en) * 2009-07-29 2014-07-15 Cisco Technology, Inc. Controlling the distribution of messages
US8875219B2 (en) 2009-07-30 2014-10-28 Blackberry Limited Apparatus and method for controlled sharing of personal information
US20110307695A1 (en) * 2010-06-14 2011-12-15 Salesforce.Com, Inc. Methods and systems for providing a secure online feed in a multi-tenant database environment
US20120265827A9 (en) 2010-10-20 2012-10-18 Sony Ericsson Mobile Communications Ab Portable electronic device and method and social network and method for sharing content information
CN102185787A (zh) 2011-01-04 2011-09-14 北京开心人信息技术有限公司 一种基于隐私保护的交互方法与系统
CN102185826A (zh) 2011-01-28 2011-09-14 北京开心人信息技术有限公司 一种保护用户隐私的方法与系统

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050055379A1 (en) * 2003-09-09 2005-03-10 Hitachi, Ltd. Information processing apparatus, method of processing information and server
US20120159649A1 (en) * 2007-11-15 2012-06-21 Target Brands, Inc. Sensitive Information Handling on a Collaboration System
US20110113084A1 (en) * 2008-08-19 2011-05-12 Manoj Ramnani Automatic profile update in a mobile device
US20120303659A1 (en) * 2011-05-24 2012-11-29 Avaya Inc. Social media identity discovery and mapping
US20130006882A1 (en) * 2011-06-20 2013-01-03 Giulio Galliani Promotion via social currency
US20130290716A1 (en) * 2012-04-30 2013-10-31 Anchorfree, Inc. System and method for securing user information on social networks

Cited By (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120203765A1 (en) * 2011-02-04 2012-08-09 Microsoft Corporation Online catalog with integrated content
US10104122B2 (en) * 2012-11-30 2018-10-16 Intel Corporation Verified sensor data processing
US20150350255A1 (en) * 2012-11-30 2015-12-03 Intel Corporation Verified Sensor Data Processing
US9354782B2 (en) * 2013-05-15 2016-05-31 Alex Gorod Social exposure management system and method
US9866590B2 (en) * 2014-01-31 2018-01-09 International Business Machines Corporation Processing information based on policy information of a target user
US20150220741A1 (en) * 2014-01-31 2015-08-06 International Business Machines Corporation Processing information based on policy information of a target user
US10009377B2 (en) * 2014-01-31 2018-06-26 International Business Machines Corporation Processing information based on policy information of a target user
US20150288723A1 (en) * 2014-01-31 2015-10-08 International Business Machines Corporation Processing information based on policy information of a target user
US20150249628A1 (en) * 2014-02-28 2015-09-03 Linkedin Corporation Associating private annotations with public profiles
US9722959B2 (en) * 2014-02-28 2017-08-01 Linkedin Corporation Associating private annotations with public profiles
US9311504B2 (en) 2014-06-23 2016-04-12 Ivo Welch Anti-identity-theft method and hardware database device
US9906484B2 (en) 2015-02-24 2018-02-27 International Business Machines Corporation Dynamic analytics controlled information dissemination in social media
US20170048174A1 (en) * 2015-08-10 2017-02-16 Facebook, Inc. Dynamic Communication Participant Identification
US10439970B2 (en) * 2015-08-10 2019-10-08 Facebook, Inc. Dynamic communication participant identification
GB2563620A (en) * 2017-06-20 2018-12-26 Inlinx Ltd Social network
US11861036B1 (en) * 2018-09-18 2024-01-02 United Services Automobile Association (Usaa) Systems and methods for managing private information
US20200110895A1 (en) * 2018-10-03 2020-04-09 International Business Machines Corporation Social post management based on security considerations
US11194925B1 (en) * 2018-12-18 2021-12-07 NortonLifeLock Inc. User-based cyber risk communications using personalized notifications

Also Published As

Publication number Publication date
EP2867812A2 (de) 2015-05-06
EP2867812A4 (de) 2015-07-08
US8955153B2 (en) 2015-02-10
CN104350505B (zh) 2016-12-14
WO2014002041A2 (en) 2014-01-03
CN104350505A (zh) 2015-02-11
WO2014002041A3 (en) 2014-02-20
US20140007249A1 (en) 2014-01-02

Similar Documents

Publication Publication Date Title
US8955153B2 (en) Privacy control in a social network
US8856945B2 (en) Dynamic security question compromise checking based on incoming social network postings
US20140256288A1 (en) On-Screen Notification Privacy and Confidentiality in Personal Devices
US9111181B2 (en) Detecting and flagging likely confidential content in photographs to prevent automated dissemination
KR101373986B1 (ko) 모델을 사용하여 실행가능 프로그램을 조사하는 방법 및 장치
US9692776B2 (en) Systems and methods for evaluating content provided to users via user interfaces
US20140149322A1 (en) Protecting Contents in a Content Management System by Automatically Determining the Content Security Level
US9485606B1 (en) Systems and methods for detecting near field communication risks
US10410304B2 (en) Provisioning in digital asset management
JP7471321B2 (ja) 機密データ管理
CN110489994B (zh) 核电站的文件权限管理方法、装置及终端设备
US20140007206A1 (en) Notification of Security Question Compromise Level based on Social Network Interactions
CN110008740B (zh) 一种文档访问权限的处理方法、装置、介质和电子设备
CN105095758B (zh) 锁屏应用程序处理方法、装置以及移动终端
US20170180129A1 (en) Password Re-Usage Identification Based on Input Method Editor Analysis
US8447857B2 (en) Transforming HTTP requests into web services trust messages for security processing
US10382528B2 (en) Disposition actions in digital asset management based on trigger events
CN109150790B (zh) Web页面爬虫识别方法和装置
US9245132B1 (en) Systems and methods for data loss prevention
US20160034717A1 (en) Filtering Transferred Media Content
US20120173572A1 (en) Concurrent Long Spanning Edit Sessions using Change Lists with Explicit Assumptions
US10979443B2 (en) Automatic traffic classification of web applications and services based on dynamic analysis
US20140342661A1 (en) Social Network Based Wi-Fi Connectivity
US9106766B2 (en) Phone call management
US11640479B1 (en) Mitigating website privacy issues by automatically identifying cookie sharing risks in a cookie ecosystem

Legal Events

Date Code Title Description
AS Assignment

Owner name: INTERNATIONAL BUSINESS MACHINES CORPORATION, NEW YORK

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:ALTAF, FAHEEM;CLAY, STEVEN DUANE;SPRING, EDUARDO N.;AND OTHERS;SIGNING DATES FROM 20120626 TO 20120628;REEL/FRAME:028460/0696

STCB Information on status: application discontinuation

Free format text: EXPRESSLY ABANDONED -- DURING EXAMINATION