CA2316743A1 - Managed database privacy system and method - Google Patents

Managed database privacy system and method

Info

Publication number
CA2316743A1
Authority
CA
Canada
Prior art keywords
data
privacy
engine
database
user
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
CA002316743A
Other languages
French (fr)
Inventor
Austin Hill
Hooman Katirai
George Favvas
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
ZERO-KNOWLEDGE SYSTEMS Inc
Original Assignee
ZERO-KNOWLEDGE SYSTEMS Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by ZERO-KNOWLEDGE SYSTEMS Inc filed Critical ZERO-KNOWLEDGE SYSTEMS Inc
Priority to CA002316743A priority Critical patent/CA2316743A1/en
Publication of CA2316743A1 publication Critical patent/CA2316743A1/en
Abandoned legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q30/00Commerce
    • G06Q30/02Marketing; Price estimation or determination; Fundraising
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/20Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
    • G06F16/21Design, administration or maintenance of databases
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/60Protecting data
    • G06F21/62Protecting access to data via a platform, e.g. using keys or access control rules
    • G06F21/6218Protecting access to data via a platform, e.g. using keys or access control rules to a system of files or objects, e.g. local or distributed file system or database
    • G06F21/6227Protecting access to data via a platform, e.g. using keys or access control rules to a system of files or objects, e.g. local or distributed file system or database where protection concerns the structure of data, e.g. records, types, queries
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/60Protecting data
    • G06F21/62Protecting access to data via a platform, e.g. using keys or access control rules
    • G06F21/6218Protecting access to data via a platform, e.g. using keys or access control rules to a system of files or objects, e.g. local or distributed file system or database
    • G06F21/6245Protecting personal data, e.g. for financial or medical purposes

Description

MANAGED DATABASE PRIVACY SYSTEM AND METHOD
New Canadian Patent Application Applicant: Zero Knowledge Systems Inc.
Filed: August 28, 2000 Our File: 2600726/0003 FASKEN MARTINEAU DUMOULIN LLP
PATENT AND TRADEMARK AGENTS

Managed Database Privacy System and Method

Contents

INTRODUCTION
OVERVIEW OF INVENTION
DE-IDENTIFICATION LAYER
    DB ANALYSIS TOOL
    PRIVACY RESOLUTION TOOL
    DE-LINKING ENGINE
    FULL TEXT ANALYSIS TOOL
PRIVACY ENHANCING LAYER
    DATA MINIMIZATION ENGINE
    ENCRYPTION ENGINE
DATA ACCESS LAYER
    IDENTITY VERIFICATION ENGINE
    BLINDED TWO-WAY COMMUNICATION ENGINE
    SECURE PROFILE ACCESS ENGINE
    THIRD PARTY DATA ACCESS ENGINE
ATTACHMENTS
Introduction

As business processes and an increasing number of our customer and partner interactions have been computerized, there has been a large increase in the amount of data about our customers, partners, and potential customers available to us as businesses. With the decreasing cost of capturing, storing, and archiving this data, businesses now find themselves awash in data about every aspect of their markets, customers, and business transaction history.

In conjunction with the growing availability of large amounts of data that is easy to capture, analyze, and process, entire new technologies and industries have been created to help businesses derive value and intelligence from this data (OLAP, ROLAP, eCRM, personalization engines, targeting engines, collaborative filtering engines, data mining). Because businesses have had an economic and business requirement to derive value from this data, privacy was rarely a design consideration in these systems.

The fair information practices that define and lay the groundwork for building privacy into such systems have existed since the mid-1970s, but in the absence of any legal requirement, and given the added cost of building and implementing these systems, investment in such privacy systems was largely ignored. Privacy concerns were certainly not a primary consideration when these systems were being developed, and the data mining business practices that were predominant were not only completely legal but were also considered good business.

This situation has changed dramatically over the last five years. Business processes and data acquisition and mining techniques that were completely legitimate and legal are now despised by consumers and in some cases have been made illegal. Companies are finding that investments made in personalization engines, eCRM, data mining, and business intelligence are suddenly at risk because of growing and increasingly serious privacy issues. Companies face huge potential costs in trying to retrofit their business processes to meet new and constantly changing legislation emerging from many different state, federal, and international jurisdictions. Previous systems and investments now face the prospect of being starved of previously copious data, or of being abandoned altogether.

While some companies have adopted parts of the fair information practices in an attempt to address their customers' privacy concerns, the growing focus on privacy is creating a changing landscape that forces companies to constantly redo and re-address their privacy stances.

We believe that the days when privacy policies and lobbyists could answer a company's privacy issues are coming to an end. Privacy is becoming a critical strategic issue facing every company's computerized business automation initiatives. This strategic issue can either be a huge strategic advantage or a company's Achilles heel.

By investing in a managed privacy service, companies build a business foundation that allows them to react, change, and evolve with their customers' needs with regard to privacy. We employ new privacy engines that automate and accelerate the implementation of the core fair information practices. These engines can be customized to meet specific business and customer interactions; self-regulatory industry privacy practices; regulatory privacy requirements; international privacy laws and standards; as well as new, privacy-friendly ways of delivering increased customer-focused services. Companies that are focused on their customers' demands and needs will find themselves able to enhance those relationships, adjust and react to new data handling practices, and turn privacy into a powerful strategic advantage.
Step 1. Data De-identification Stage

The first step in our managed privacy layer is to separate and segment personally identifiable information (PII) out of the incoming data streams. Locating, separating, and cleansing a data stream of any PII can be a very challenging process. Traditional search-and-replace techniques, including field- and record-level searches, have been found to locate only 30% to 40% of PII in most data streams. The proprietary algorithms employed in our De-Identification Engine have been found to locate 98%-99% of all PII. At the de-identification stage the following steps are taken to segment PII from the rest of the data, and to specifically exclude the collection and storage of any data that policy kits define as required exclusionary data:

• Locating and segmenting PII from the data stream
• Applying exclusionary data filters for any data required not to be stored
• Organizing, labeling, and verifying privacy conditions associated with PII
• Handling opt-in vs. opt-out preferences
• Applying geographic and special policy-based handling requirements
• Reviewing and improving data accuracy and data quality
• Assigning a cryptographically private and secured identifier controlling the linkage of PII and de-identified data (a sketch of one such identifier follows this list)
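The document does not specify how this linkage identifier is constructed. As a minimal sketch, assuming a keyed hash (HMAC) over a record key with a secret held by the privacy system, it might look like this in Python:

    import hashlib
    import hmac

    # Hypothetical secret held by the privacy system; key generation,
    # rotation, and storage are not specified in this document.
    LINKAGE_KEY = b"example-secret-linkage-key"

    def linkage_id(record_key: str) -> str:
        """Derive a pseudonymous identifier that links a PII record to its
        de-identified counterpart without exposing the PII itself."""
        return hmac.new(LINKAGE_KEY, record_key.encode("utf-8"),
                        hashlib.sha256).hexdigest()

    # The PII store and the de-identified store share only this opaque
    # token; without LINKAGE_KEY it cannot be traced back to the person.
    print(linkage_id("customer:john.smith@example.com"))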
Step 2. PII Encryption and Security Engine

The PII separated in Step 1 of our privacy layer is passed to the Encryption & Security Engine. A key element of all privacy systems is the protection and security of customers' private information, so at this stage we apply encryption and security expertise and toolkits to help secure the storage of PII inside a company's database. These security mechanisms secure PII from the following threats:

• Accidental disclosure and leaking of data to external sources (e.g. websites)
• Hackers, attacks against websites, and theft of data
• Internal misuse or theft of data by errant employees or contractors

The PII Encryption and Security Engine is designed to automate some of the following technologies and techniques to ensure that only the most secure and trusted mechanisms are used to protect customers' data:

• Hash, secure one-way hash, and seeded passphrase security
• 3DES, DESX, Blowfish, Diffie-Hellman, and RSA based encryption algorithms
• Key splitting, shared secrets, and computationally controlled disclosure
• Blinded credentials and minimal disclosure
• Secure or anonymous transport layers (SSL, anonymous IP)

We support and interoperate with the security features in Oracle8, Sybase, and NCR, as well as most standard database security mechanisms. (Maybe mention integration and support for CA software and directory servers/role-based security systems.)
Step 3. Data Sieve Engine

Processing de-identified data in a privacy-enhanced manner allows businesses to leverage incoming and legacy data while still respecting privacy standards. We link with eCRM, personalization engines, statistical analysis, business intelligence, fraud detection, OLAP, and data mining applications, and are able to provide those applications with the data they require to operate. We apply a series of data processing techniques to ensure that relevant data is obtained while still providing customers with strong privacy protections. These techniques include:

• Data minimization and data reduction
• Interest vector mapping
• Preference matching
• Data labeling

Overview of Invention

The key features of our invention are:
• Privacy infrastructure solutions to enterprise customers
• A professional services-based sales and delivery model
• Striking a balance between individual customer needs and the need to create reusable components
• Industry-specific toolkits which customize those components
• The need to build in service components wherever possible such that ongoing revenue can be generated from a given customer
• Auditability built into each component

More specifically, our invention includes:
• Privacy risk analysis methodologies and tools:
    • A methodology for assigning a value to the privacy risk associated with a given set of data fields. This value is based on the probability that a given set of data elements can be correlated with other existing databases in order to uniquely identify an individual.
    • Tools for automating the process of analyzing a given database, computing the above risk value, and performing "what if" scenarios to determine the effect of various data manipulation actions on the risk value.
• Technological methods for reducing the privacy risks in business practices by:
    • Aggregating, minimizing, and vectoring data so that less granular information is stored.
    • Encrypting private data such that one or more parties must agree in order to decrypt it.
    • Hashing data so that it can still be used without revealing identities.
• Controlled data access technologies:
    • The concept of a data sanitization facility, where data is chained through multiple parties, each of which plays a role in decrypting and otherwise manipulating data without seeing the whole picture.
    • Methods by which a party can communicate with another, by physical or electronic means, without knowing the person's identity or contact information.
    • Methods by which a user's identifying information is separated from profile or other information, and a digital credential is issued which allows them to re-link the two together.
    • Enabling a user to control, view, and manage their personal information profile across multiple sites, without those sites being able to link their data records to an identifiable individual.
• Other:
    • An automated means of creating, based on industry-, legislative- and customer-specific needs, business rules which feed the above software components.
The software components of the solution fall into three broad layers:

• De-identification layer
• Privacy enhancing layer
• Data access layer

Below, we describe the following for each software component where applicable:

• Business need
• Technical solution
• Business model
• Further links

De-identification layer

The de-identification layer provides means by which data, or groupings of data, which can be used to identify an individual is exposed and assigned a risk factor. If the risk factor exceeds the threshold for a given situation, various scenarios can be modeled with the goal of obtaining a satisfactory resolution.
DB analysis tool

Need

While the presence of some types of fields can definitively allow linkage to an individual's identity, the ability to link a given data set to a unique individual is not necessarily binary. For example, a 9-digit zip code and date of birth together have a high probability of yielding someone's identity, whereas a 9-digit zip code and only a year of birth yield a lower probability.

A tool is needed which, for a given database structure, will assign a risk factor to fields or field combinations.
Solution

The DB analysis tool is the first step in our managed privacy layer. Using proprietary mathematical algorithms and industry-specific knowledge, it examines an existing legacy database and identifies fields or groups of fields that constitute PII. Based on toolkits developed in cooperation with industry-specific experts, the privacy engineer can map fields in the customer's database to known datatypes.

Then, the DB analysis tool associates a quantitative number, called a privacy risk factor (PRF), with each individual field or group of fields. The PRF is a number between zero and one that indicates the probability that a given field or combination of fields can uniquely identify an individual. A PRF of zero indicates no privacy risk, while a PRF of one indicates a high privacy risk. Depending on the particular customer or industry, different risk thresholds may be set.

Example

Suppose we have a pizza delivery database with the following three fields: name, postal code, and telephone number. The output from the DB analysis tool might look like this:

Fields                                   Privacy Risk Factor (PRF)
Name                                     0.91
Postal Code                              0.03
Telephone Number                         0.50
{Name, Postal Code}                      0.98
{Name, Telephone Number}                 0.97
{Postal Code, Telephone Number}          0.99
{Name, Postal Code, Telephone Number}    1.00

The Expected Bin Size (EBS) refers to the average number of people that can be found using these fields as search criteria.
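The document defers the precise EBS definition to a glossary not reproduced here. As a minimal sketch, assuming the conventional reading (the expected size of the group of records that share a randomly chosen record's values) and the PRF = 1/EBS relationship stated in the Privacy Resolution Tool section below, the computation might look like this:

    from collections import Counter
    from itertools import combinations

    # Toy stand-in for the pizza delivery database.
    records = [
        {"name": "John Smith", "postal": "L4B 3H7", "phone": "505-555-1244"},
        {"name": "Jane Doe",   "postal": "L4B 3H7", "phone": "505-555-9876"},
        {"name": "John Smith", "postal": "M5V 2T6", "phone": "416-555-1111"},
    ]

    def expected_bin_size(records, fields):
        """Expected number of records sharing a random record's values for
        `fields` (a bin of size n is hit with probability n/N)."""
        bins = Counter(tuple(r[f] for f in fields) for r in records)
        return sum(n * n for n in bins.values()) / len(records)

    def privacy_risk_factor(records, fields):
        """PRF modeled as 1/EBS: unique field combinations score 1.0."""
        return 1.0 / expected_bin_size(records, fields)

    for k in (1, 2, 3):
        for fields in combinations(("name", "postal", "phone"), k):
            print(fields, round(privacy_risk_factor(records, fields), 2))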
Further Links

• Link to people/organizations with domain-specific expertise to help build the rule sets we need for various industries.
• Link to database vendors to facilitate better integration so that we can automatically import database structures into the tool.
• Link to third party application vendors/systems by partnering with companies that have created standardized databases for specific verticals, so that we can automatically map their fields to the ones in our DB analysis tool.
Privacy Resolution Tool

Need

Used to create a privacy policy to mitigate the PII identified by the DB analysis tool.

Solution

This tool is used by a privacy engineer to create a data de-identification policy to address problems found in the previous step. Creating this policy is a complex process where each decision affects subsequent decisions. Using proprietary algorithms, this tool helps the privacy engineer leverage the maximum amount of business information from the database while also satisfying privacy concerns.

The database is said to be "free" of PII when the Correlation Risk Factor (CRF) for every field or combination of fields is below some given threshold. In our example we have defined the CRF as the inverse of the Expected Bin Size (EBS), a factor which is defined in the glossary. Supposing our minimum satisfactory threshold is 0.2, we would continue for several iterations until we create a privacy policy that would certify our database to be "free" of PII. We illustrate each iteration for the pizza delivery example introduced in the DB analysis tool section.
Iteration 1:

FIELDS PRESENTED TO USER                   PRF Before   User Action   PRF After
Name                                       0.91         1-Way Hash    --
Postal Code                                0.03                       0.03
Telephone Number                           0.50                       0.50
{Name, Postal Code}                        0.98                       0.03
{Name, Telephone Number}                   0.97                       0.50
{Postal Code, Telephone Number}            0.99                       0.99
{Name, Postal Code, Telephone Number}      1.00                       1.00

Note: H[X] denotes the hash of field X.

Iteration 2:

FIELDS PRESENTED TO USER                   PRF Before   User Action   PRF After
H[Name]                                    --                         --
Postal Code                                0.03                       0.03
Telephone Number                           0.50         1-Way Hash    --
{H[Name], Postal Code}                     0.98                       0.03
{H[Name], Telephone Number}                0.97                       --
{Postal Code, Telephone Number}            0.99                       0.03
{H[Name], Postal Code, Telephone Number}   1.00                       0.03

Note: H[X] denotes the hash of field X.
Since all fields have a PRF below 0.2, we do not proceed with further iterations. The final privacy policy of the database is:

Name → H[Name]
Postal Code → (leave intact)
Telephone Number → H[Telephone Number]

De-Linking Engine

Need

Implements the privacy policy created by the privacy resolution tool.

Solution

This tool implements the privacy policy defined by the privacy engineer using the privacy resolution tool. Unlike the privacy resolution tool, which only creates a policy, this tool actually makes changes to the database. It calls upon the Encryption, Minimization, Aggregation, and Interest Vectoring engines as required by the privacy policy. For example, it may triply encrypt an email address for use with the blinded communication system described later in this document.
Example

The following illustrates the effect of the de-linking engine on a record in the pizza delivery database. For demonstrative purposes we have added a "Date of Birth" field to the database.

Field Name         Contents Before    Operation    Contents After
Name               "John Smith"       1-Way Hash   12sh1#d'ASD;
Telephone Number   "505-555-1244"     1-Way Hash   72dsfi32233
Postal Code        "L4B 3H7"          Do Nothing   "L4B 3H7"
Date of Birth      "12/21/1971"       Minimize     1971
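A minimal sketch of such a policy-driven de-linking pass, assuming SHA-256 as the one-way hash (the document does not name a hash function) and representing the policy as a field-to-operation map:

    import hashlib

    def one_way_hash(value: str) -> str:
        """Irreversible transformation; a keyed or salted hash would be
        preferable in practice to resist dictionary attacks (an assumption,
        not something this document specifies)."""
        return hashlib.sha256(value.encode("utf-8")).hexdigest()[:16]

    # Privacy policy produced by the resolution tool: field -> operation.
    POLICY = {
        "name": one_way_hash,
        "telephone": one_way_hash,
        "postal_code": lambda v: v,         # do nothing
        "date_of_birth": lambda v: v[-4:],  # minimize MM/DD/YYYY to year
    }

    def delink(record: dict) -> dict:
        """Apply the de-identification policy to one database record."""
        return {field: POLICY[field](value) for field, value in record.items()}

    print(delink({"name": "John Smith", "telephone": "505-555-1244",
                  "postal_code": "L4B 3H7", "date_of_birth": "12/21/1971"}))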
Full text analysis tool

Need

To protect companies from privacy concerns when sharing unrestricted text that is not stored as a record in a database.

Solution

This system removes PII by locating and replacing personally-identifying information in unrestricted text documents, using techniques that extend beyond simple search-and-replace procedures. This minimizes risk and maintains confidentiality when files such as doctors' notes need to be shared with third parties who do not require the subject's identity.

The system employs pattern recognition techniques, including detection algorithms that use templates and specialized knowledge of what constitutes a name, address, phone number, and so forth, to automatically detect and remove PII. It must also be noted that the success of this system is domain-dependent, and that preliminary investigation must be conducted before its successful delivery can be promised to a client.
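For flavor, here is a deliberately simplistic sketch of the replacement step using regular expressions; the template- and knowledge-based detection described above would go well beyond patterns like these:

    import re

    # Simplistic pattern set; names and street addresses, which the engine
    # above handles with templates and domain knowledge, are not covered.
    PATTERNS = {
        "PHONE": re.compile(r"\b\d{3}[-.\s]\d{3}[-.\s]\d{4}\b"),
        "EMAIL": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
    }

    def scrub(text: str) -> str:
        """Replace recognizable PII with typed placeholders."""
        for label, pattern in PATTERNS.items():
            text = pattern.sub(f"[{label}]", text)
        return text

    note = "Patient reachable at 505-555-1244 or jsmith@example.com."
    print(scrub(note))  # Patient reachable at [PHONE] or [EMAIL].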

Privacy enhancing layer

The various engines which make up this layer transform data into a form which represents a lower privacy risk. They can be run either in batch mode or on the fly as new datasets are being created.
Data minimization engine

Need

To maintain important information in a DB field while keeping the user anonymous.

Solution

This engine is used to remove unneeded information from the fields of a record by converting the fields to a more general or less specific form. For example, a market researcher may employ minimization to convert a date of birth into a year of birth. Industry-specific minimization routines could allow a full blood analysis to be reduced to a simple blood type.
Sample data - before minimization

Date of birth   Zip code   Income    Car
3/15/1973       90210      $90,000   Lexus
7/2/1968        84070      $40,000   none
11/12/1975      10115      $65,000   Pathfinder

Sample data - after minimization

Year of birth   State   Income               Car category
1973            CA      $75,000 - $100,000   Luxury
1968            UT      $35,000 - $60,000    none
1975            NY      $60,000 - $75,000    SUV
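A minimal sketch of these generalizations, with an illustrative income bracketing and a tiny zip-to-state table standing in for the industry-specific modules mentioned above:

    # Illustrative rules only; real deployments would ship full
    # zip-to-state tables and industry-specific bracket definitions.
    INCOME_BRACKETS = [(35_000, 60_000), (60_000, 75_000), (75_000, 100_000)]
    ZIP_TO_STATE = {"90210": "CA", "84070": "UT", "10115": "NY"}

    def bracket(income: int) -> str:
        for lo, hi in INCOME_BRACKETS:
            if lo <= income < hi:
                return f"${lo:,} - ${hi:,}"
        return "unknown"

    def minimize(record: dict) -> dict:
        """Generalize each field: full DOB -> year, zip -> state,
        exact income -> bracket."""
        return {
            "year_of_birth": int(record["date_of_birth"].split("/")[-1]),
            "state": ZIP_TO_STATE.get(record["zip_code"], "unknown"),
            "income": bracket(record["income"]),
        }

    print(minimize({"date_of_birth": "3/15/1973",
                    "zip_code": "90210", "income": 90_000}))
    # {'year_of_birth': 1973, 'state': 'CA', 'income': '$75,000 - $100,000'}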

Interest vectoring engine

Need

Allows transactional records to be mined for one specific individual without knowledge of the actual transactions.

Solution

This engine uses industry-specific modules to convert a series of items into a set of perceived user interests. For example, a clickstream could be converted into a vector representing a user's perceived interest in sports, entertainment, and news, based on the frequency with which the user visits web sites in those categories.
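A minimal sketch, assuming a hypothetical site-to-category map of the kind an industry-specific module would supply:

    from collections import Counter

    # Hypothetical category map; an industry module would supply this.
    SITE_CATEGORIES = {
        "espn.com": "sports", "nba.com": "sports",
        "variety.com": "entertainment", "cnn.com": "news",
    }

    def interest_vector(clickstream):
        """Convert visited sites into normalized perceived-interest weights;
        the output reveals interests, not the underlying transactions."""
        counts = Counter(SITE_CATEGORIES[site] for site in clickstream
                         if site in SITE_CATEGORIES)
        total = sum(counts.values()) or 1
        return {category: n / total for category, n in counts.items()}

    print(interest_vector(["espn.com", "nba.com", "cnn.com", "espn.com"]))
    # {'sports': 0.75, 'news': 0.25}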
Data aggregation engine

Need

Allows aggregate data to be gleaned from records when the raw data from the records isn't needed.

Solution

This engine converts a set of records to aggregate statistics and measures based on those records. Industry-specific modules allow specialized aggregation functions to be computed relevant to a given industry.
Sample data - before data aggregation

Patient name     Age   Sex   Disease
John Doe         65    M     Parkinson's
Peter Smith      41    M     Cancer
Erica Peterson   19    F     Depression
Jane Doe         27    F     Cancer
Mark Rogers      32    M     Depression

Sample data - after data aggregation

Age       18-30 = 2    31-50 = 2    51+ = 1
Sex       M = 3        F = 2
Disease   Parkinson's = 1    Cancer = 2    Depression = 2
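A minimal sketch of this conversion, with the age banding as an assumed example of an industry-specific aggregation function:

    from collections import Counter

    patients = [
        {"age": 65, "sex": "M", "disease": "Parkinson's"},
        {"age": 41, "sex": "M", "disease": "Cancer"},
        {"age": 19, "sex": "F", "disease": "Depression"},
        {"age": 27, "sex": "F", "disease": "Cancer"},
        {"age": 32, "sex": "M", "disease": "Depression"},
    ]

    def age_band(age):
        if age <= 30:
            return "18-30"
        if age <= 50:
            return "31-50"
        return "51+"

    # Only counts survive; names and raw records are discarded.
    aggregates = {
        "age": Counter(age_band(p["age"]) for p in patients),
        "sex": Counter(p["sex"] for p in patients),
        "disease": Counter(p["disease"] for p in patients),
    }
    print(aggregates)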
Further Links

• Link to application vendors to make sure that our interest vectors are compatible with those in use by market-leading applications (i.e. if most online marketers use the same application to categorize/target individuals, we want to make sure that our output conforms to that format).

Encryption engine

Need

Used when an identifier is suitable in place of an identifier linked to personal identity, or when access to information needs to be restricted and only released with the consent of several parties.
Solution

The encryption engine can perform a number of actions:

i) One-way hash functions. These allow information to be converted, in an irreversible transformation, from a human-readable form to a unique identifier. For example, the names of users in a marketing database can be encoded using a 1-way hash function, thereby transforming each name into a unique code. This allows marketers to profile databases without knowing the names of the people in the database.

ii) Two-way encryption. This process is used whenever sensitive information needs to be converted to a different form. Encryption is a reversible process, so it is used only if the actual information may be needed at a later time. For example, a marketer could encrypt the email addresses of people in their database before sharing user profiles with third parties, to ensure that the third parties do not email their customers without consent.
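Both actions in one minimal sketch; the hash uses SHA-256 and the reversible step uses the third-party cryptography package's Fernet construction, both library choices being our assumptions rather than anything this document prescribes:

    import hashlib
    from cryptography.fernet import Fernet  # pip install cryptography

    # i) One-way hash: irreversible pseudonym for profiling without names.
    def pseudonym(name: str) -> str:
        return hashlib.sha256(name.encode("utf-8")).hexdigest()

    # ii) Two-way encryption: reversible, for data that may be needed later.
    key = Fernet.generate_key()           # held only by the data owner
    fernet = Fernet(key)
    token = fernet.encrypt(b"john.smith@example.com")

    print(pseudonym("John Smith"))        # shareable unique code
    print(fernet.decrypt(token))          # recoverable only with the key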
Further Links

• Link to the systems of PKI solution providers.

Data access layer

This is the layer where previously encrypted data is decrypted in a controlled fashion in order to unlock its value. This is where we believe the greatest opportunity exists for our invention.
Identity verification engine

Need

A means of verifying the identity of an individual so that they can subsequently authenticate their identity to the holder of their data. Such verification is needed in cases where the data collection occurred either offline or without user consent.

Solution

The identity verification engine uses known contact or personal data to verify the identity of the user. Depending on the particular customer requirements and the level of verification certainty required, various scenarios are possible:

• E-mail: A validation token is sent to a known e-mail address belonging to the user.
• Telephone: The user is called at a given phone number on file and is given a validation token.
• Snail mail: A validation token is physically mailed to a known address on file.
• Third party database checks: The user is challenged by being asked to supply personal information which is contained in offline databases such as credit reports. The queries should be such that it would be difficult for a person other than the person associated with the data to possess the information. This process can occur either online or offline.

Regardless of the verification method used, this one-time process results in a unique credential being issued to the user. This credential is what they subsequently use to authenticate in order to view their own data.

This credential may take one or more of the following forms:

• A PIN
• A username/password combination
• An X.509 digital certificate downloaded to the user's browser
• A Brands or other type of credential stored locally within a Freedom client

Business model

We could charge:

• a one-time verification fee commensurate with the verification method used and degree of security desired;
• ongoing service fees based upon the number of validated users.
Blinded two-way communication engine

Need

The company needs a means of communicating with a consumer whose data record they hold, without knowing that consumer's identity.

Solution

Any PII that could be used to contact an individual (name, e-mail, address, phone) is multiply encrypted using keys belonging to different entities. When a company wishes to contact a specific individual, they forward the text of the message to be sent, along with the encrypted contact info, to a sanitization facility. There, the blinded data passes through a chain of servers belonging to the customer, us, and the audit partner. The chain of servers serves to distribute trust such that no one entity can link the user's identity with other information in the database.
For example, suppose a marketing company wishes to send targeted ads to individuals. The individual's e-mail address is encrypted first with a key belonging to an audit partner, then with our key, then with the customer's key.

When the customer wishes to send e-mail to that individual, they send the message and the encrypted e-mail address to the sanitization facility, where:
1. The company:
   a. encrypts the message with the audit partner's public key
   b. decrypts one layer of encryption on the individual's e-mail address
   c. forwards both of the above to our server
2. Our company:
   a. does not touch the message contents
   b. decrypts one layer of encryption on the individual's e-mail address
   c. forwards both to the audit partner
3. The audit partner:
   a. decrypts the final layer of encryption to reveal the individual's e-mail address
   b. decrypts the contents of the message
   c. forwards the message to the individual in question, on behalf of the customer

The following table describes which of the three parties has access to which data:
                Sees personal data   Sees profile data   Sees message contents
Customer        No                   Yes                 Yes
Us              No                   No                  No
Audit partner   Yes                  No                  Yes

Variations on the above scenario include:
• using the Freedom network to route e-mail where a greater degree of privacy, and perhaps a lesser degree of auditability, is required;
• using a similar process for the blinded addressing of physical mail.
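A minimal sketch of the layered encryption behind this flow; the document describes public-key layers, so the symmetric Fernet keys below (one per party) are a simplifying assumption to keep the example short:

    from cryptography.fernet import Fernet  # pip install cryptography

    # One key per party in the chain.
    audit_key, our_key, customer_key = (Fernet.generate_key() for _ in range(3))

    def blind(email: str) -> bytes:
        """Encrypt innermost-first: audit partner, then us, then customer."""
        blob = email.encode("utf-8")
        for key in (audit_key, our_key, customer_key):
            blob = Fernet(key).encrypt(blob)
        return blob

    def sanitization_chain(blob: bytes) -> str:
        """Each hop strips exactly one layer; only the audit partner, at the
        end of the chain, ever sees the plaintext address."""
        for key in (customer_key, our_key, audit_key):
            blob = Fernet(key).decrypt(blob)
        return blob.decode("utf-8")

    print(sanitization_chain(blind("john.smith@example.com")))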
By reversing the process, the user can privately reply to the company. This is accomplished by making the from-address in the e-mail {encrypted e-mail address}@private-mail-gateway.pwc.com. If each party reverses the process, the user can securely send messages back. If we did not do this, the company could easily discover users' real e-mail addresses by sending them a note and asking them to reply. Further, this solves our problem of how to handle opting out: we no longer need to know the user's real address to opt them in or out.

Further Links

• Link to SMTP servers to allow for outbound delivery of e-mail.
• Link to snail mail fulfillment organizations to allow for the sending of mass amounts of snail mail.
Business model

We could charge:

• hosting and bandwidth fees for the servers that are part of this process;
• transaction fees based on the volume of e-mails sent through the system;
• licensing fees based on the number of users in the database.
Secure profile access engine

Need

A means is required by which individuals can access their own profile data.

Solution

A "personal portal," powered by our secure profile access engine, lives in the sanitization facility. When a user authenticates, his or her personal data is decrypted by a chain of several servers and presented to the user. The personal portal could be co-branded (customer + us).
Business model

This is perhaps the most interesting business opportunity over the long term. With a critical mass of users updating their profile data at a facility that is essentially controlled and operated by us, we are:

• reinforcing our brand as the protector of personal information;
• placing ourselves in a position where we are the intermediary between the user and all their privacy-sensitive interactions online, and can therefore try to hook up these users with other corporate customers.

We could make money:
• by charging an ongoing per-user fee to manage this process;
• via customer acquisition fees and/or revenue sharing arrangements when we refer a user to a new partner within our network.
Third party data access engine

Similar to the above, but we provide a means whereby authorized entities can decrypt predetermined sets of data under specific circumstances.
The data access engine, which also resides within the sanitization facility, brings together the multiple keys required in order to decrypt data.
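A minimal sketch of one way multiple keys can be made jointly necessary, using XOR-based key splitting as a simple stand-in for the key-splitting and shared-secret techniques listed earlier; the party names are illustrative:

    import os

    def split_key(key: bytes, parties: int) -> list:
        """Split a key into shares such that every share is required to
        rebuild it, so no single party can unlock the data alone."""
        shares = [os.urandom(len(key)) for _ in range(parties - 1)]
        last = key
        for share in shares:
            last = bytes(a ^ b for a, b in zip(last, share))
        return shares + [last]

    def join_key(shares) -> bytes:
        """XOR all shares back together to recover the original key."""
        key = shares[0]
        for share in shares[1:]:
            key = bytes(a ^ b for a, b in zip(key, share))
        return key

    key = os.urandom(32)
    customer, us, audit = split_key(key, 3)
    assert join_key([customer, us, audit]) == key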
Business model

Since we will hold at least one of the encryption keys, we are in a position to force companies to come to us any time they want to unlock data.
The business potential here is limited only by the value that the company places on the particular data they need to access or manipulate.

Attachments

1. Case Study
2. Technical Solution Overview

. _ O

O

G~

U

c~3 t~ ~ '~c.~

. 4~

~ o .

. U
t~

O

~

..~i N 4~ , . .
.-O

ct3 4 .'~ .~ O
~~

.

I I

0 ~

h .
4,~
~_ ct~l 4~ O v~ v~ .O
O
W ~ ~ N
tr . ~' v ,_, i~ ~ ~ -r-~ LIB _CC3 '""
'cj .~' . '-, ~ ~ j ~ ~ cc3 w ~ ~ w i .~
C~

V ~ U

V U

N '""' U

.
r..a U

4~

+-' O

U

O

W ' ' o W ~

''"'"

C~ ~

,.., ,~

~. ~ ~ ~ ~ ~ ~ y_ CD O
c O ~ Cø!~' O 'O\.
<C
O ~. ~ O. O
. ~ '"~ ~n O
O
a' O
n --,s "' '~C '' p' ('1~G ~ ~C
O
tD ,r .
O
V

~1 CD ~ CD
O O
~- CD ~ ~D
O r-r O ~ "-~-' n ~ ~ ~h ,.~. '' ~7 ~,, . ~ f.-. ,,..r c'p ~ . ~ O ~-.r O N
O
cu ~ ''~ p ~ ~ D
O ~ O
Cp O O ~-,- O C.
,..r .
r'~ ~ O "'~~ ~ ~ Cp V
.
CD
.. '""i r-.~ CD
O
CD ~- n '~C
CD
U' O
fD
'~ ~' CD
O
~,-h O
O ~ ~ ~ .~ ~,.~ ~ ~ 0 Cls ~ C'~D r~' BCD r-~ ' ~!
O
n ~ n CD ~ ~ ~ ~ ~ '~"
D
cn ~ ~ ~ ~ ~, ~ ~ ~ ~1 ° ~ ~ r.~r, W ~ ~ Q ~1 a4 CD . CD ,~ ~ ~ ~
r-r ~°-h rig C'~D
r~ ~''' cP rr~ ~ D
CD ~ r-r. ~ . ~!
' ~ ~ O
O cP
'~ ~' r+ ~ ~ ~. "'fir ' CD ~ ~ '~ O
CD r~ O '"~~" ~ ~
O ~h C . ~.' O
CO
C_D n ~ r~~., . ~-. ~C "'~ C~
..
C

~ ~ ~
o ~ C~ r'' ~_ ~ H
CD C~ls C_D n -v O
;_~ ~ cr~ v, ' ~~~v~ o ~, ° ~-n ~ . ~~ ~ ~ f...

O
c~
b~~ ~:. ~ ~ ~ ~.
o ~ o ~4 cry CD
CD
,..r .

tv ~ ~ lD
. ~ ~ x ~ O c' ~ tD
. D

fP ~ ' ~

.' r ~

'< i ~ i z ~-~

~
' r o :

~- ~

_ ~_ ~ ~_ _._ ,M$~~

d ~~

a= .~ ~~

~, ~~ _ .,~

~, -n o ,~, t.~t' r c;; ; u; ~ 'a . t~ J tl..p, ..~ a c ~ ~ < o c~ O
~
~

co ~ ~ o ~~. ~ c II
oc o o .. ~ c~ a.
~c II ~ ~ ~
..
.

z ~
o ac ~ ~7 .~
a~ . ,.< 0 0 c ~ ~ ~
If ~,-t C'~~ ~ ~ ~ o CD
~

- ~ ~ I ~crj - ~ UaO
.~s.
~

_ I f ~ I
"*:
p ~

,.~ ~' ;~a:
II ' N ' n.~.'-~
z j n ~ O x t V~ ~r U~ 'C
~
o y O

a' O

cn O

~

i Sv a4 ~~-s C O p cD

~D
I
I

II a n d n ~h ca If o a4 Q a C~
p n ~ ~~ O
~~ ~ O
,...~ . ~ . ~.' ~ ,~, CD CD v~ ~ c"D
~. ~ ~.; c~
-' cc ~ ,... . 'C N~
c o ,,.~~ d n o ~ ~i .wSS, ~ o' ~ ~ ~c c~
o ~ ~'' ~, o cv o ''h c~
o c~
V

O 4~ r~ ~; "~ ~.
CD ~ ~-r ~ ~ ~ r+
r+ ~ C
'"t~ ~ . ~ ~ ~..' c"D ~ c7 ~ ~ c D D
,...r c'D ~ . r-, C ~-cr~~~~.~v~0 v, ~ cn ~.. ~ , '"'~ ' ~C ~ ~ ~ ~ ~.
~ ~ ~ CD
Q
''D~'c~
O ~ ~ . C"~ ~ CD v~'~ N
CD r-r. "~ O ~. ~
rr~ ~ ~ ~.. r~ O ~. CD
CAD v~ C . CD C~'D C~'D
~L ~~-r ~., O
CD ~ ~ ~ ~ CAD
~ cP
o ~- ~ ~ o 0 o cv ~ . r-+
c~ o ~ o r,. ~ ~ o "~ o o °
o ~ ~. c~ c~ o c ca C
t~'Q ~p ~ CD
,.-, . ~ ~--f.- _ (r~q +i C"~D ~ ' r-~- CD
~.

_, c l~-I
Ongoing pri~racy and technology audits ___ n ~D C' . ~-r ~_ ~' . O ~
CD N ~ (~. r..r . ~' f..r ' ~ Q
0 0 - _.' ~ ~ f~ ~~ rn '~
a~ 00 0 ~ ~ ~ . ~ ~ CD
'"'r (~q ~ ~. ~ . y c o ~ ~ ~ o t"
0~ 0 ~ CD

a o '~ ~h Cr' CD fp o: m ~' ~ Q
v'.
~ . O CD ,,.0 r"f' ~ ~ ~ ~~
o c .~ ~ ~1. O cn °, ran c~ O cn V
c ~,~ ~ ~ ~ (D
rt ''r ' 3 r ~p ~ ~ (' o v~. ~ _~
I~ v V
e--t- _~
~ ~ Q
,~ . ~., C. 'r. O
' ~ o ""'' V

~a .

a ~.

CD CD ~p ~ ~ ~ ~ ~ 1 o a ~:

o ~ ,~ ~ ~. , w ~

~ r~

~...,.~~

~ ~ ~' ~ i ~

o o , a c ~ ~ ~ ~ ~ ~ ~s cv o .

o cry ~

~ ~' o ~ a h v C_D Cp ~r ~ ~

~

( CD ~t '~

,....
-' . O ~
~' ~

n d' r~ ,~.

N CD

o ~-.,' c .

c ~
r~ rte, _~ o 0 c°
n n ~ ~ c~ ~' ~ ~ ~ CD
m rn o c~ '"~
c,~..~'D p . . ~ ' o c'D O
~. ~D' cro ~--~s ""d ,~. ~'; ~ r+
~,r v_o CD
cr, ~-~ ~ l5, ~ CAD ~ ~ ~ 0 .-, .
"~ ~-~. V~ C~ O
'p_ ~ CD-.
N
r-+
~a 70 ~ 5 , c'D
~ ~ ..
~r-cn. ~ "-~'~h Cr O
o ~-~ o t"~ N
o~ o iv.~
n~ o w -rt ~ ~-.~ CD
CD
~ !Cp C1q CD

~1 Value to Business C

z ~ o s ~ C

. ........ , ........ ..

. ' '-~
' C ~ ~
~

m y , r-~r ~ ~ r ,Z
~

~
~ cC! spa, pip ~
~ ~

t_:
~

~= N, ~
v 3, ~, '. A c~D rn~ ~c~~
~ ~

z~,.~
'~ o ~~ ~o n ~ ~, z-N
:~ ~ ~~' ;

~ m ~1 I~I ~ ~ ~ ~ ~

-~1 1 n 1- ~
~

D -h, ~ ~ ~1 ~ " ..., o ~ Q ,~ ~ ~ ~

n ~ o~ 3 c~ ~ o D -~- ~ ~' v~

...

w ~ ~

C~ ~, r-r-~

o C' D

o ~ CD

t7 v, O

p tfo p D xs ~ D ~

o ~ ~n ou m o ~

n cn ~

a. m _, n o o o o ~~ o ~D

o ~ ~

o n ~ ~ rn c~ o ~

'"

~t tp ~ ~ ~ ~ o ~D D

~' ~ ~ ~ c . o ~ ~

a4 n ,. ..

a ~ a ~ c o N '~ ~ r'+

OU ~ ~ tG -4~(~ ~. C~ CD

d ~ _ ~J
II .

v ~. a _ "'~ ~.-' -n ." 3 ~ tn h N ~

~'' II N
N

n a ~ ~

...
.a ~ ~ ~ ~. cro + ~
II

~ ~ ~ ~ ~

o.

I.

c -P W _N
v v '~.N., ~ ~ ('~ ~
I~ ~ l~D O~0 0~0 ~ ~"~ ~ E"'~
d ~-fit- ~ v~, D O
cp ~ .. ~ "'~ D
o ~ e~-r- ~ _~ W
'""T' ~',~
,C ~ o ~ ~ ~ ~-+ CD
~p v~ ~ n n ~ ~ ~ C7 ~ ~- ~ o ~fl..
n O
"~. '~r'~' Sz° o r-~- h-~'D't -'' 4 ~
UG -c3 ~ ~
.. ~ c~ o ~ " .. G1 O
o °ccn° ~ ~ v H7 H'h y cr n o c~, y ~: ~D
V w ono v ui tn o 0 0 0 0 _ ~.~
CD

t ,.... ~
. .

r-~ e-+

~ ' ~. ~

a C7 r-r "t V~ ,.i fv ~

e-f s-c C ~ ~
. ' .

~ 7 C-~ -I- r-~
e-~

CD ~r f..
n .

n iv ' ~ ''""S
' ~ ~ ~
.

N

V ~ V

V

~i .

i I
1-~~-r O ~ e-r ~ ~ ~ " ~ ~ h-r ' N ~ ~- ~ ~ ~ r"'r' O c~ ~ ~ ~ ~ , cv O ~ ~ ~ ~ ~ n n c~ ('D ~ ~ ~ CD
c~
o ~ ~ ~ Q c~
_~
O
c,~r~~ ~~'p~,~n rn CD O n n fD
O ~ CD ""~ ~ . CD
C~"~D r"~
. ~ ~ ~.
,..,r .
p o ~ O
,,~~ ~ ~ o ~ ~o CAD ~ . c~r~
c~
V

C

0 0 o ~a ~

_.

r 0 0 ~ o' ~ ~ c .

cr, a c ~ ~ c~

...,.

c ~.

.

~..

,..r .

V

r-~
0 ~ O
C~ o ~: ~ ~ ~ p o ~ ~ y p V
~v . C% ''~ CD .~''.,, CI"~
,"d ~ ,!~ .
W
o ~ ~' c ~_ N

O ,..,r .
~, cn c~
,r . CD
O
'~ c~
n V ~
'.~i .
~.

~ ~
~ G I I I ~ O ~ ~'' '~ C~ id ~"~ ~ O ~ '"'~ c n ~ ~ o ~ ,~ . ,.~.
"t~ ~ CD ~'D ~' 1~:
~ ~ r.~ n ~ n ~ ~ ~ O
',~ ~. ~ ~ ~ ~ ~ O
n c~ ~ "'r_~ ~ '~. O
O
Ci~ ~ O ~ '~ O
a,~ ~ ~ ~ ~ rr~ O O
~ .""~ ~ h~~~ n ~ ~ O W
t~D ~ ~q O cv N ~ ~ ~...,.
b~ ~' n ~ ~ v~ ~' ~- O
O O ~? t~ C'~D, v; ~ ~ ~ CD ''r .
r~ ~ r~ O ~'~',, ,"~ ~"~-'p (rQ
w S~ r+ N :..r i~ r-~'~ ~4 CD
C7 ~ CD
cn i~ ~", ca' ~'u "'~ ~-.
O
"~ N
W..i ~ h.r .
~-r-O
CD

x " ~ Jr ~ d 1~ f .,' 3 ~ f a,~ n s ~8'- A
7~ ~~,~Y
1 ~a~ 5~&~. °~ , i .. , A. b 3 ' k ~ p ~ d S ~,,~ ~ ~,'' ~.i $e ~"~. I ~ e~
~ws: ~ ~Y
~ '~; ~s'~, x ,., r '~ '.~ ,t':
~ ~k" , fir. ~ .~;.. t'~
a ~~C~, a r :a f ~ c ~ ' ~ ~..,~~"t,,r ': '.
v .
~& 1 ".a~
>;:
i ~ ~~ ~
'_ x ~s ,~r~& s3-"_.
~ ~ ~ _/
n N ~ ""~ ~ ''~
0 ~ ~;, ~ ~ ~ p D~ ~'~~CD ~nj CD iD ~, O ''"~' CAD v' CD
y ~ ~ ~D v~ ~_.
r~~
c~ ~
w ~ ~~
cu ~ ~s v Q", ~ c~ ~ cr~ x c~ ~. ~ ro c~ v O '"
'-r' ~q ~-~- v~
~,. ~-. c'D

d "~ N o i ~ ~ ~ i ~. ~ , d ~ ~ . . ~.~ o ~ ~ o ~ ~ ~ ~. ~ ~ o ~ o p o ,.-~., o ~-. o ~ ~ ~ v~ ~ O C'D
O ~ "~ . ~ n ri ~ ~ . ~ O ""~'~h C ' CD ~ c~ ~ ' ~ ~ ' U~? ~ , Cr4 .~~.
vs x ~ ~ "'r . N ~ CD
a O ~ "-t c~ ~ ~C ~ CD ,r .
V
CD CD ,,~~ ~ ~ C/~
a ~ _C~
n ~ "~ CD ~ ~
c O
w ° ~ ~ ~
cP

d 1 I I I ~
v~ ~ ~ ~' o C~ c~ ~ ~' ~ "'~ o . CD CI4 d'Q ~,~"~ C~'D ~ r.~.~ ('C'p ""~~ . "'~ ~..~ y ~. ,~.0 ~.. ~C CD ~ ~ CD
~'4 ~ d'4 "'°~ C~' ø_~
O
'"° cv O ~ ~ ~ n ~ ~ '~ ~ '""C~
n ~ O ~ ~. ~ ~ .m ~ ' CD

p ~ ,..r . ~ fT4 p, ,..., °""'~ v~ O ~ ~ cry O
c ~ N
O ~"D ~ cnN O
,~...~.~
,r . 'I
O O
O
c~°
c~--w- ~ ' ~.. c~
r~r~ O c'~~ Q+ ~ ~..
~ CI~~~.
~ o ,r . ~~., "

o I I ~' ~d ~ ~: a o ~ o '-~ ~ to c~ o' ---r ~ . ~ ~ a Cr4 , "'~ . C~ ~..,, .
cry ~ . ~ ~: ca c~ro o ~ o p, n ~ ~ O c ~ y cw _~
., ~~.. ~ ~ ""
O ~ C""';'D ('D rr~ ~
,~.~ rn .. ~. ~. o ~. ~ ~ o ~a cry ~ ~
' ~, o ~ o c~
.

Claims

CA002316743A 2000-08-28 2000-08-28 Managed database privacy system and method Abandoned CA2316743A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CA002316743A CA2316743A1 (en) 2000-08-28 2000-08-28 Managed database privacy system and method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CA002316743A CA2316743A1 (en) 2000-08-28 2000-08-28 Managed database privacy system and method

Publications (1)

Publication Number Publication Date
CA2316743A1 true CA2316743A1 (en) 2002-02-28

Family

ID=4166966

Family Applications (1)

Application Number Title Priority Date Filing Date
CA002316743A Abandoned CA2316743A1 (en) 2000-08-28 2000-08-28 Managed database privacy system and method

Country Status (1)

Country Link
CA (1) CA2316743A1 (en)


Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2009127771A1 (en) * 2008-04-16 2009-10-22 Nokia Corporation Privacy management of data
US8397304B2 (en) 2008-04-16 2013-03-12 Nokia Corporation Privacy management of data

Similar Documents

Publication Publication Date Title
US11805131B2 (en) Methods and systems for virtual file storage and encryption
US11934540B2 (en) System and method for multiparty secure computing platform
US11818251B2 (en) System and method for securely storing and sharing information
EP3298532B1 (en) Encryption and decryption system and method
US20180232526A1 (en) System and method for securely storing and sharing information
US20170111331A1 (en) Verifiable trust for data through wrapper composition
US20210406386A1 (en) System and method for multiparty secure computing platform
TWI532355B (en) Trustworthy extensible markup language for trustworthy computing and data services
US20030158960A1 (en) System and method for establishing a privacy communication path
HU231270B1 (en) Method and system for registration and data handling in an anonymous data share system
Hassan et al. The rise of cloud computing: data protection, privacy, and open research challenges—a systematic literature review (SLR)
US20090097769A1 (en) Systems and methods for securely processing form data
KR20050119133A (en) User identity privacy in authorization certificates
EP3469512A1 (en) Systems and methods for secure storage of user information in a user profile
JP4246112B2 (en) File security management system, authentication server, client device, program, and recording medium
WO2001090968A1 (en) A system and method for establishing a privacy communication path
JP5112153B2 (en) Approver selection method, system, apparatus, and program
CA2316743A1 (en) Managed database privacy system and method
Pavithra et al. Enhanced Secure Big Data in Distributed Mobile Cloud Computing Using Fuzzy Encryption Model
Koushikaa et al. A public key cryptography security system for big data
CN117436107A (en) Logistics information encryption method and system
EP4211586A1 (en) System and method for multiparty secure computing platform
TR2023010074A2 (en) WORKING METHOD OF SHARING PLATFORM THAT PROTECTS DATA PRIVACY
TWI430134B (en) File permissions control and confidentiality methods, program products and authority control and management system
HU231482B1 (en) Computer implemented method, system, program and data storage to provide service for personal data anonymisation

Legal Events

Date Code Title Description
FZDE Discontinued

Effective date: 20030612