US20170323004A1 - Block classified term - Google Patents

Block classified term

Info

Publication number
US20170323004A1
US20170323004A1 (application US15/524,122; also rendered US 2017/0323004 A1)
Authority
US
United States
Prior art keywords
term
user
class
rule
permission
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
US15/524,122
Other versions
US10902026B2 (en)
Inventor
Daniel Lau
Lewis Mackay
Daniel Timms
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Longsand Ltd
Original Assignee
Longsand Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Longsand Ltd
Assigned to Longsand Limited (assignment of assignors' interest; assignors: Daniel Lau, Lewis Mackay, Daniel Timms)
Publication of US20170323004A1
Application granted
Publication of US10902026B2
Legal status: Active
Expiration date: adjusted

Classifications

    • G06F 16/285: Clustering or classification (G Physics > G06 Computing; calculating or counting > G06F Electric digital data processing > G06F 16/00 Information retrieval; database structures > G06F 16/20 structured data, e.g. relational data > G06F 16/28 database models > G06F 16/284 relational databases)
    • G06F 16/90324: Query formulation using system suggestions (> G06F 16/90 details of database functions > G06F 16/903 querying > G06F 16/9032 query formulation)
    • G06F 16/951: Indexing; web crawling techniques (> G06F 16/95 retrieval from the web)
    • G06N 20/00: Machine learning
    • G06N 5/046: Forward inferencing; production systems (> G06N 5/00 knowledge-based models > G06N 5/04 inference or reasoning models)
    • G06N 7/01: Probabilistic graphical models, e.g. probabilistic networks (> G06N 7/00 specific mathematical models)
    • Legacy codes: G06F 17/30598; G06F 17/30864; G06N 7/005; G06N 99/005



Abstract

A class of a term from a database may be determined. The term may be blocked from being presented to a user, if the determined class does not include a permission for the user to view the term. The term may suggest a remainder of an incomplete query input by the user.

Description

    BACKGROUND
  • Devices or systems may provide a feature called autocomplete, or word completion. Autocomplete may involve the device or system predicting a word or phrase that the user wants to type in without the user actually typing it in completely. Manufacturers, vendors, and/or service providers are challenged to provide improved autocomplete technologies to better assist the user.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • The following detailed description references the drawings, wherein:
  • FIG. 1 is an example block diagram of a device to block a term from being presented to a user;
  • FIG. 2 is another example block diagram of a device to block a term from being presented to a user;
  • FIG. 3 is an example block diagram of a computing device including instructions for blocking a term based on a class of the term; and
  • FIG. 4 is an example flowchart of a method for blocking a term based on a class of the term.
  • DETAILED DESCRIPTION
  • Specific details are given in the following description to provide a thorough understanding of embodiments. However, it will be understood that embodiments may be practiced without these specific details. For example, systems may be shown in block diagrams in order not to obscure embodiments in unnecessary detail. In other instances, well-known processes, structures and techniques may be shown without unnecessary detail in order to avoid obscuring embodiments.
  • Auto-completion dialogues may provide a user with suggestions from fragments of input text. For example “capit” may be auto-completed to “capital” or “capitulate.” Auto-completion may be implemented through, for example, web browsers, e-mail programs, search engine interfaces, source code editors, database query tools, word processors, and command line interpreters.
  • Some implementations may use either a dictionary or a search engine. The search engine may only provide suggestions that correspond to relevant items indexed into the search engine, as opposed to a dictionary, whose entries may not be present in the indexed data. However, in some scenarios the indexed data may include sensitive information. For example, a search index of medical records could contain patient names or their social security numbers. Auto-completing sensitive information may be undesirable, whilst completing non-sensitive information is beneficial to the search operator.
  • Filtering data using only a weighting or popularity/threshold parameter (the number of documents containing a term) may not provide fine enough control to prevent leaking of sensitive information. Providing explicit blacklists for suggestions may filter out exact term matches, but manually providing and/or updating such a level of fine control may be cost-prohibitive, to the point where it is unlikely to be usefully applied.
  • Examples may use classification technology to filter auto-complete suggestions so that users are presented only with information they are permitted to see. An example device may determine a class of a term from a database. The device may block the term from being presented to a user, if the determined class does not include a permission for the user to view the term. The term may suggest a remainder of an incomplete query input by the user.
  • Thus, examples may allow for finer control over what elements are filtered compared to simple weight/threshold parameters. Further, examples may allow for faster deployment and less maintenance compared to a manually maintained blacklist or whitelist of exact terms/phrases/entries.
  • Referring now to the drawings, FIG. 1 is an example block diagram of a device 100 to block a term from being presented to a user. The device 100 may be a microprocessor, a controller, a memory module or device, a notebook computer, a desktop computer, an all-in-one system, a server, a network device, a wireless device, or any other type of device capable of interacting with a database and/or intercepting a message along a network.
  • The device 100 is shown to include a classification unit 110 and a filter unit 120. The classification and filter units 110 and 120 may include, for example, a hardware device including electronic circuitry for implementing the functionality described below, such as control logic and/or memory. In addition or as an alternative, the classification and filter units 110 and 120 may be implemented as a series of instructions encoded on a machine-readable storage medium and executable by a processor.
  • The classification unit 110 may determine a class 112 of a term from a database. The term may be a word or phrase used to describe a thing or to express a concept, such as a name, an address, a social security number, and the like. The term may suggest a remainder of an incomplete query input by the user. The class 112 may relate to a system for identifying various types of terms, such as confidential and non-confidential terms.
  • The filter unit 120 may block a term from being presented to a user, if the determined class 112 does not include a permission 122 for the user to view the term. For instance, the determined class 112 may indicate at least one of sensitive and personally identifiable information, if the determined class 112 does not include permission 122 for the user to view the term. The filter unit 120 may allow the term to be presented to the user, if the determined class 112 includes the permission 122 for the user to view the term.
  • The user may be any person who is entering a query, such as by using a computer or network service, for which the database may autocomplete with the term. The user may have a user account and/or be identified by a user name and/or password. The permission 122 may relate to whether the user has a right to view, access or modify the term. The permission 122 here may relate to whether the user may view the term suggested by the database in response to the user's query.
  • For instance, if the user does not have permission to view the term based on the class 112 of the term, the filter unit 120 may block the term by preventing the term from being sent to the user and/or denying access to the term. The determined class 112 may be stored and/or associated with the term at the database, the classification unit 110 and/or the filter unit 120, such as via metadata. The classification and filter units 110 and 120 are explained in greater detail below with respect to FIG. 2.
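  • For illustration only, the following minimal Python sketch (not part of the patent text) mirrors the classification-unit/filter-unit split just described; the class names, the permission table, and all function names are assumptions made for the example:

```python
# Illustrative sketch only -- not part of the patent disclosure.
# Class names and the PERMISSIONS policy table are assumptions.

def determine_class(term: str) -> str:
    """Classification unit 110: assign a class 112 to a term
    (placeholder logic: digits suggest sensitive content)."""
    return "sensitive" if any(ch.isdigit() for ch in term) else "non-confidential"

# Permission 122, keyed by user account (hypothetical policy).
PERMISSIONS = {
    "alice": {"non-confidential", "sensitive"},
    "guest": {"non-confidential"},
}

def filter_term(user: str, term: str) -> str | None:
    """Filter unit 120: return the term only if the user's permissions
    cover its determined class; otherwise block it (return None)."""
    cls = determine_class(term)
    return term if cls in PERMISSIONS.get(user, set()) else None

print(filter_term("guest", "capital"))      # 'capital' -- allowed
print(filter_term("guest", "123-45-6789"))  # None -- blocked
```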
  • FIG. 2 is another example block diagram of a device 200 to block a term from being presented to a user. The device 200 may be a microprocessor, a controller, a memory module or device, a notebook computer, a desktop computer, an all-in-one system, a server, a network device, a wireless device, or any other type of device capable of interacting with a database and/or intercepting a message along a network.
  • The device 200 is shown to interface with a database 230. The database 230 may be any electronic, magnetic, optical, or other physical storage device that contains or stores information, such as Random Access Memory (RAM), an Electrically Erasable Programmable Read-Only Memory (EEPROM), a storage drive, a Compact Disc Read Only Memory (CD-ROM), and the like. For instance, the database 230 may include the most popular search terms 232-1 to 232-n, where n is a natural number, indexed from a search engine. Further, at least some of the search terms 232-1 to 232-n may include personally identifiable information (PII), such as medical records, names, social security numbers and the like.
  • The device 200 of FIG. 2 may include at least the functionality and/or hardware of the device 100 of FIG. 1. For example, a classification unit 210 of the device 200 of FIG. 2 may include at least the functionality and/or hardware of the classification unit 110 of the device 100 of FIG. 1, and a filter unit 220 of the device 200 of FIG. 2 may include at least the functionality and/or hardware of the filter unit 120 of the device 100 of FIG. 1.
  • As noted above, the classification unit 210 may determine a class 212 of a term 232 from the database 230. The class 212 of the term 232 may vary with respect to the user 250. For example, the term 232 may be classified as confidential with respect to a first user but classified as non-confidential with respect to a second user. Thus, the classification unit 210 may take into account a type or identity of the user 250 when determining the class 212 of the term 232. Different types of the users 250 may correspond to different types of classes 212. For instance, the user's 250 account may be used to identify the type of user, such as when the user 250 logs into a system.
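  • A hypothetical sketch of such user-dependent classification follows; the name detector and the clinician/guest policy are invented for the example, not taken from the disclosure:

```python
import re

def looks_like_name(term: str) -> bool:
    """Crude stand-in for a real name detector (illustration only)."""
    return bool(re.fullmatch(r"[A-Z][a-z]+ [A-Z][a-z]+", term))

def classify_for_user(term: str, user_type: str) -> str:
    """Class 212 may differ per user 250: the same term can be
    confidential to one user type and non-confidential to another
    (assumed policy, for illustration)."""
    if looks_like_name(term):
        return "non-confidential" if user_type == "clinician" else "confidential"
    return "non-confidential"

print(classify_for_user("John Smith", "guest"))      # confidential
print(classify_for_user("John Smith", "clinician"))  # non-confidential
```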
  • As also noted above, the filter unit 220 may block a term from being presented to a user 250, if the determined class 212 does not include a permission 222 for the user 250 to view the term 232. The filter unit 220 may allow the term 232 to be presented to the user 250, if the determined class 212 includes the permission 222 for the user 250 to view the term 232.
  • The classification unit 210 may classify the term 232 based on at least one of a rule 214 and machine learning 216. While one rule 214 is shown, examples may include a plurality of rules. The rule 214 may indicate an operation to be performed on a number, letter, grammar, punctuation and/or syntax of the term 232. The classification unit 210 may use the rule 214 to match the term 232 to at least one of a template and a pattern. For example, the classification unit 210 may use a rule to classify a term 232 as a social security number, if the term 232 matches a particular pattern for a social security number, as indicated by the rule 214. The filter unit 220 may block the term 232 from being presented to the user 250, if the term 232 is classified as a social security number.
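  • A minimal sketch of the rule-based matching just described: the disclosure does not give the exact rule 214, so the social-security-number pattern below (three, two, and four digits separated by hyphens) is an assumption:

```python
import re

# Assumed pattern for a US social security number; the actual rule 214
# is not specified in the patent text.
SSN_PATTERN = re.compile(r"^\d{3}-\d{2}-\d{4}$")

def classify_by_rule(term: str) -> str | None:
    """Return a class name when the term matches the pattern, else None."""
    return "social-security-number" if SSN_PATTERN.match(term) else None

# The filter unit would then block any term classified this way:
assert classify_by_rule("123-45-6789") == "social-security-number"
assert classify_by_rule("capital") is None
```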
  • In another example, the classification unit 210 may perform an arithmetic operation on the term 232. In turn, the filter unit 220 may allow the term to be presented to the user 250, if a result of the arithmetic operation satisfies the rule 214. For instance, the classification unit 210 may classify the term 232 as a credit card number based on the result of a checksum or multiplication over the digits of the term, or instead classify the term 232 as a date by comparing a range and/or syntax of the term 232 to a template. Here, the filter unit 220 may block the term 232 from being presented to the user 250, if the term 232 is classified as a credit card number or as a date that falls on a prohibited day.
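  • One plausible reading of the checksum example is the Luhn check used by payment card numbers; the sketch below assumes that interpretation:

```python
def luhn_checksum_ok(digits: str) -> bool:
    """Luhn check used by payment card numbers -- one plausible reading
    of the 'checksum ... of the digits' mentioned above (assumption)."""
    total = 0
    for i, ch in enumerate(reversed(digits)):
        d = int(ch)
        if i % 2 == 1:      # double every second digit from the right
            d *= 2
            if d > 9:
                d -= 9
        total += d
    return total % 10 == 0

def classify_number(term: str) -> str | None:
    digits = term.replace(" ", "").replace("-", "")
    if digits.isdigit() and 13 <= len(digits) <= 19 and luhn_checksum_ok(digits):
        return "credit-card-number"   # a class the filter unit would block
    return None

print(classify_number("4539 1488 0343 6467"))  # 'credit-card-number'
print(classify_number("capital"))              # None
```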
  • Machine learning 216 may relate to the construction and study of algorithms that can learn from data. Such algorithms may operate by building a model based on inputs and using the model to make predictions or decisions, rather than following only explicitly programmed instructions. Machine learning 216 techniques may include, for example, grammar induction and/or a probabilistic classifier. For instance, the probabilistic classifier may be a Bayesian classifier. Grammar induction may include, for example, inference by trial-and-error, a genetic algorithm, a greedy algorithm, a distributional learning algorithm and a pattern learning algorithm. The classification unit 210 may use machine learning to classify types of terms 232 that may not be easily identifiable via a rule 214, such as addresses or spam.
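  • As a rough illustration of the probabilistic-classifier option, here is a toy naive Bayes classifier over term tokens; the training data, features, and smoothing constant are invented for the example and are not from the disclosure:

```python
import math
from collections import Counter

# Toy naive Bayes classifier -- a minimal stand-in for the probabilistic
# classifier named above. Training data and constants are invented.
TRAIN = [
    ("10 Downing Street London", "address"),
    ("221B Baker Street", "address"),
    ("capital city", "other"),
    ("capitulate", "other"),
]

class_counts = Counter(label for _, label in TRAIN)
token_counts = {label: Counter() for label in class_counts}
for text, label in TRAIN:
    token_counts[label].update(text.lower().split())

def classify(term: str) -> str:
    total = sum(class_counts.values())
    best, best_lp = "", -math.inf
    for label, count in class_counts.items():
        lp = math.log(count / total)            # log prior
        n_tokens = sum(token_counts[label].values())
        for tok in term.lower().split():
            # Laplace smoothing so unseen tokens don't zero the score.
            lp += math.log((token_counts[label][tok] + 1) / (n_tokens + 1000))
        if lp > best_lp:
            best, best_lp = label, lp
    return best

print(classify("12 Baker Street"))  # 'address' -- a class a rule might miss
```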
  • As noted above, the classification unit 210 may determine a plurality of the different types of classes 212, based on the plurality of terms 232-1 to 232-n included in the database 230. The types of classes 212 may relate to different security clearances. Further, at least one of the classes 212 may be a subset of another of the classes 212. Thus, the filter unit 220 may compare an identity of the user 250 to the class 212 of the term 232 to determine whether the user's security clearance only allows them to see a subset of the terms 232. If the user 250 does not have the security clearance, the filter unit 220 may not provide the term 232 to the user 250, even though it was suggested by the database in response to the user's 250 query.
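  • A minimal sketch of such clearance-based subset filtering, assuming a simple ordered clearance model that the disclosure does not spell out:

```python
# Hypothetical ordered clearances: each class of terms is visible only
# at or above a given level, so lower levels see a subset of the terms.
CLEARANCE_LEVEL = {"public": 0, "internal": 1, "secret": 2}
TERM_CLASS_LEVEL = {"capital": 0, "project-codename": 1, "123-45-6789": 2}

def may_view(clearance: str, term: str) -> bool:
    """Allow the term only if the user's clearance dominates its class
    (unknown terms default to the most restrictive level)."""
    return CLEARANCE_LEVEL[clearance] >= TERM_CLASS_LEVEL.get(term, 2)

print([t for t in TERM_CLASS_LEVEL if may_view("internal", t)])
# -> ['capital', 'project-codename']
```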
  • The classification unit 210 may determine a plurality of the classes 212 of the terms 232 simultaneously. Similarly, the filter unit 220 may block and/or allow a plurality of the terms 232 simultaneously. Thus, examples may remove or prevent terms 232 from being suggested to the user 250 that are classified as not to be presented to the user 250. Further, PII is just one example of a type of classification that could be filtered upon by the filter unit 220. Examples may determine a class 212 of a term 232 based on any type of criteria deemed appropriate for denying access to the term 232.
  • FIG. 3 is an example block diagram of a computing device 300 including instructions for blocking a term based on a class of the term. In the embodiment of FIG. 3, the computing device 300 includes a processor 310 and a machine-readable storage medium 320. The machine-readable storage medium 320 further includes instructions 322 and 324 for blocking the term based on the class of the term.
  • The computing device 300 may be included in or part of, for example, a microprocessor, a controller, a memory module or device, a notebook computer, a desktop computer, an all-in-one system, a server, a network device, a wireless device, or any other type of device capable of executing the instructions 322 and 324. In certain examples, the computing device 300 may include or be connected to additional components such as memories, controllers, etc.
  • The processor 310 may be at least one central processing unit (CPU), at least one semiconductor-based microprocessor, at least one graphics processing unit (GPU), a microcontroller, special purpose logic hardware controlled by microcode or other hardware devices suitable for retrieval and execution of instructions stored in the machine-readable storage medium 320, or combinations thereof. The processor 310 may fetch, decode, and execute instructions 322 and 324 to implement blocking the term based on the class of the term. As an alternative or in addition to retrieving and executing instructions, the processor 310 may include at least one integrated circuit (IC), other control logic, other electronic circuits, or combinations thereof that include a number of electronic components for performing the functionality of instructions 322 and 324.
  • The machine-readable storage medium 320 may be any electronic, magnetic, optical, or other physical storage device that contains or stores executable instructions. Thus, the machine-readable storage medium 320 may be, for example, Random Access Memory (RAM), an Electrically Erasable Programmable Read-Only Memory (EEPROM), a storage drive, a Compact Disc Read Only Memory (CD-ROM), and the like. As such, the machine-readable storage medium 320 can be non-transitory. As described in detail below, machine-readable storage medium 320 may be encoded with a series of executable instructions for blocking the term based on the class of the term.
  • Moreover, the instructions 322 and 324, when executed by a processor (e.g., via one processing element or multiple processing elements of the processor), can cause the processor to perform processes, such as the process of FIG. 4. For example, the analyze instructions 322 may be executed by the processor 310 to analyze a term from a database (not shown) to determine a class, where the term is to relate to part of a query and to suggest a remainder of the query. The determine instructions 324 may be executed by the processor 310 to determine if the term is to be blocked in response to the query, based on the class of the analyzed term. The class may be determined based on at least one of a rule and machine learning. For example, the term may be blocked from being presented, if a user does not have permission to the analyzed class. The term may be allowed to be presented, if the user has permission to the analyzed class.
  • FIG. 4 is an example flowchart 400 of a method for blocking a term based on a class of the term. Although execution of the method 400 is described below with reference to the device 200, other suitable components for execution of the method 400 can be utilized, such as the device 100. Additionally, the components for executing the method 400 may be spread among multiple devices (e.g., a processing device in communication with input and output devices). In certain scenarios, multiple devices acting in coordination can be considered a single device to perform the method 400. The method 400 may be implemented in the form of executable instructions stored on a machine-readable storage medium, such as storage medium 320, and/or in the form of electronic circuitry.
  • At block 410, the device 200 receives a term 232 from a database 230 related to part of a query of a user 250. The term 232 may suggest a remainder of the query. At block 420, the device 200 may classify the term based on at least one of a rule 214 and machine learning 216. The machine learning 216 may include at least one of grammar induction and a probabilistic classifier to classify the term 232. The rule 214 may match the term to at least one of a template and a pattern to classify the term 232.
  • At block 430, the device 200 blocks the term 232 from being suggested, if the class 212 of the term 232 does not provide permission 222 to a user 250 to view the term 232. At block 440, the device 200 allows the term to be suggested, if the class 212 of the term 232 does provide permission 222 to the user 250 to view the term 232.
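  • Putting the four blocks of method 400 together, a hedged end-to-end sketch; the single classification rule and the permission sets are assumptions for the example, not the method itself:

```python
import re

SSN = re.compile(r"^\d{3}-\d{2}-\d{4}$")

def classify(term: str) -> str:
    """Block 420: classify the term (one assumed rule, for brevity)."""
    return "pii" if SSN.match(term) else "public"

def method_400(term: str, permitted_classes: set[str]) -> str | None:
    """Receive the term (block 410), classify it (block 420), then
    block (430) or allow (440) the suggestion per the user's permissions."""
    cls = classify(term)
    if cls not in permitted_classes:
        return None          # block 430: term is not suggested
    return term              # block 440: term is suggested

print(method_400("123-45-6789", {"public"}))  # None (blocked)
print(method_400("capital", {"public"}))      # 'capital' (suggested)
```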

Claims (15)

We claim:
1. A device, comprising:
a classification unit to determine a class of a term from a database; and
a filter unit to block the term from being presented to a user, if the determined class does not include a permission for the user to view the term, wherein
the term is to suggest a remainder of an incomplete query input by the user.
2. The device of claim 1, wherein the classification unit is to classify the term based on at least one of a rule and machine learning.
3. The device of claim 2, wherein,
the classification unit is to classify the term based on machine learning, and
the machine learning includes at least one of grammar induction and a probabilistic classifier.
4. The device of claim 3, wherein,
the probabilistic classifier includes a Bayesian classifier; and
the grammar induction includes at least one of inference by trial-and-error, a genetic algorithm, a greedy algorithm, a distributional learning algorithm and a pattern learning algorithm.
5. The device of claim 2, wherein
the classification unit is to classify the term based on the rule, and
the rule indicates an operation to be performed on at least one of a number, a letter, and syntax of the term, and
the classification unit is to use the rule to match the term to at least one of a template and a pattern.
6. The device of claim 5, wherein,
the classification unit is to perform an arithmetic operation on the term, and
the filter unit is to allow the term to be presented to the user, if a result of the arithmetic operation satisfies the rule.
7. The device of claim 1, wherein,
the filter unit is to allow the term to be presented to the user, if the determined class includes the permission for the user to view the term, and
different types of the users correspond to different types of classes.
8. The device of claim 7, wherein,
the classification unit is to determine a plurality of the different types of classes, based on the plurality of terms included in the database, and
the terms of the database are mined from data indexed into a search engine.
9. The device of claim 8, wherein,
the types of classes relate to different security clearances, and
at least one of the classes is a subset of another of the classes.
10. The device of claim 1, wherein,
the term includes at least one of a name, an address, and a social security number, and
the determined class indicates at least one of sensitive and personally identifiable information, if the determined class does not include permission for the user to view the term.
11. The device of claim 1, wherein
the classification unit is to determine a plurality of the classes of the terms simultaneously, and
the filter unit is to at least one of block and allow a plurality of the terms simultaneously.
12. A method, comprising:
receiving a term from a database related to part of a query of a user, the term to suggest a remainder of the query;
classifying the term based on at least one of a rule and machine learning;
blocking the term from being suggested, if the class of the term does not provide permission to the user to view the term; and
allowing the term to be suggested, if the class of the term does provide permission to the user to view the term.
13. The method of claim 12, wherein,
the machine learning includes at least one of grammar induction and a probabilistic classifier to classify the term, and
the rule is to match the term to at least one of a template and a pattern to classify the term.
14. A non-transitory computer-readable storage medium storing instructions that, if executed by a processor of a device, cause the processor to:
analyze a term from a database to determine a class, the term is to relate to part of a query and to suggest a remainder of the query; and
determine if the term is to be blocked in response to the query, based on the class of the analyzed term, wherein,
the class is determined based on at least one of a rule and machine learning.
15. The non-transitory computer-readable storage medium of claim 14, wherein,
the term is blocked from being presented, if a user does not have permission to the analyzed class, and
the term is allowed to be presented, if the user has permission to the analyzed class.
US15/524,122, filed 2014-11-27 (priority date 2014-11-27): Block classified term. Status: Active, adjusted expiration 2036-10-23. Granted as US10902026B2 (en).

Applications Claiming Priority (1)

PCT/EP2014/075782, filed 2014-11-27 (priority date 2014-11-27): Block classified term (published as WO2016082877A1).

Publications (2)

Publication Number Publication Date
US20170323004A1: 2017-11-09
US10902026B2: 2021-01-26

Family

Family ID: 52000838

Family Applications (1)

US15/524,122: Block classified term; priority date 2014-11-27; filing date 2014-11-27; Active, adjusted expiration 2036-10-23; granted as US10902026B2.

Country Status (5)

Country Link
US (1) US10902026B2 (en)
EP (1) EP3224738A1 (en)
JP (1) JP2017534128A (en)
CN (1) CN107077471A (en)
WO (1) WO2016082877A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10936945B2 (en) * 2016-06-06 2021-03-02 Microsoft Technology Licensing, Llc Query classification for appropriateness

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090106178A1 (en) * 2007-10-23 2009-04-23 Sas Institute Inc. Computer-Implemented Systems And Methods For Updating Predictive Models
US7809714B1 (en) * 2007-04-30 2010-10-05 Lawrence Richard Smith Process for enhancing queries for information retrieval
US20140136543A1 (en) * 2012-11-13 2014-05-15 Oracle International Corporation Autocomplete searching with security filtering and ranking
US20160110657A1 (en) * 2014-10-14 2016-04-21 Skytree, Inc. Configurable Machine Learning Method Selection and Parameter Optimization System and Method
US20160253403A1 (en) * 2015-02-27 2016-09-01 Microsoft Technology Licensing, Llc Object query model for analytics data access
US20170177993A1 (en) * 2015-12-18 2017-06-22 Sandia Corporation Adaptive neural network management system
US20180018585A1 (en) * 2016-07-15 2018-01-18 Microsoft Technology Licensing, Llc Data evaluation as a service

Family Cites Families (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6820075B2 (en) * 2001-08-13 2004-11-16 Xerox Corporation Document-centric system with auto-completion
US7802194B2 (en) 2007-02-02 2010-09-21 Sap Ag Business query language
US20090024590A1 (en) 2007-03-15 2009-01-22 Sturge Timothy User contributed knowledge database
CN100565523C (en) * 2007-04-05 2009-12-02 中国科学院自动化研究所 A kind of filtering sensitive web page method and system based on multiple Classifiers Combination
JP5168620B2 (en) 2007-11-07 2013-03-21 独立行政法人情報通信研究機構 Data type detection apparatus and data type detection method
US9411907B2 (en) 2010-04-26 2016-08-09 Salesforce.Com, Inc. Method and system for performing searches in a multi-tenant database environment
EP2469404B1 (en) * 2010-12-22 2018-04-18 Lg Electronics Inc. Mobile terminal and method of displaying information in accordance with a plurality of modes of use
US8412728B1 (en) * 2011-09-26 2013-04-02 Google Inc. User interface (UI) for presentation of match quality in auto-complete suggestions
DE102012212426B3 (en) 2012-07-16 2013-08-29 Schaeffler Technologies AG & Co. KG Rolling element, in particular rolling bearing ring
US10067913B2 (en) * 2013-05-08 2018-09-04 Microsoft Technology Licensing, Llc Cross-lingual automatic query annotation
CN103441986B (en) * 2013-07-29 2017-05-17 中国航天科工集团第二研究院七〇六所 Data resource security control method in thin client mode
CN103646109B (en) 2013-12-25 2017-01-25 武汉大学 Spatial data matching method based on machine learning


Also Published As

Publication number Publication date
JP2017534128A (en) 2017-11-16
EP3224738A1 (en) 2017-10-04
WO2016082877A1 (en) 2016-06-02
US10902026B2 (en) 2021-01-26
CN107077471A (en) 2017-08-18


Legal Events

AS (Assignment): Owner: LONGSAND LIMITED, United Kingdom. Assignment of assignors' interest; assignors: LAU, DANIEL; MACKAY, LEWIS; TIMMS, DANIEL. Reel/frame: 042734/0149. Effective date: 2014-11-26.

STPP (status): Docketed new case, ready for examination.

STPP (status): Non-final action mailed.

STPP (status): Response to non-final office action entered and forwarded to examiner.

STPP (status): Final rejection mailed.

STPP (status): Publications: issue fee payment verified.

STCF (status): Patented case.