EP3224738A1 - Block classified term - Google Patents

Block classified term

Info

Publication number
EP3224738A1
Authority
EP
European Patent Office
Prior art keywords
term
user
class
rule
permission
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
EP14805567.6A
Other languages
German (de)
English (en)
French (fr)
Inventor
Daniel Lau
Lewis MACKAY
Daniel Timms
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Longsand Ltd
Original Assignee
Longsand Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Longsand Ltd
Publication of EP3224738A1
Legal status: Withdrawn

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/20 Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
    • G06F16/28 Databases characterised by their database models, e.g. relational or object models
    • G06F16/284 Relational databases
    • G06F16/285 Clustering or classification
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/90 Details of database functions independent of the retrieved data types
    • G06F16/903 Querying
    • G06F16/9032 Query formulation
    • G06F16/90324 Query formulation using system suggestions
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/90 Details of database functions independent of the retrieved data types
    • G06F16/95 Retrieval from the web
    • G06F16/951 Indexing; Web crawling techniques
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N20/00 Machine learning
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N5/00 Computing arrangements using knowledge-based models
    • G06N5/04 Inference or reasoning models
    • G06N5/046 Forward inferencing; Production systems
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N7/00 Computing arrangements based on specific mathematical models
    • G06N7/01 Probabilistic graphical models, e.g. probabilistic networks

Definitions

  • Devices or systems may provide a feature called autocomplete, or word completion.
  • Autocomplete may involve the device or system predicting a word or phrase that the user wants to type in without the user actually typing it in completely.
  • Manufacturers, vendors, and/or service providers are challenged to provide improved autocomplete technologies to better assist the user.
  • FIG. 1 is an example block diagram of a device to block a term from being presented to a user.
  • FIG. 2 is another example block diagram of a device to block a term from being presented to a user.
  • FIG. 3 is an example block diagram of a computing device including instructions for blocking a term based on a class of the term.
  • FIG. 4 is an example flowchart of a method for blocking a term based on a class of the term.
  • Auto-completion dialogues may provide a user with suggestions from fragments of input text. For example "capit" may be auto-completed to "capital" or "capitulate." Auto-completion may be implemented through, for example, web browsers, e-mail programs, search engine interfaces, source code editors, database query tools, word processors, and command line interpreters.
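  • For illustration only, a minimal sketch (not taken from the patent) of the completion step itself is shown below; the indexed terms in the example are made up.

```python
# Minimal prefix auto-completion sketch; the indexed terms are hypothetical examples.
def suggest(fragment, indexed_terms, limit=5):
    """Return up to `limit` indexed terms that complete the given fragment."""
    fragment = fragment.lower()
    return [t for t in indexed_terms if t.lower().startswith(fragment)][:limit]

index = ["capital", "capitulate", "capitol", "cardiology"]
print(suggest("capit", index))  # ['capital', 'capitulate', 'capitol']
```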
  • Some implementations may use either a dictionary or search engine.
  • The search engine may only provide suggestions that return relevant items indexed into the search engine, as opposed to a dictionary, where some entries may not be present in the index.
  • The indexed data may include sensitive information.
  • A search index of medical records could contain patient names or their social security numbers. Auto-completing sensitive information may be undesirable, whilst completing non-sensitive information is beneficial to the search operator.
  • Filtering data using only weighting or some popularity/threshold parameter may not provide fine enough control to prevent leaking of sensitive information. Further, providing explicit blacklists for suggestions may filter out exact term matches. However, manually providing and/or updating such a level of fine control may be cost-prohibitive, to the point where it is unlikely to be usefully applied.
  • Examples may use classification technology to filter auto-complete suggestions so that users are presented only with information they are permitted to see.
  • An example device may determine a class of a term from a database. The device may block the term from being presented to a user, if the determined class does not include a permission for the user to view the term. The term may suggest a remainder of an incomplete query input by the user.
  • Examples may allow for finer control over what elements are filtered compared to simple weight/threshold parameters. Further, examples may allow for faster deployment and less maintenance compared to a manually maintained blacklist or whitelist of exact terms/phrases/entries.
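  • A minimal sketch of this idea, assuming a toy classifier and a permission table mapping each class to the user roles allowed to see it (the class names, roles, and table are illustrative assumptions, not part of the patent), is shown below.

```python
# Sketch of classification-based filtering of auto-complete suggestions.
# The classes, roles, and permission table are illustrative assumptions.
import re

def classify(term):
    """Toy stand-in for the classification unit: label a term as 'pii' or 'public'."""
    if re.fullmatch(r"\d{3}-\d{2}-\d{4}", term):  # looks like a social security number
        return "pii"
    return "public"

PERMISSIONS = {"public": {"guest", "clinician"}, "pii": {"clinician"}}

def filter_suggestions(suggestions, user_role):
    """Keep only suggestions whose class grants the user permission to view them."""
    return [t for t in suggestions if user_role in PERMISSIONS[classify(t)]]

print(filter_suggestions(["capital", "123-45-6789"], "guest"))      # ['capital']
print(filter_suggestions(["capital", "123-45-6789"], "clinician"))  # ['capital', '123-45-6789']
```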
  • FIG. 1 is an example block diagram of a device 100 to block a term from being presented to a user.
  • The device 100 may be a microprocessor, a controller, a memory module or device, a notebook computer, a desktop computer, an all-in-one system, a server, a network device, a wireless device, or any other type of device capable of interacting with a database and/or intercepting a message along a network.
  • The device 100 is shown to include a classification unit 110 and a filter unit 120.
  • The classification and filter units 110 and 120 may include, for example, a hardware device including electronic circuitry for implementing the functionality described below, such as control logic and/or memory. In addition or as an alternative, the classification and filter units 110 and 120 may be implemented as a series of instructions encoded on a machine-readable storage medium and executable by a processor.
  • The classification unit 110 may determine a class 112 of a term from a database.
  • The term may be a word or phrase used to describe a thing or to express a concept, such as a name, an address, a social security number, and the like. The term may suggest a remainder of an incomplete query input by the user.
  • The class 112 may relate to a system for identifying various types of terms, such as confidential and non-confidential terms.
  • The filter unit 120 may block a term from being presented to a user, if the determined class 112 does not include a permission 122 for the user to view the term.
  • The determined class 112 may indicate at least one of sensitive and personally identifiable information, if the determined class 112 does not include permission 122 for the user to view the term.
  • The filter unit 120 may allow the term to be presented to the user, if the determined class 112 includes the permission 122 for the user to view the term.
  • The user may be any person who is entering a query, such as by using a computer or network service, for which the database may autocomplete with the term.
  • The user may have a user account and/or be identified by a user name and/or password.
  • The permission 122 may relate to whether the user has a right to view, access or modify the term.
  • The permission 122 here may relate to whether the user may view the term triggered by the database in response to the user's query.
  • The filter unit 120 may block the term by preventing the term from being sent to the user and/or denying access to the term.
  • The determined class 112 may be stored and/or associated with the term at the database, the classification unit 110 and/or the filter unit 120, such as via metadata. The classification and filter units 110 and 120 are explained in greater detail below with respect to FIG. 2.
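  • One way such metadata could be laid out, purely as an illustrative assumption, is sketched below; the field names are not from the patent.

```python
# Hypothetical metadata layout associating a determined class with an indexed term.
from dataclasses import dataclass

@dataclass
class IndexedTerm:
    text: str        # the suggestion text itself
    term_class: str  # class determined by the classification unit, e.g. "pii" or "public"

entry = IndexedTerm(text="123-45-6789", term_class="pii")
print(entry)
```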
  • FIG. 2 is another example block diagram of a device 200 to block a term from being presented to a user.
  • The device 200 may be a microprocessor, a controller, a memory module or device, a notebook computer, a desktop computer, an all-in-one system, a server, a network device, a wireless device, or any other type of device capable of interacting with a database and/or intercepting a message along a network.
  • The device 200 is shown to interface with a database 230.
  • The database 230 may be any electronic, magnetic, optical, or other physical storage device that contains or stores information, such as Random Access Memory (RAM), an Electrically Erasable Programmable Read-Only Memory (EEPROM), a storage drive, a Compact Disc Read Only Memory (CD-ROM), and the like.
  • The database 230 may include the most popular search terms 232-1 to 232-n, where n is a natural number, indexed from a search engine. Further, at least some of the search terms 232-1 to 232-n may include personally identifiable information (PII), such as medical records, names, social security numbers and the like.
  • The device 200 of FIG. 2 may include at least the functionality and/or hardware of the device 100 of FIG. 1.
  • A classification unit 210 of the device 200 of FIG. 2 may include at least the functionality and/or hardware of the classification unit 110 of the device 100 of FIG. 1, and a filter unit 220 of the device 200 of FIG. 2 may include at least the functionality and/or hardware of the filter unit 120 of the device 100 of FIG. 1.
  • The classification unit 210 may determine a class 212 of a term 232 from the database 230.
  • The class 212 of the term 232 may vary with respect to the user 250.
  • The term 232 may be classified as confidential with respect to a first user but classified as non-confidential with respect to a second user.
  • The classification unit 210 may take into account a type or identity of the user 250 when determining the class 212 of the term 232. Different types of the users 250 may correspond to different types of classes 212.
  • The user's 250 account may be used to identify the type of user, such as when the user 250 logs into a system.
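  • A toy sketch of classification that depends on the user's type follows; the user types and the confidentiality decision are assumptions made up for illustration.

```python
# Sketch: the same term may be confidential for one user type and not another.
# The user types and the rule below are illustrative assumptions.
def classify_for_user(term: str, user_type: str) -> str:
    if user_type == "clinician":
        return "non_confidential"  # e.g. clinicians may see patient names
    return "confidential"          # other user types may not

print(classify_for_user("Jane Doe", "clinician"))     # non_confidential
print(classify_for_user("Jane Doe", "receptionist"))  # confidential
```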
  • The filter unit 220 may block a term from being presented to a user 250, if the determined class 212 does not include a permission 222 for the user 250 to view the term 232.
  • The filter unit 220 may allow the term 232 to be presented to the user 250, if the determined class 212 includes the permission 222 for the user 250 to view the term 232.
  • The classification unit 210 may classify the term 232 based on at least one of a rule 214 and machine learning 216. While one rule 214 is shown, examples may include a plurality of rules.
  • The rule 214 may indicate an operation to be performed on a number, letter, grammar, punctuation and/or syntax of the term 232.
  • The classification unit 210 may use the rule 214 to match the term 232 to at least one of a template and a pattern. For example, the classification unit 210 may use a rule to classify a term 232 as a social security number, if the term 232 matches a particular pattern for a social security number, as indicated by the rule 214.
  • The filter unit 220 may block the term 232 from being presented to the user 250, if the term 232 is classified as a social security number.
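  • Such a pattern rule could be written as a regular expression; the sketch below assumes the common nnn-nn-nnnn layout of a US social security number.

```python
import re

# Assumed pattern rule: a term shaped like nnn-nn-nnnn is classified as a social security number.
SSN_RULE = re.compile(r"^\d{3}-\d{2}-\d{4}$")

def classify_by_rule(term: str) -> str:
    return "social_security_number" if SSN_RULE.match(term) else "unclassified"

print(classify_by_rule("123-45-6789"))  # social_security_number -> would be blocked
print(classify_by_rule("capital"))      # unclassified -> may be suggested
```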
  • The classification unit 210 may perform an arithmetic operation on the term 232.
  • The filter unit 220 may allow the term to be presented to the user 250, if a result of the arithmetic operation satisfies the rule 214.
  • The classification unit 210 may classify the term 232 as a credit card number upon a result of a checksum or multiplication of the digits of the credit card number, or instead classify the term 232 as a date upon comparing a range and/or syntax of the term 232 to a template.
  • The filter unit 220 may block the term 232 from being presented to the user 250, if the term 232 is classified as a credit card number or a date that falls on a prohibited day.
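  • One common checksum for credit-card-shaped numbers is the Luhn check; the sketch below assumes that is the arithmetic test meant, and uses an ISO date parse as a simple date template. Both choices are assumptions for illustration.

```python
from datetime import date

def luhn_valid(number: str) -> bool:
    """Luhn checksum: double every second digit from the right, sum, and check divisibility by 10."""
    digits = [int(d) for d in number if d.isdigit()]
    total = 0
    for i, d in enumerate(reversed(digits)):
        if i % 2 == 1:
            d *= 2
            if d > 9:
                d -= 9
        total += d
    return len(digits) >= 12 and total % 10 == 0

def classify_numeric_term(term: str) -> str:
    if luhn_valid(term):
        return "credit_card_number"
    try:
        date.fromisoformat(term)  # template check: does the term parse as an ISO date?
        return "date"
    except ValueError:
        return "unclassified"

print(classify_numeric_term("4111 1111 1111 1111"))  # credit_card_number
print(classify_numeric_term("2014-11-27"))           # date
```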
  • Machine learning 216 may relate to the construction and study of algorithms that can learn from data. Such algorithms may operate by building a model based on inputs and using that model to make predictions or decisions, rather than following only explicitly programmed instructions.
  • Machine learning 216 techniques may include, for example, grammar induction and/or a probabilistic classifier.
  • the probabilistic classifier may be a Bayesian classifier.
  • Grammar induction may include, for example, inference by trial-and-error, a genetic algorithm, a greedy algorithm, a distributional learning algorithm and a pattern learning algorithm.
  • The classification unit 210 may use machine learning to classify types of terms 232 that may not be easily identifiable via a rule 214, such as addresses or spam.
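  • For such cases, a probabilistic classifier (e.g. a naive Bayes model) could be trained on labelled examples; the sketch below uses scikit-learn with character n-grams and entirely made-up training data, as one possible realisation rather than the patent's own.

```python
# Probabilistic (naive Bayes) term classifier sketch; training data is made up for illustration.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

train_terms = ["10 Downing Street", "221B Baker Street", "capital", "capitulate"]
train_labels = ["address", "address", "public", "public"]

model = make_pipeline(
    CountVectorizer(analyzer="char_wb", ngram_range=(2, 4)),  # character n-grams suit short terms
    MultinomialNB(),
)
model.fit(train_terms, train_labels)

print(model.predict(["742 Evergreen Terrace"]))  # e.g. ['address'] -> such a term could then be blocked
```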
  • The classification unit 210 may determine a plurality of the different types of classes 212, based on the plurality of terms 232-1 to 232-n included in the database 230.
  • The types of classes 212 may relate to different security clearances. Further, at least one of the classes 212 may be a subset of another of the classes 212.
  • The filter unit 220 may compare an identity of the user 250 to the class 212 of the term 232 to determine whether the user's security clearance only allows them to see a subset of the terms 232. If the user 250 does not have the security clearance, the filter unit 220 may not provide the term 232, which was suggested by the database in response to the user's 250 query, to the user 250.
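  • Where classes correspond to nested security clearances, the permission check could reduce to an ordered comparison; the clearance levels below are illustrative assumptions.

```python
# Sketch of clearance-based filtering; the levels and example checks are assumptions.
CLEARANCE_ORDER = {"public": 0, "internal": 1, "restricted": 2}  # each level is a subset of the next

def permitted(term_class: str, user_clearance: str) -> bool:
    """A user may view a term only if their clearance is at least the term's class level."""
    return CLEARANCE_ORDER[user_clearance] >= CLEARANCE_ORDER[term_class]

print(permitted("restricted", "internal"))  # False -> the filter unit withholds the suggestion
print(permitted("internal", "restricted"))  # True  -> the suggestion may be shown
```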
  • The classification unit 210 may determine a plurality of the classes 212 of the terms 232 simultaneously.
  • The filter unit 220 may block and/or allow a plurality of the terms 232 simultaneously.
  • Examples may remove or prevent terms 232 from being suggested to the user 250 that are classified as not to be presented to the user 250.
  • PII is just one example of a type of classification that could be filtered upon by the filter unit 220. Examples may determine a class 212 of a term 232, based on any type of criteria deemed appropriate for denying access to the term 232.
  • FIG. 3 is an example block diagram of a computing device 300 including instructions for blocking a term based on a class of the term.
  • The computing device 300 includes a processor 310 and a machine-readable storage medium 320.
  • The machine-readable storage medium 320 further includes instructions 322 and 324 for blocking the term based on the class of the term.
  • The computing device 300 may be included in or part of, for example, a microprocessor, a controller, a memory module or device, a notebook computer, a desktop computer, an all-in-one system, a server, a network device, a wireless device, or any other type of device capable of executing the instructions 322 and 324.
  • The computing device 300 may include or be connected to additional components such as memories, controllers, etc.
  • The processor 310 may be at least one central processing unit (CPU), at least one semiconductor-based microprocessor, at least one graphics processing unit (GPU), a microcontroller, special purpose logic hardware controlled by microcode or other hardware devices suitable for retrieval and execution of instructions stored in the machine-readable storage medium 320, or combinations thereof.
  • The processor 310 may fetch, decode, and execute instructions 322 and 324 to implement blocking the term based on the class of the term.
  • The processor 310 may include at least one integrated circuit (IC), other control logic, other electronic circuits, or combinations thereof that include a number of electronic components for performing the functionality of instructions 322 and 324.
  • The machine-readable storage medium 320 may be any electronic, magnetic, optical, or other physical storage device that contains or stores executable instructions.
  • The machine-readable storage medium 320 may be, for example, Random Access Memory (RAM), an Electrically Erasable Programmable Read-Only Memory (EEPROM), a storage drive, a Compact Disc Read Only Memory (CD-ROM), and the like.
  • The machine-readable storage medium 320 can be non-transitory.
  • The machine-readable storage medium 320 may be encoded with a series of executable instructions for blocking the term based on the class of the term.
  • The instructions 322 and 324, when executed by a processor (e.g., via one processing element or multiple processing elements of the processor), can cause the processor to perform processes, such as the process of FIG. 4.
  • The analyze instructions 322 may be executed by the processor 310 to analyze a term from a database (not shown) to determine a class, where the term relates to part of a query and suggests a remainder of the query.
  • The determine instructions 324 may be executed by the processor 310 to determine if the term is to be blocked in response to the query, based on the class of the analyzed term.
  • The class may be determined based on at least one of a rule and machine learning. For example, the term may be blocked from being presented, if the user does not have permission to view terms of the analyzed class. The term may be allowed to be presented, if the user has permission to view terms of the analyzed class.
  • FIG. 4 is an example flowchart 400 of a method for blocking a term based on a class of the term.
  • Although execution of the method 400 is described below with reference to the device 200, other suitable components for execution of the method 400 can be utilized, such as the device 100. Additionally, the components for executing the method 400 may be spread among multiple devices (e.g., a processing device in communication with input and output devices). In certain scenarios, multiple devices acting in coordination can be considered a single device to perform the method 400.
  • The method 400 may be implemented in the form of executable instructions stored on a machine-readable storage medium, such as storage medium 320, and/or in the form of electronic circuitry.
  • The device 200 receives a term 232 from a database 230 related to part of a query of a user 250.
  • The term 232 may suggest a remainder of the query.
  • The device 200 may classify the term based on at least one of a rule 214 and machine learning 216.
  • The machine learning 216 may include at least one of grammar induction and a probabilistic classifier to classify the term 232.
  • The rule 214 may match the term to at least one of a template and a pattern to classify the term 232.
  • The device 200 blocks the term 232 from being suggested, if the class 212 of the term 232 does not provide permission 222 to a user 250 to view the term 232.
  • The device 200 allows the term to be suggested, if the class 212 of the term 232 does provide permission 222 to the user 250 to view the term 232.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Databases & Information Systems (AREA)
  • General Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • General Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Mathematical Physics (AREA)
  • Evolutionary Computation (AREA)
  • Artificial Intelligence (AREA)
  • Computing Systems (AREA)
  • Computational Linguistics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Medical Informatics (AREA)
  • Computational Mathematics (AREA)
  • Pure & Applied Mathematics (AREA)
  • Mathematical Optimization (AREA)
  • Mathematical Analysis (AREA)
  • Algebra (AREA)
  • Probability & Statistics with Applications (AREA)
  • Information Retrieval, Db Structures And Fs Structures Therefor (AREA)
EP14805567.6A 2014-11-27 2014-11-27 Block classified term Withdrawn EP3224738A1 (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/EP2014/075782 WO2016082877A1 (en) 2014-11-27 2014-11-27 Block classified term

Publications (1)

Publication Number Publication Date
EP3224738A1 (en) 2017-10-04

Family

ID=52000838

Family Applications (1)

Application Number Title Priority Date Filing Date
EP14805567.6A Withdrawn EP3224738A1 (en) 2014-11-27 2014-11-27 Block classified term

Country Status (5)

Country Link
US (1) US10902026B2 (en)
EP (1) EP3224738A1 (en)
JP (1) JP2017534128A (ja)
CN (1) CN107077471A (zh)
WO (1) WO2016082877A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10936945B2 (en) * 2016-06-06 2021-03-02 Microsoft Technology Licensing, Llc Query classification for appropriateness

Family Cites Families (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6820075B2 (en) * 2001-08-13 2004-11-16 Xerox Corporation Document-centric system with auto-completion
US7802194B2 (en) 2007-02-02 2010-09-21 Sap Ag Business query language
US20090024590A1 (en) 2007-03-15 2009-01-22 Sturge Timothy User contributed knowledge database
CN100565523C (zh) * 2007-04-05 2009-12-02 中国科学院自动化研究所 Sensitive web page filtering method and system based on multi-classifier fusion
US7809714B1 (en) * 2007-04-30 2010-10-05 Lawrence Richard Smith Process for enhancing queries for information retrieval
US8214308B2 (en) * 2007-10-23 2012-07-03 Sas Institute Inc. Computer-implemented systems and methods for updating predictive models
JP5168620B2 (ja) 2007-11-07 2013-03-21 独立行政法人情報通信研究機構 Data type detection device and data type detection method
US9411907B2 (en) 2010-04-26 2016-08-09 Salesforce.Com, Inc. Method and system for performing searches in a multi-tenant database environment
EP2469404B1 (en) * 2010-12-22 2018-04-18 Lg Electronics Inc. Mobile terminal and method of displaying information in accordance with a plurality of modes of use
US8412728B1 (en) * 2011-09-26 2013-04-02 Google Inc. User interface (UI) for presentation of match quality in auto-complete suggestions
DE102012212426B3 (de) 2012-07-16 2013-08-29 Schaeffler Technologies AG & Co. KG Rolling bearing element, in particular rolling bearing ring
US9613165B2 (en) * 2012-11-13 2017-04-04 Oracle International Corporation Autocomplete searching with security filtering and ranking
US10067913B2 (en) * 2013-05-08 2018-09-04 Microsoft Technology Licensing, Llc Cross-lingual automatic query annotation
CN103441986B (zh) * 2013-07-29 2017-05-17 中国航天科工集团第二研究院七〇六所 Data resource security management and control method in thin client mode
CN103646109B (zh) 2013-12-25 2017-01-25 武汉大学 Spatial data matching method based on machine learning
US20160110657A1 (en) * 2014-10-14 2016-04-21 Skytree, Inc. Configurable Machine Learning Method Selection and Parameter Optimization System and Method
US10102269B2 (en) * 2015-02-27 2018-10-16 Microsoft Technology Licensing, Llc Object query model for analytics data access
US10891540B2 (en) * 2015-12-18 2021-01-12 National Technology & Engineering Solutions Of Sandia, Llc Adaptive neural network management system
US10733534B2 (en) * 2016-07-15 2020-08-04 Microsoft Technology Licensing, Llc Data evaluation as a service

Also Published As

Publication number Publication date
JP2017534128A (ja) 2017-11-16
US20170323004A1 (en) 2017-11-09
WO2016082877A1 (en) 2016-06-02
US10902026B2 (en) 2021-01-26
CN107077471A (zh) 2017-08-18

Similar Documents

Publication Publication Date Title
CN109614816B (zh) Data desensitization method, device and storage medium
JP6966099B2 (ja) Data content filter
US11381580B2 (en) Machine learning classification using Markov modeling
US8863301B2 (en) Classification of an electronic document
US9116879B2 (en) Dynamic rule reordering for message classification
US11361068B2 (en) Securing passwords by using dummy characters
KR101874373B1 (ko) Method and apparatus for detecting malicious scripts in obfuscated scripts
US20140068757A1 (en) Authentication device, authentication method, and recording medium
US10803057B1 (en) Utilizing regular expression embeddings for named entity recognition systems
CN110419041B (zh) Automatic user profile generation and authentication
CN104516882B (zh) Method and device for determining the harmfulness of an SQL statement
US20240028650A1 (en) Method, apparatus, and computer-readable medium for determining a data domain associated with data
CN111538978A (zh) System and method for executing tasks based on access rights determined from task danger levels
CN112364625A (zh) Text screening method, apparatus, device and storage medium
EP3543877A1 (en) Method and device for processing accumulative retrieval, terminal and storage medium
JP6777612B2 (ja) System and method for preventing data loss in a computer system
US10902026B2 (en) Block classified term
CN111753304B (zh) System and method for executing tasks on a computing device based on access rights
Al Zaabi et al. Android malware detection using static features and machine learning
JP6194180B2 (ja) Text masking device and text masking program
US20160078072A1 (en) Term variant discernment system and method therefor
WO2019242443A1 (zh) String-based malware identification method, system and related device
Kyaw et al. Machine learning based android malware detection using significant permission identification
EP3493093A1 (en) Data protection method for preventing of re-pasting of confidential data
KR102580865B1 (ko) Confidential document management system between multiple terminals

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

17P Request for examination filed

Effective date: 20170403

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

AX Request for extension of the european patent

Extension state: BA ME

DAX Request for extension of the european patent (deleted)
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN

18D Application deemed to be withdrawn

Effective date: 20180116