GB2441350A - Filtering access to internet content - Google Patents

Filtering access to internet content

Info

Publication number
GB2441350A
GB2441350A GB0617113A GB0617113A GB2441350A GB 2441350 A GB2441350 A GB 2441350A GB 0617113 A GB0617113 A GB 0617113A GB 0617113 A GB0617113 A GB 0617113A GB 2441350 A GB2441350 A GB 2441350A
Authority
GB
United Kingdom
Prior art keywords
requested
access
url
requested url
list
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
GB0617113A
Other versions
GB0617113D0 (en)
Inventor
Michael John Andrew Phillips
Lee Hesselden
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
PUREPAGES GROUP Ltd
Original Assignee
PUREPAGES GROUP Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by PUREPAGES GROUP Ltd filed Critical PUREPAGES GROUP Ltd
Priority to GB0617113A priority Critical patent/GB2441350A/en
Publication of GB0617113D0 publication Critical patent/GB0617113D0/en
Publication of GB2441350A publication Critical patent/GB2441350A/en
Withdrawn legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/90Details of database functions independent of the retrieved data types
    • G06F16/95Retrieval from the web
    • G06F16/955Retrieval from the web using information identifiers, e.g. uniform resource locators [URL]
    • G06F16/9566URL specific, e.g. using aliases, detecting broken or misspelled links
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/90Details of database functions independent of the retrieved data types
    • G06F16/95Retrieval from the web
    • G06F16/953Querying, e.g. by the use of web search engines
    • G06F16/9535Search customisation based on user profiles and personalisation
    • G06F17/30867

Landscapes

  • Engineering & Computer Science (AREA)
  • Databases & Information Systems (AREA)
  • Theoretical Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Information Transfer Between Computers (AREA)
  • Storage Device Security (AREA)

Abstract

Disclosed is a method of controlling access to a remote Internet site, defined by a URL, comprising the steps of: intercepting, using a local proxy server, a user's request to visit a particular site; transmitting details of the requested URL to a remote server to compare the URL with a white-list comprising details of URLs which have previously been assessed and found to be safe; if the requested URL is present in the white-list, allowing the local proxy server to access the requested URL; or, if the requested URL is not present in the white-list, directing a web-spider to visit the site and assess its content according to one or more predefined criteria, wherein if the content is assessed to be safe, the local proxy server is allowed to access the requested URL.

Description

<p>IMPROVEMENTS IN AND RELATING TO INTERNET CONTENT FILTERING</p>
<p>The present invention relates to the filtering of internet content in an attempt to remove or at least reduce the risk of a user viewing or accessing offensive or inappropriate material.</p>
<p>There is a general concern amongst many users of the internet that they or their children may inadvertently encounter material which they would deem to be offensive or upsetting. It is not generally possible to ascertain from a web address or a URL (Uniform Resource Locator) whether the content of that particular page contains material which the user would not wish to see. There are many prior art solutions directed towards removing or reducing this risk but none of them are completely satisfactory.</p>
<p>In particular, two common techniques for controlling which websites may be accessed are termed "black list" and "white list". In black list systems the user is prevented from accessing any websites or domain names which appear on a so-called black list. This type of system requires continuous updating to ensure that suspect or questionable sites are added to the black list. Given the enormous number of web pages available on the internet, the task of maintaining a black list in this fashion is not particularly practicable for most purposes. As such, the black list of sites which are deemed unsuitable is out of date almost as soon as it has been updated. This is clearly unsatisfactory, as it leads to the possibility that a user could easily stumble across an offensive website which has not yet made it onto the black list.</p>
<p>An alternative approach uses the so-called white list.</p>
<p>This is also known as a "walled garden" approach and the user in these systems is only able to access websites from a predefined list of approved websites. Such a system may be suitable for a computer installation in a school, perhaps, where it may be desirable to limit the pages which are accessible to the user to a very few carefully defined suitable pages, but this is unlikely to be of any real use to most users. If a user did wish to visit a page which is not listed on the white list, then they would need to somehow request authorisation or permission from the owner of the white list so that the desired page can be verified and added to the white list. The practicalities of managing such a system on any scale would render such an enterprise too complex to be of any use.</p>
<p>It is an aim of embodiments of the present invention to address the shortcomings with the prior art, whether mentioned herein or not.</p>
<p>Throughout this description, the terms domain name, URL and website address are used interchangeably to refer to particular internet websites associated with a particular URL.</p>
<p>According to the present invention there is provided a system and method as set forth in the appended claims.</p>
<p>Preferred features of the invention will be apparent from the dependent claims, and the description which follows.</p>
<p>For a better understanding of the invention, and to show how embodiments of the same may be carried into effect, reference will now be made, by way of example, to the accompanying diagrammatic drawings in which: Figure 1 shows a schematic view of an embodiment of the present invention; Figure 2 shows a flow chart representing a first embodiment of the present invention; Figure 3 shows a flow chart representing a second embodiment of the present invention.</p>
<p>Embodiments of the present invention operate according to a dynamic white list principle. In the context of this invention, "dynamic white list" refers to a database of predetermined safe websites and domain names which can be dynamically updated in real time if a user attempts to access a website not listed in the white list.</p>
<p>Additionally, further databases may be provided. These additional databases may include: a black list of sites which are known to include offensive material; a list of ambiguous sites requiring human investigation; and a list of newly discovered sites awaiting automatic investigation.</p>
<p>Figure 1 illustrates a typical setup of an embodiment of the invention.</p>
<p>In order to interact with the white list, a piece of software called the client program 20 is installed on the user's computer 10. This client program includes a proxy server, which operates to intercept all requests from a user to access the Internet 30, and ensures that all such requests are handled via the proxy server according to rules which are intended to ensure that only safe and unobjectionable sites are visited.</p>
<p>Once the client program 20 is installed on a particular PC, it is impossible to access the internet without such requests being intercepted by the client program, unless the user has sufficient authority to override the client program. Such authority generally belongs only to a master user who can override the security settings through use of a suitable password or other access control technique, such as a smart card or biometric identification.</p>
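<p>By way of illustration only, the interception performed by the client program's proxy might resemble the following minimal sketch in Python. This is an assumption on our part, not disclosed source code: the check_with_server() stub stands in for the white-list query described below, and the sketch handles plain HTTP GET requests only, not HTTPS tunnelling.</p>

```python
# Minimal sketch of a client-side intercepting proxy (illustrative only).
# check_with_server() is a placeholder for the central-server white-list query.
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import urlopen

def check_with_server(url: str) -> bool:
    """Placeholder: ask the central server whether `url` may be visited."""
    return url.startswith("http://www.example.com")   # hypothetical rule

class FilteringProxy(BaseHTTPRequestHandler):
    def do_GET(self):
        requested_url = self.path            # in proxy mode the full URL arrives here
        if check_with_server(requested_url):
            body = urlopen(requested_url).read()       # forward the request
            self.send_response(200)
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_response(403)                    # deny, per the description
            self.end_headers()
            self.wfile.write(b"Access denied: site not on the white list.")

if __name__ == "__main__":
    HTTPServer(("127.0.0.1", 8080), FilteringProxy).serve_forever()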
<p>In order to describe the operation of an embodiment of the invention, it is useful to examine a typical scenario.</p>
<p>Figure 2 illustrates this scenario in flowchart form.</p>
<p>If a user attempts 100 to access a website, the client program intercepts 110 the request and queries 120 the central server 40, which compares the requested URL with a database, known as the white list 50, of URLs which are known to be safe.</p>
<p>If the requested URL is included in the white list, then a message is sent from the server 40 to the client program running on the PC 10, allowing the user to access 170 the requested site. The user will experience little or no delay in such a situation.</p>
<p>Additionally, if the requested URL is not included in the white list, a check may be performed to see if it features in further databases, accessible to the server. The further database may include different categories of sites, such as black-listed sites, ambiguous sites and sites awaiting further investigation, as mentioned previously.</p>
<p>If the requested URL features in the black-list, then access is denied and the user receives an appropriate message or is re-directed to a warning page. If the requested URL features in one of the other databases, then a suitable message is displayed.</p>
<p>If the requested URL is not included in the white list 50, or any of the other lists, then this means that the requested URL is unknown to the server and the requested website is logged 130 and a web spider 60 is directed immediately to the requested URL to assess its content, as will be described shortly. The user experiences a short delay while spidering takes place 140, but this is only likely to be a few seconds.</p>
<p>If the assessment reveals 150 that the site is safe, then it is added to the white list 50 and the user is directed to the site, exactly as if the site was already listed in the white list.</p>
<p>If the assessment reveals 150 that the site is not safe for viewing, then the user is redirected 160 to an "access denied" page, informing him that the site is not viewable.</p>
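<p>Purely as an illustrative sketch, the scenario of Figure 2 can be summarised by the following server-side logic. The list contents, the step numbers in the comments and the assess_with_spider() stub are placeholders and not a disclosed implementation.</p>

```python
# Sketch of the central server's handling of a requested URL (Figure 2).
WHITE_LIST = {"http://news.example.org"}
BLACK_LIST = {"http://bad.example.net"}
AMBIGUOUS = set()          # awaiting human review
PENDING = set()            # awaiting automatic investigation

def assess_with_spider(url: str) -> str:
    """Placeholder for the web-spider's assessment: 'safe', 'unsafe' or 'ambiguous'."""
    return "safe"

def handle_request(url: str) -> str:
    if url in WHITE_LIST:
        return "allow"                     # step 170: user proceeds with little delay
    if url in BLACK_LIST:
        return "deny"                      # re-directed to a warning page
    if url in AMBIGUOUS or url in PENDING:
        return "hold"                      # a suitable message is displayed
    PENDING.add(url)                       # step 130: unknown URL is logged
    verdict = assess_with_spider(url)      # step 140: spider dispatched immediately
    PENDING.discard(url)
    if verdict == "safe":
        WHITE_LIST.add(url)                # step 150: added to the white list
        return "allow"
    if verdict == "ambiguous":
        AMBIGUOUS.add(url)
        return "hold"
    return "deny"                          # step 160: "access denied" page
```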
<p>In order to assess the content of a requested site, which is not present in the white list, the web spider 60 performs an analysis of the content of the site according to one or more pre-defined rules or criteria.</p>
<p>There are a wide variety of criteria which may be used to assess the content of any particular website and various examples are given below: Languages If the user is an English speaker and expects to find only English-language websites, then one criterion which may be used to assess a website is whether or not it is written in English. The central server may be instructed not to authorise access to any websites which are not solely or predominantly in English, simply because the analysis of a foreign-language website may be too onerous.</p>
<p>However, multi-language analysis of websites may be available in certain embodiments of the invention.</p>
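<p>A deliberately naive way to apply the language criterion is sketched below; the stop-word list and threshold are illustrative assumptions, and a practical system would use proper language identification.</p>

```python
# Naive English-language heuristic: score a page by the share of common
# English stop-words among its tokens (illustrative only).
import re

ENGLISH_STOPWORDS = {"the", "and", "of", "to", "in", "is", "that", "for", "with", "on"}

def looks_english(text: str, threshold: float = 0.05) -> bool:
    words = re.findall(r"[a-z']+", text.lower())
    if not words:
        return False
    hits = sum(1 for w in words if w in ENGLISH_STOPWORDS)
    return hits / len(words) >= threshold
```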
<p>Images / Videos Any website which consists only of images and/or video files with no accompanying text is flagged as suspicious.</p>
<p>This is because it is not straightforward to assess the nature of images on a website and, in the absence of any further text, it is assumed that such a website is questionable. In other embodiments of the invention, it may be possible to perform a qualitative assessment of the images using specialised software which is able to detect and analyse images for flesh tones, which may be indicative of pornographic content.</p>
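<p>The "images and/or video with no accompanying text" check might be approximated as follows; this is a sketch using the Python standard-library HTML parser, with an assumed character threshold, and it does not attempt any image analysis.</p>

```python
# Flag pages that are essentially all images/video with little or no visible text.
from html.parser import HTMLParser

class MediaTextCounter(HTMLParser):
    def __init__(self):
        super().__init__()
        self.media = 0          # count of img/video/embed/object elements
        self.text_chars = 0     # visible text characters
        self._skip = 0          # depth inside <script>/<style>

    def handle_starttag(self, tag, attrs):
        if tag in ("img", "video", "embed", "object"):
            self.media += 1
        if tag in ("script", "style"):
            self._skip += 1

    def handle_endtag(self, tag):
        if tag in ("script", "style") and self._skip:
            self._skip -= 1

    def handle_data(self, data):
        if not self._skip:
            self.text_chars += len(data.strip())

def media_only(html: str, min_text_chars: int = 200) -> bool:
    counter = MediaTextCounter()
    counter.feed(html)
    return counter.media > 0 and counter.text_chars < min_text_chars
```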
<p>Offensive Terms Many websites which are generally considered offensive or objectionable are of the type that display pornographic words and images. Many such websites will include certain words or terms which will alert the server to the nature of the website. A predetermined list of such terms is accessible to the server and a comparison of these terms against terms appearing on the requested websites can be performed. If the requested website contains any of the defined terms or, in some cases, more than a certain number of them, then the requested website is flagged as objectionable and access is denied.</p>
<p>The list of offensive terms may be further classified according to a degree of offensiveness so that pages including only less offensive terms may be accessible, whilst pages including more offensive terms will be barred.</p>
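<p>The graded term matching described above might, as a minimal sketch, look like the following; the terms, grades and threshold are placeholders for the server-side database and per-user settings.</p>

```python
# Match page text against a graded list of offensive terms (placeholders).
import re

GRADED_TERMS = {"term_a": 1, "term_b": 2, "term_c": 3}   # 3 = most offensive

def offensiveness(text: str) -> int:
    """Return the highest grade of any listed term found, or 0 if none."""
    lowered = text.lower()
    worst = 0
    for term, grade in GRADED_TERMS.items():
        if re.search(r"\b" + re.escape(term) + r"\b", lowered):
            worst = max(worst, grade)
    return worst

def blocked_for(text: str, user_threshold: int = 2) -> bool:
    """Pages whose worst term meets or exceeds the user's threshold are barred."""
    return offensiveness(text) >= user_threshold
```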
<p>Metadata Web pages often contain a great deal of hidden information, which is not normally displayed on the user's computer when that site is visited. In particular, metadata is concealed in the source code for web pages in an attempt to influence search engines such as Google and to provide other control information. Often, metadata uses many of the same offensive or objectionable terms which are displayed on the user's computer and the spider 60 can form an analysis of the hidden metadata to characterise the site as objectionable or not.</p>
<p>Links Another criterion which may be used to assess a requested website is whether that website contains links to other websites which are already known to be questionable, or conversely, whether there are reverse links to the requested website from websites that are known to be objectionable. In either of these cases a further determination can be made to decide whether the requested site is itself objectionable or not. In this way, a website may first be assessed in terms of the sites it links to and the sites which link to it, and, if needed, a secondary assessment can be made in case of doubt.</p>
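<p>The metadata and link criteria might be screened together as sketched below; the extractor uses the Python standard-library HTML parser, and the term list and known-bad host set are illustrative stand-ins for the server's databases.</p>

```python
# Pull hidden metadata and outbound link hosts from a page so they can be
# screened against a term list and a list of known objectionable domains.
from html.parser import HTMLParser
from urllib.parse import urlparse

class MetaAndLinkExtractor(HTMLParser):
    def __init__(self):
        super().__init__()
        self.meta_text = []     # contents of keywords/description meta tags
        self.link_hosts = set() # hosts of outbound links

    def handle_starttag(self, tag, attrs):
        attrs = {k: (v or "") for k, v in attrs}
        if tag == "meta" and attrs.get("name", "").lower() in ("keywords", "description"):
            self.meta_text.append(attrs.get("content", ""))
        elif tag == "a" and attrs.get("href"):
            host = urlparse(attrs["href"]).netloc.lower()
            if host:
                self.link_hosts.add(host)

HIDDEN_TERMS = {"term_a", "term_b"}        # stand-in for the offensive-term database
KNOWN_BAD_HOSTS = {"bad.example.net"}      # stand-in for the black list

def suspicious_metadata_or_links(html: str) -> bool:
    extractor = MetaAndLinkExtractor()
    extractor.feed(html)
    hidden = " ".join(extractor.meta_text).lower()
    return any(t in hidden for t in HIDDEN_TERMS) or bool(extractor.link_hosts & KNOWN_BAD_HOSTS)
```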
<p>Special Terms Under U.S. law, pornographic websites are required to display a legal notice under USC 2257. The presence of "USC 2257", "2257" or variants thereof generally indicates the presence of offensive material, which can be used to prevent a site being entered onto the white list. Since a great number of pornographic websites are hosted in the USA, this is a convenient method of identifying 'legitimate' pornography sites.</p>
<p>It is possible for the server to grade a website according to the degree of objectionable material contained therein.</p>
<p>This information may be fed back to the client program running on the user's computer and, depending on a security level set by the user, certain less objectionable websites may be viewable whereas those that are deemed to be the most offensive are not viewable. In this way, a user who is in control of the computer (e.g. a parent) may permit themselves to view websites of a kind which they would not wish or allow other users (e.g. their children) to visit.</p>
<p>The access level of a particular user may only be set by a user having the appropriate permissions to change various settings within the client program 20. In most cases of domestic use, the parent will be set up to be the master user, who can then assign various security levels to other users of the system to allow or disallow access to sites of different degrees of objectionability.</p>
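<p>As a minimal sketch of how a site grade could be compared with per-user security levels, consider the following; the user names, level values and grades are purely illustrative assumptions.</p>

```python
# Per-user access levels set by the master user: a site's graded
# objectionability (0 = clean) is compared with the requesting user's ceiling.
USER_LEVELS = {"parent": 3, "teenager": 1, "child": 0}   # maximum grade each may view

def may_view(user: str, site_grade: int) -> bool:
    return site_grade <= USER_LEVELS.get(user, 0)

# Example: a mildly objectionable site (grade 1)
assert may_view("parent", 1) is True
assert may_view("child", 1) is False
```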
<p>As can be seen from the above, a master user can override the security settings of the client program 20 so that effectively no web filtering is performed at all. The same master user may set various levels of access control so that other users of the system can enjoy varying degrees of access to websites depending upon their inclusion within the white list 50 or other lists 55,65,75,85 assessed or to be assessed by the spider.</p>
<p>In use, if a user not having unfettered access to all sites attempts to access a site not known to the server (i.e. not on the white list 50, or any of the other lists 55, 65, 75, 85), the central server 40 dispatches the spider 60 to investigate the requested site. While this check is taking place, a warning message may be displayed to the user informing them that a site analysis is being performed and the results will be known in a short time.</p>
<p>Typically, assuming normal levels of network load, such an analysis will take less than a few seconds to perform. If the requested site is unobjectionable, the user will automatically be directed to the site and browse it as normal. If, however, the website is assessed to be objectionable, according to any one or more of the criteria which have been defined, then the user receives a message indicating that access to that particular website has been barred.</p>
<p>As a further level of assurance, it is possible to configure the system to include a degree of human intervention. If the spider is unable to determine the content of a website for certain, then it is possible to log, in an ambiguous site list 65, the URL of the website in question and schedule this for later human intervention, where a more accurate determination may be made.</p>
<p>For instance, depending upon the settings chosen for the server software, a website which contains nothing but images may be classified as objectionable if an extremely conservative approach is taken, or, it may be classified as suspect only, requiring human intervention to verify one way or another what the content of that website is.</p>
<p>If human intervention is used, the user receives a suitable warning notice informing them that the website is being checked manually and asking that they revisit that site at some point in the future to determine whether access has been allowed or not.</p>
<p>The extent to which human intervention is necessary or desirable depends largely upon the level of control required. If it is desired to err on the conservative side, then any site which is assessed by the spider and which appears suspect is simply not entered onto the white list and access is thus denied. If, however, a less conservative approach is required, in an attempt to prevent legitimate sites from being barred, then human intervention can be used.</p>
<p>In a further enhancement, embodiments of the invention may be configured to make use of certain data stored locally on the user's PC, which can allow access to known safe sites without recourse to the central server 40. This may be achieved by the provision of a local database 21 including one or more lists 22, 24 of sites which are deemed safe and unobjectionable according to one or more criteria.</p>
<p>In particular the database 21 may consist of two lists.</p>
<p>The first list 22, hereafter called the quicklist, may comprise a number of websites which are assessed and determined to be safe by the provider of the system, i.e. the party which provides the client program and operates the central server. Such a list may include popular websites which are known not to include any objectionable material, such as popular newspaper, entertainment, information and computing websites, to name only a few, which are generally safe and unobjectionable.</p>
<p>The second list 24 may include a personal cache of websites which the user has previously visited and which have either been deemed safe by the central server and/or the spider, or which the user has accessed by overriding the security settings. Each user registered with the client program (e.g. the master user and various other users, such as children) has a separate personal cache 24, so that sites which one user has deemed acceptable may not be visited by another user without first being assessed as described.</p>
<p>Figure 3 shows a flowchart illustrating the operation of the second embodiment. Similarly numbered operations shown in Figure 3 are identical with those shown in Figure 2 and so are not described in detail again.</p>
<p>In use, when a user wishes to access a website, a check of the URL is made 200, 210, firstly against the local database 21, including the quicklist 22 and the list of sites in the personal cache 24, to determine if the website to be visited is listed amongst these. If it is, then access is allowed without recourse to the central server 40 or spider 60, since these sites have previously been deemed safe. This arrangement greatly reduces the load on the central server 40 for cases where the user wishes to access websites which have already been visited and/or judged safe.</p>
<p>In the event that the requested URL is not included in either the quicklist 22 or the cache of personal sites 24, then the requested URL is transmitted to the server 40 for checking as described previously.</p>
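<p>The local-first check of the second embodiment might be sketched as follows; the list contents and the query_central_server() stub are illustrative assumptions standing in for Figures 2 and 3.</p>

```python
# Second embodiment (Figure 3): consult the local quicklist and the per-user
# personal cache before involving the central server.
QUICKLIST = {"http://news.example.org", "http://weather.example.org"}
PERSONAL_CACHE = {"alice": {"http://recipes.example.org"}, "bob": set()}

def query_central_server(url: str) -> bool:
    """Placeholder for the remote white-list / spider check described earlier."""
    return False

def allow_locally_or_ask_server(user: str, url: str) -> bool:
    if url in QUICKLIST or url in PERSONAL_CACHE.get(user, set()):
        return True                        # no round trip to the server needed
    allowed = query_central_server(url)
    if allowed:
        PERSONAL_CACHE.setdefault(user, set()).add(url)   # remember for next time
    return allowed
```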
<p>Most users tend to regularly visit a relatively small number of websites and by use of the local database 21, in most cases, access to the central server's white list 50 will not be required, thereby reducing the load on the server and also increasing the Internet access speed enjoyed by the user.</p>
<p>An unauthorised user may be tempted to hack into the local database 21 in an attempt to manipulate the lists stored therein, so as to gain access to offensive sites. The local database is therefore suitably encrypted using known encryption techniques so that it is protected against such hacking attempts. The encryption is managed by the client program 20.</p>
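<p>The patent does not name an encryption algorithm; one possible approach, sketched below under that assumption, is to keep the local list database on disk as a symmetrically encrypted blob using the third-party Python `cryptography` package, with the key held by the client program rather than the user.</p>

```python
# Illustrative encryption of the local list database (algorithm choice assumed).
import json
from cryptography.fernet import Fernet

def save_lists(path: str, key: bytes, quicklist: set, cache: dict) -> None:
    payload = json.dumps({"quicklist": sorted(quicklist),
                          "cache": {u: sorted(s) for u, s in cache.items()}}).encode()
    with open(path, "wb") as f:
        f.write(Fernet(key).encrypt(payload))

def load_lists(path: str, key: bytes):
    with open(path, "rb") as f:
        data = json.loads(Fernet(key).decrypt(f.read()))
    return set(data["quicklist"]), {u: set(v) for u, v in data["cache"].items()}

# key = Fernet.generate_key()   # held by the client program 20, not by the user
```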
<p>In addition to the dynamic white list which has been described so far, information on websites which may not be visited at all may be optionally contained in a separate black list which is kept alongside the dynamic white list.</p>
<p>For instance, if a particular domain name is known to include pornographic or offensive material, then this can be added to the black list, meaning that it can never be visited by a user not having the appropriate level of security clearance.</p>
<p>The spider program (also known as a robot or a crawler) which is used to visit and assess the content of requested websites is also configured to randomly roam the web looking for new websites which can then be assessed on a continuous basis, and used to update the white list 50. It is arranged to operate in this way at times when it is not specifically tasked with assessing a user's requested URL.</p>
<p>In this way, the white list may be updated continuously.</p>
<p>Alternatively, a plurality of spiders may be provided, each designated for a particular task i.e. one or more may be randomly trawling and another one or more may be arranged to respond to a user's request.</p>
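<p>One way to share spider capacity between user-triggered checks and continuous random roaming is sketched below, with user requests given priority; the worker count, URLs and stubs are illustrative assumptions only.</p>

```python
# Illustrative scheduling of a plurality of spiders via a priority queue.
import queue
import threading

USER_REQUEST, BACKGROUND = 0, 1            # lower number = more urgent

def assess(url: str) -> str:
    """Placeholder for the spider's content assessment."""
    return "safe"

def record(url: str, verdict: str) -> None:
    """Placeholder: update the white list, black list or ambiguous list."""

tasks: "queue.PriorityQueue[tuple[int, str]]" = queue.PriorityQueue()

def spider_worker() -> None:
    while True:
        _, url = tasks.get()
        record(url, assess(url))
        tasks.task_done()

for _ in range(4):                         # several spiders, as described
    threading.Thread(target=spider_worker, daemon=True).start()

tasks.put((USER_REQUEST, "http://unknown.example.org"))    # a user's requested URL
tasks.put((BACKGROUND, "http://randomly-found.example"))   # background trawling
tasks.join()                               # wait until both have been assessed
```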
<p>The granularity of the white-list can be adjusted according to a particular user's needs. This means that if a sub-domain is found to contain offensive material, then either the entire corresponding domain or only the offending sub-domain can be denied entry to the white list. For instance, if the sub-domain support.xyz.com is found to contain offensive material, then either only that sub-domain (support.xyz.com) or the entire domain (www.xyz.com) can be omitted from the white list.</p>
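<p>A minimal sketch of such host matching, following the support.xyz.com example above, is given below; the blocked-entry set is an illustrative assumption.</p>

```python
# Granularity of listing: an entry may bar just a sub-domain or a whole domain.
BLOCKED = {"support.xyz.com"}              # or "xyz.com" to bar the entire domain

def host_blocked(host: str) -> bool:
    host = host.lower().rstrip(".")
    parts = host.split(".")
    # check the host itself and every parent domain (support.xyz.com, xyz.com, com)
    return any(".".join(parts[i:]) in BLOCKED for i in range(len(parts)))

assert host_blocked("support.xyz.com") is True
assert host_blocked("www.xyz.com") is False    # only the sub-domain is barred here
```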
<p>The present invention can be implemented as computer-readable code on a computer-readable recording medium.</p>
<p>The computer-readable recording medium is any data storage device that can store data which can be thereafter read by a computer system. Examples of the computer-readable recording medium include read-only memory (ROM), random-access memory (RAM), CD-ROMs, magnetic tapes, floppy disks, optical data storage devices, and carrier waves (such as data transmission through the Internet).</p>
<p>The computer-readable recording medium can also be distributed over network-coupled computer systems so that the computer-readable code is stored and executed in a distributed fashion.</p>
<p>In order to operate the client program on a PC, the user must install it and configure the program with suitable information identifying the master-user and any other users who are to use the PC. Different levels of access privileges may be assigned to each user, and each user will be associated with a different personal cache.</p>
<p>Periodically, the client program may receive software updates from the server 40. These may include updates of the operational code, the quicklist URLs or other program modules.</p>
<p>An ongoing subscription may be payable by the user in order to make use of the service.</p>
<p>Attention is directed to all papers and documents which are filed concurrently with or previous to this specification in connection with this application and which are open to public inspection with this specification, and the contents of all such papers and documents are incorporated herein by reference.</p>
<p>All of the features disclosed in this specification (including any accompanying claims, abstract and drawings), and/or all of the steps of any method or process so disclosed, may be combined in any combination, except combinations where at least some of such features and/or steps are mutually exclusive.</p>
<p>Each feature disclosed in this specification (including any accompanying claims, abstract and drawings) may be replaced by alternative features serving the same, equivalent or similar purpose, unless expressly stated otherwise. Thus, unless expressly stated otherwise, each feature disclosed is one example only of a generic series of equivalent or similar features.</p>
<p>The invention is not restricted to the details of the foregoing embodiment(s). The invention extends to any novel one, or any novel combination, of the features disclosed in this specification (including any accompanying claims, abstract and drawings), or to any novel one, or any novel combination, of the steps of any method or process so disclosed.</p>

Claims (1)

  1. <p>CLAIMS</p>
    <p>1. A method of controlling access to a remote Internet site, defined by a URL, comprising the steps of: intercepting, using a local proxy server, a user's request to visit a particular site; transmitting details of the requested URL to a remote server to compare the URL with a white-list comprising details of URLs, which have previously been assessed and found to be safe; if the requested URL is present in the white-list, allowing the local proxy server to access the requested URL; or if the requested URL is not present in the white-list, directing a web-spider to visit the site and assess its content according to one or more predefined criteria, wherein if the content is assessed to be safe, the local proxy server is allowed to access the requested URL.</p>
    <p>2. A method as claimed in claim 1 further comprising the step of comparing the requested URL with a black list of sites to which access is barred if the requested URL is not present in the white list.</p>
    <p>3. A method as claimed in claim 1 or 2 wherein if the assessment reveals that the content of the requested site is safe, then the URL of the requested site is added to the white list.</p>
    <p>4. A method as claimed in claim 1 or 2 wherein if the assessment is that the content is not safe, then the user receives an access-denied message.</p>
    <p>5. A method as claimed in claim 4 wherein the access-denied message is delivered by the user being directed to a webpage.</p>
    <p>6. A method as claimed in any preceding claim wherein the predefined criteria by which the content of the requested site is assessed includes one or more of: the language of the content; the presence of only image and/or video files; the presence of offensive words or phrases, as defined in a database; the presence of offensive words or phrases, as defined in a database, in metadata; a link to or from the requested site to a site known to contain offensive material; and the presence of certain predefined terms, indicative of pornographic material.</p>
    <p>7. A method as claimed in claim 6 wherein the database of offensive words or phrases is available to the server or the spider.</p>
    <p>8. A method as claimed in any preceding claim wherein different users are assigned different access levels, allowing suitably authorised users to view content not available to users having a different access level.</p>
    <p>9. A method as claimed in any preceding claim wherein, before details of the requested URL are transmitted to the remote server, the requested URL is compared with entries in a local database of URLs to which access is allowed, such that details of the requested URL are only transmitted to the remote server if the requested URL is not present in the local database.</p>
    <p>10. A method as claimed in claim 9 wherein the local database comprises a list of URLs which have previously been visited by the user.</p>
    <p>11. A method as claimed in claim 10 wherein the local database comprises a further list of URLs which have been provided by a third party.</p>
    <p>12. A method as claimed in any of claims 9 to 11 wherein the local database is encrypted.</p>
    <p>13. A computer-readable storage medium having software resident thereon in the form of computer readable code executable by a computer for performing the method of any preceding claim.</p>
    <p>14. A system comprising a server and a web-spider, the system being operable to: receive details of a requested URL from a user's personal computer; compare the requested URL with a database of known safe URLs; transmit a message to the personal computer, allowing access to the requested URL, if the requested URL is in the database; or direct the web-spider to assess the content of the site at the requested URL, using one or more predefined criteria, if it is not in the database.</p>
    <p>15. A system as claimed in claim 14 further operable to compare the requested URL with a black list of sites to which access is barred if the requested URL is not present in the white list.</p>
    <p>16. A system as claimed in claim 14 wherein the server is operable to transmit a message to disallow access to the requested URL if the assessment of the requested URL reveals the presence of offensive material.</p>
    <p>17. A system as claimed in any of claims 14 to 16 further comprising a personal computer arranged to comprise a local database of known safe URLs to which access is allowed without recourse to the server or web-spider, these only being used if the requested URL is not in the local database.</p>
    <p>18. A system as claimed in claim 17 wherein the local database comprises first and second lists of URLs which have been previously visited by a particular user and URLs which have been provided by a third party, respectively.</p>
    <p>19. A method as hereinbefore described and having particular reference to the accompanying drawings.</p>
    <p>20. A system as hereinbefore described and having particular reference to the accompanying drawings.</p>
GB0617113A 2006-08-31 2006-08-31 Filtering access to internet content Withdrawn GB2441350A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
GB0617113A GB2441350A (en) 2006-08-31 2006-08-31 Filtering access to internet content

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
GB0617113A GB2441350A (en) 2006-08-31 2006-08-31 Filtering access to internet content

Publications (2)

Publication Number Publication Date
GB0617113D0 GB0617113D0 (en) 2006-10-11
GB2441350A true GB2441350A (en) 2008-03-05

Family

ID=37137070

Family Applications (1)

Application Number Title Priority Date Filing Date
GB0617113A Withdrawn GB2441350A (en) 2006-08-31 2006-08-31 Filtering access to internet content

Country Status (1)

Country Link
GB (1) GB2441350A (en)

Cited By (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2010121542A1 (en) * 2009-04-22 2010-10-28 中兴通讯股份有限公司 Home gateway-based anti-virus method and device thereof
EP2387747A1 (en) * 2009-01-16 2011-11-23 Devicescape Software, INC. Systems and methods for enhanced smartclient support
US20120317233A1 (en) * 2011-06-13 2012-12-13 International Business Machines Corporation Mobile web app infrastructure
US8549588B2 (en) 2006-09-06 2013-10-01 Devicescape Software, Inc. Systems and methods for obtaining network access
US8554830B2 (en) 2006-09-06 2013-10-08 Devicescape Software, Inc. Systems and methods for wireless network selection
US8667596B2 (en) 2006-09-06 2014-03-04 Devicescape Software, Inc. Systems and methods for network curation
GB2508235A (en) * 2012-11-27 2014-05-28 Ibm Software asset management using browser plug-in
US8743778B2 (en) 2006-09-06 2014-06-03 Devicescape Software, Inc. Systems and methods for obtaining network credentials
GB2509766A (en) * 2013-01-14 2014-07-16 Wonga Technology Ltd Website analysis
US20150222649A1 (en) * 2012-10-17 2015-08-06 Fansheng ZENG Method and apparatus for processing a webpage
US9326138B2 (en) 2006-09-06 2016-04-26 Devicescape Software, Inc. Systems and methods for determining location over a network

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110278271B (en) * 2019-06-24 2022-04-12 厦门美图之家科技有限公司 Network request control method and device and terminal equipment
CN111800390A (en) * 2020-06-12 2020-10-20 深信服科技股份有限公司 Abnormal access detection method, device, gateway equipment and storage medium

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO1998028690A1 (en) * 1996-12-20 1998-07-02 Livingston Enterprises, Inc. Network access control system and process
WO2001063835A1 (en) * 2000-02-21 2001-08-30 Clicksafe.Com Llc System and method for identifying and blocking pornographic and other web content on the internet
WO2001098947A1 (en) * 2000-06-16 2001-12-27 N2H2, Inc. Method and system for filtering content available through a distributed computing network
EP1318468A2 (en) * 2001-12-07 2003-06-11 Websense Inc. System and method for an internet filter
US20040073604A1 (en) * 2002-10-11 2004-04-15 Kazuhiro Moriya Cache control method of proxy server with white list
GB2403830A (en) * 2002-02-28 2005-01-12 David Wigley Method, system and software product for restricting access to network accessible digital information
US20050144297A1 (en) * 2003-12-30 2005-06-30 Kidsnet, Inc. Method and apparatus for providing content access controls to access the internet
EP1638016A1 (en) * 2004-09-15 2006-03-22 PCSafe Inc. Methods and systems for filtering URLs, webpages, and content

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO1998028690A1 (en) * 1996-12-20 1998-07-02 Livingston Enterprises, Inc. Network access control system and process
WO2001063835A1 (en) * 2000-02-21 2001-08-30 Clicksafe.Com Llc System and method for identifying and blocking pornographic and other web content on the internet
WO2001098947A1 (en) * 2000-06-16 2001-12-27 N2H2, Inc. Method and system for filtering content available through a distributed computing network
EP1318468A2 (en) * 2001-12-07 2003-06-11 Websense Inc. System and method for an internet filter
GB2403830A (en) * 2002-02-28 2005-01-12 David Wigley Method, system and software product for restricting access to network accessible digital information
US20040073604A1 (en) * 2002-10-11 2004-04-15 Kazuhiro Moriya Cache control method of proxy server with white list
US20050144297A1 (en) * 2003-12-30 2005-06-30 Kidsnet, Inc. Method and apparatus for providing content access controls to access the internet
EP1638016A1 (en) * 2004-09-15 2006-03-22 PCSafe Inc. Methods and systems for filtering URLs, webpages, and content

Cited By (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8549588B2 (en) 2006-09-06 2013-10-01 Devicescape Software, Inc. Systems and methods for obtaining network access
US8554830B2 (en) 2006-09-06 2013-10-08 Devicescape Software, Inc. Systems and methods for wireless network selection
US8667596B2 (en) 2006-09-06 2014-03-04 Devicescape Software, Inc. Systems and methods for network curation
US9913303B2 (en) 2006-09-06 2018-03-06 Devicescape Software, Inc. Systems and methods for network curation
US8743778B2 (en) 2006-09-06 2014-06-03 Devicescape Software, Inc. Systems and methods for obtaining network credentials
US9326138B2 (en) 2006-09-06 2016-04-26 Devicescape Software, Inc. Systems and methods for determining location over a network
EP2387747A1 (en) * 2009-01-16 2011-11-23 Devicescape Software, INC. Systems and methods for enhanced smartclient support
EP2387747A4 (en) * 2009-01-16 2013-06-12 Devicescape Software Inc Systems and methods for enhanced smartclient support
WO2010121542A1 (en) * 2009-04-22 2010-10-28 中兴通讯股份有限公司 Home gateway-based anti-virus method and device thereof
CN101527721B (en) * 2009-04-22 2012-09-05 中兴通讯股份有限公司 Anti-virus method on the basis of household gateway and device thereof
US20120317233A1 (en) * 2011-06-13 2012-12-13 International Business Machines Corporation Mobile web app infrastructure
US9077770B2 (en) * 2011-06-13 2015-07-07 International Business Machines Corporation Mobile web app infrastructure
US20150222649A1 (en) * 2012-10-17 2015-08-06 Fansheng ZENG Method and apparatus for processing a webpage
US9348923B2 (en) 2012-11-27 2016-05-24 International Business Machines Corporation Software asset management using a browser plug-in
GB2508235A (en) * 2012-11-27 2014-05-28 Ibm Software asset management using browser plug-in
GB2509766A (en) * 2013-01-14 2014-07-16 Wonga Technology Ltd Website analysis

Also Published As

Publication number Publication date
GB0617113D0 (en) 2006-10-11

Similar Documents

Publication Publication Date Title
GB2441350A (en) Filtering access to internet content
US11258785B2 (en) User login credential warning system
RU2336561C2 (en) Content filtering in process of web-viewing
US9900346B2 (en) Identification of and countermeasures against forged websites
US9654492B2 (en) Malware detection system based on stored data
US6959420B1 (en) Method and system for protecting internet users' privacy by evaluating web site platform for privacy preferences policy
JP6155521B2 (en) Detect and prevent illegal purchases of content on the Internet
US8443452B2 (en) URL filtering based on user browser history
US8745733B2 (en) Web content ratings
US20080235239A1 (en) Pre-fetching web proxy
US20130276061A1 (en) System, method, and computer program product for preventing access to data with respect to a data access attempt associated with a remote data sharing session
US20070006321A1 (en) Methods and apparatus for implementing context-dependent file security
Greenfield et al. Effectiveness of Internet filtering software products
Narayanan Adult content filtering: Restricting minor audience from accessing inappropriate internet content
Shukla et al. Web browsing and spyware intrusion
US10104116B2 (en) System for detecting link spam, a method, and an associated computer readable medium
Ryngaert Extraterritorial Enforcement Jurisdiction in Cyberspace: Normative Shifts
AU2014202431A1 (en) Acces control system
Farmer The Spector of Crypto-anarchy: Regulating Anonymity-Protecting Peer-To-Peer Networks
US20230328068A1 (en) Content sharing in an enterprise digital space
Kissel License to blog: Internet regulation in the People's Republic of China
Jerkovic et al. Vulnerability Analysis of most Popular Open Source Content Management Systems with Focus on WordPress and Proposed Integration of Artificial Intelligence Cyber Security Features.
US8434154B1 (en) Method and apparatus for distributing content across platforms in a regulated manner
Initiative Internet filtering in Vietnam in 2005-2006: A country study
Ruddock et al. Widening the lens on content moderation

Legal Events

Date Code Title Description
WAP Application withdrawn, taken to be withdrawn or refused ** after publication under section 16(1)