GB2506605A - Identifying computer file based security threats by analysis of communication requests from files to recognise requests sent to untrustworthy domains - Google Patents
- Publication number
- GB2506605A (application GB1217613.7A)
- Authority
- GB
- United Kingdom
- Prior art keywords
- url
- reputation
- source computer
- file
- computer file
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Withdrawn
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F21/00—Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
- G06F21/50—Monitoring users, programs or devices to maintain the integrity of platforms, e.g. of processors, firmware or operating systems
- G06F21/51—Monitoring users, programs or devices to maintain the integrity of platforms, e.g. of processors, firmware or operating systems at application loading time, e.g. accepting, rejecting, starting or inhibiting executable software based on integrity or source reliability
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F21/00—Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
- G06F21/50—Monitoring users, programs or devices to maintain the integrity of platforms, e.g. of processors, firmware or operating systems
- G06F21/55—Detecting local intrusion or implementing counter-measures
- G06F21/56—Computer malware detection or handling, e.g. anti-virus arrangements
- G06F21/566—Dynamic detection, i.e. detection performed at run-time, e.g. emulation, suspicious activities
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F21/00—Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
- G06F21/50—Monitoring users, programs or devices to maintain the integrity of platforms, e.g. of processors, firmware or operating systems
- G06F21/57—Certifying or maintaining trusted computer platforms, e.g. secure boots or power-downs, version controls, system software checks, secure updates or assessing vulnerabilities
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2221/00—Indexing scheme relating to security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
- G06F2221/21—Indexing scheme relating to G06F21/00 and subgroups addressing additional information or applications relating to security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
- G06F2221/2115—Third party
Landscapes
- Engineering & Computer Science (AREA)
- Computer Security & Cryptography (AREA)
- Software Systems (AREA)
- Theoretical Computer Science (AREA)
- Computer Hardware Design (AREA)
- General Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Health & Medical Sciences (AREA)
- General Health & Medical Sciences (AREA)
- Virology (AREA)
- Computer And Data Communications (AREA)
Abstract
A method or apparatus which detects an access request from a source computer file to a target uniform resource locator (URL), obtains reputation data for the target URL, identifies the file requesting the URL and, based on the reputation of the URL, takes further action on the file. If the reputation of the URL suggests that the web page or domain being accessed is dangerous or malicious, the file may relate to a virus application, malware, a bot or spyware. The further action may therefore be controlling access of the file to the URL, controlling communication of the file, closing applications triggered by the file, displaying a warning message, or deleting or quarantining the file. The URL may be determined by accessing a packet header. The file may be identified on the basis of a socket connection by monitoring network connections or hooking socket API functionalities. The file may be given a rating in a reputation rating database, the rating indicating whether the URL is trustworthy, suspicious or unknown.
Description
SECURITY METHOD AND APPARATUS
TECHNICAL FIELD
[0001] The exemplary and non-limiting embodiments of the present application relate generally to methods, apparatuses and computer programs and, more specifically, to the field of detecting computer-file-based security threats.
BACKGROUND
[0002] Computer files that are known to be malicious are typically blocked and removed from client computer systems by different security applications to prevent them from causing harm.
[0003] However, this is not the case with unknown computer files that in reality are malware or present a security risk. For example, if an unknown computer file accesses a known, malicious uniform resource locator (URL), a security application can block access to the URL but the actual computer file may remain unaffected in the client computer system where it can continue functioning. It is becoming more and more important to prevent malicious computer files from performing other malicious activities, not only on the computer where they are found but also on other computers in the ecosystem that are protected by an antivirus product.
[0004] Therefore, more effective ways to manage computer file-based security threats are needed.
SUMMARY
[0005] The present invention is defined by the independent claims.
[0006] Embodiments of the invention are defined in the dependent claims.
BRIEF DESCRIPTION OF THE DRAWINGS
[0007] The accompanying drawings provide a more complete understanding of the embodiments of the present invention according to the following descriptions:
[0008] FIGURE 1 shows a simplified block diagram that illustrates an example of apparatuses according to the invention;
[0009] FIGURE 2 shows an example of a method; and
[0010] FIGURE 3 is a signal sequence diagram showing an example according to an embodiment of the present invention.
DETAILED DESCRIPTION OF THE DRAWINGS
[0011] Example embodiments of the present invention are later described in more detail with reference to the accompanying drawings, in which some embodiments of the invention are shown. The invention may be embodied in many different forms and should not be construed as limited to the embodiments presented here. Although the specification may refer to "an", "one", or "some" embodiment in several locations, this does not necessarily mean that each such reference is to the same embodiment, or that the feature only applies to a single embodiment. Single features of different embodiments may also be combined to provide other embodiments.
[0012] The present invention can apply to any terminal, server, corresponding component, or to any communication system or combination of different communications systems that support the required functionality. Due to the rapid development of the specifications of computer systems and protocols that are used, all words and expressions should be interpreted broadly; they are intended only to explain the embodiment.
[0013] The following description relates to the categorization of web page content.
This may apply to any type of content, such as data, data blocks, complete files or portions of files, links and cache links.
[0014] Figure 1 illustrates a general example of apparatuses in which the embodiments of the invention may be applied. It only shows the elements and functional entities that are required for understanding the arrangement according to an embodiment of the invention. Other components have been omitted for the sake of simplicity. The implementation of the elements and functional entities may vary from that shown in Figure 1. The connections shown in Figure 1 are logical connections, and the actual physical connections may be different. It is apparent to a person skilled in the field that the arrangement may also comprise other functions and structures.
[0015] Figure 1 shows an example of a computer system 1 that is suitable for implementing the methods that are described below. The computer system 1 can be implemented as a combination of computer hardware and software. The computer system 1 comprises a memory 2, a processor 3 and a transceiver 4. The memory 2 stores the various programs or executable files that are implemented by the processor 3, and provides a computer system memory 5 that stores any data required by the computer system 1.
[0016] The programs or executable files that are stored in the memory 2, and implemented by the processor 3, can include an operating system unit 6. The memory 2 also provides a memory 10 that is used by a detection unit 8, an analyzing unit 7 and a reputation unit 9. The detection unit 8, analyzing unit 7 and reputation unit 9 can be sub-units of a security control unit 11, for example. The transceiver 4 is used to communicate over a network 12 such as a LAN, the Internet, or a mobile broadband network. Typically, the computer system 1 may be a personal computer (PC), laptop, tablet computer, or mobile phone, or any other suitable device.
[0017] The example of Figure 1 also shows a server system 16 that may communicate with the computer system 1 and other client terminals. The server system 16 may comprise a transceiver 22 and a processor 26. The server system 16 may also comprise a database 24, an analyzing unit 20 and a reputation unit 18. The server system 16 may belong to, for example, an Internet service provider, a wireless service operator, a mobile network operator, or a security service provider.
[0018] It should be noted that the computer system 1 and the server system 16 are only examples of apparatuses or systems, and that they may have more or fewer components than shown, may combine two or more components, or may have a different configuration or arrangement of the components. For example, in some embodiments the security control unit 11 forms a part of the server system 16.
[0019] The reputation unit 18 of the server system 16 and the reputation unit 9 of the computer system may comprise separate file-reputation (FRS) and network-reputation (NRS) systems that manage file and URL information, respectively, from different internal and external sources. This may also comprise a Web Traffic Scanner which analyzes URLs and their corresponding web pages and applies a rating to them. The analyzing unit 20 of the server system 16 may use different methods to identify malicious behavior for NRS and FRS. In an embodiment, both systems can be used to supplement one another to detect security threats more efficiently.
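As a purely illustrative sketch of how the two systems might supplement one another (the dictionary-backed lookups and verdict labels below are assumptions made for the example, not part of the disclosure), a file with no file-reputation verdict can fall back to the network-reputation verdict of the domain it contacts:

```python
# Hypothetical illustration of FRS and NRS supplementing one another: an
# unknown file hash falls back to the URL/domain verdict of the network-
# reputation system. The dictionaries stand in for the real backend services.

FRS = {"a1b2c3": "clean"}              # file hash -> file-reputation verdict
NRS = {"evil.example": "malicious"}    # domain    -> network-reputation verdict

def combined_verdict(file_hash: str, contacted_domain: str) -> str:
    file_verdict = FRS.get(file_hash)
    if file_verdict is not None:
        return file_verdict
    # File is unknown to FRS: let the reputation of the contacted domain decide.
    return NRS.get(contacted_domain, "unknown")

print(combined_verdict("deadbeef", "evil.example"))  # -> malicious
```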
[0020] Figure 2 is a flow diagram that shows an example of a process.
[0021] The method starts in 200.
[0022] In 202, an access request from a computer file to a uniform resource locator (URL) is detected.
[0023] In 204, reputation data of the URL that was requested is obtained from a remote or local reputation server or reputation scanner.
[0024] In 206, the obtained reputation data is used to determine whether the URL can be trusted.
[0025] If the URL is determined to be trusted, then 208 is entered where the computer file is allowed to access the URL.
[0026] In 210, if the URL is determined to be untrusted or malicious, then 212 is entered where access to the URL is blocked.
[0027] In 214, the source computer file that requests access to the target URL is identified.
[0028] In 216, on the basis of the received reputation data that relates to the target URL, further action on the source computer file is taken.
[0029] In 218, if the URL is determined to be unknown, then 220 is entered where access to the URL is allowed or blocked depending on configured settings, such as predetermined user settings. In an embodiment, the user can choose to block or allow all unknown sites by configuring the user settings of a security product. Blocking all unknown sites may also be disabled or enabled by default.
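By way of illustration only, a minimal Python sketch of the decision flow of Figure 2 (steps 202-220) is given below; the function names, verdict labels and the block_unknown setting are assumptions made for the example rather than features taken from the disclosure:

```python
# Illustrative sketch of the Figure 2 decision flow (steps 202-220).
# Helper names and the "block_unknown" setting are hypothetical.

TRUSTED, UNTRUSTED, UNKNOWN = "trusted", "untrusted", "unknown"

def handle_access_request(source_file: str, target_url: str,
                          reputation_lookup, settings: dict) -> str:
    # 202 has already happened: an access request was detected.
    # 204: obtain reputation data for the requested URL.
    verdict = reputation_lookup(target_url)

    # 206/208: a trusted URL -> the computer file is allowed to access it.
    if verdict == TRUSTED:
        return "allow"

    # 218/220: an unknown URL -> allow or block depending on configured settings.
    if verdict == UNKNOWN:
        return "block" if settings.get("block_unknown", False) else "allow"

    # 210/212: an untrusted URL -> block access.
    # 214/216: identify the source computer file and take further action on it.
    print(f"Blocking {target_url}; flagging source file {source_file} for further action")
    return "block"

# Example with a stubbed reputation lookup:
lookup = lambda url: UNTRUSTED if "malicious" in url else UNKNOWN
print(handle_access_request("C:/temp/dropper.exe", "http://malicious.example/c2",
                            lookup, {"block_unknown": True}))   # -> block
```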
[0030] Figure 3 is a signal sequence diagram that shows another example of the process. At 300, the reputation server maintains a database of URL reputation information.
[0031] In 302, it is possible that an update of the reputation data that is maintained in the reputation server is regularly sent to the client computer. The reputation data may be related to URL reputation information, computer file reputation data and any other reputation information that is maintained on the reputation server.
[0032] In 304, the client computer detects an access request to a target URL. The source computer file may connect to a URL via either HTTP GET/POST or other protocols. In an embodiment, the target URL may be determined by accessing a packet header included in a packet request issued by the source computer file when requesting access to the target URL.
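For a plain HTTP request, the idea of reading the target from the packet's header can be sketched as follows; this is only an approximation under the assumption of unencrypted HTTP traffic, and the helper below is not the implementation described by the embodiment:

```python
# Hedged sketch: reconstruct the target URL from a raw HTTP request header.
# Encrypted (HTTPS) traffic and real packet interception are more involved.

def url_from_http_request(raw_request: bytes) -> str:
    head = raw_request.split(b"\r\n\r\n", 1)[0].decode("latin-1")
    lines = head.split("\r\n")
    _method, path, _version = lines[0].split(" ", 2)   # e.g. "GET /page HTTP/1.1"
    host = ""
    for line in lines[1:]:
        name, _, value = line.partition(":")
        if name.strip().lower() == "host":
            host = value.strip()
            break
    return f"http://{host}{path}"

request = b"GET /login.php HTTP/1.1\r\nHost: suspicious.example\r\nUser-Agent: x\r\n\r\n"
print(url_from_http_request(request))   # -> http://suspicious.example/login.php
```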
[0033] In 306, the client computer requests reputation data for the target URL from the reputation server. In an embodiment, the client computer may first check whether it has already received the reputation data for the target URL, for example, through regular reputation data updates from the reputation server. In that case, the reputation data may be stored in the memory of the client computer already.
[0034] In 308, the requested reputation data of the target URL is received from the reputation server. In an embodiment, the obtained reputation data comprises an indication of whether the URL is trusted, suspicious or unknown.
[0035] In 310, the client computer may analyze or calculate the received reputation data. In an embodiment, the client computer may, for example, calculate reputation scores or determine different thresholds on the basis of the received reputation data. When analyzing the reputation data that is received, the client computer may also use other information that is available from different sources. It is also possible that the received reputation data can be used as it is just to determine whether the computer file can be allowed access or not. The analysis or calculation may also be used to determine a reputation score for the source computer file requesting access to the target URL. The reputation score that is given to the source computer file may also correspond to the reputation data of the target URL.
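One purely hypothetical form such a calculation could take is shown below; the numeric weights, the prevalence discount and the threshold are invented for the example, since the embodiment does not fix any particular scoring formula:

```python
# Hypothetical scoring sketch: derive a score for the source file from the
# reputation data of the target URL plus an optional local signal. All
# numbers below are example values, not values from the disclosure.

URL_SCORES = {"trusted": 0, "unknown": 40, "suspicious": 70, "malicious": 100}
IDENTIFY_THRESHOLD = 60   # above this, identify the source file (step 312)

def file_reputation_score(url_verdict: str, widely_seen: bool) -> int:
    score = URL_SCORES.get(url_verdict, 40)
    if widely_seen:
        # Widely installed files are less likely to be malicious; discount them.
        score = max(0, score - 30)
    return score

print(file_reputation_score("suspicious", widely_seen=False) >= IDENTIFY_THRESHOLD)  # True
```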
[0036] In 312, if the analyzed reputation data implies that the source computer file is suspicious or cannot be trusted, the source computer file is identified. If the result of the analysis shows that the target URL is safe or some predetermined threshold has not been reached, the source computer file need not be identified. In practice, it does not matter whether the computer file is a downloader, an information thief or a bot; the moment it tries to access a URL, it can be detected. In an embodiment, a network interception framework that blocks any unwanted connections can be used to detect and block the computer file that made the malicious query. In an embodiment, the computer file that accesses the suspect URL may also be identified by other functionalities that can, for example, hook certain system calls including socket connections. An example of such functionality is DeepGuard. This can also be performed by existing functionalities that monitor network connections or hook socket API (application programming interface) functionalities. The detected file that tried to connect to the suspect URL is then identified and further action is performed.
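As a rough user-space approximation of the "monitor network connections" approach (kernel-level hooking and products such as DeepGuard work differently), the third-party psutil package can map an observed remote endpoint back to the executable that owns the socket; the snippet below is only an illustrative sketch:

```python
# Sketch: map a remote address seen in an outbound connection back to the
# executable file that opened the socket. Requires the psutil package and
# usually elevated privileges; hooking socket APIs (as in the text) is a
# different, lower-level technique.
import psutil

def files_connecting_to(remote_ip: str, remote_port: int = None):
    owners = set()
    for conn in psutil.net_connections(kind="inet"):
        if not conn.raddr or conn.pid is None:
            continue
        if conn.raddr.ip != remote_ip:
            continue
        if remote_port is not None and conn.raddr.port != remote_port:
            continue
        try:
            owners.add(psutil.Process(conn.pid).exe())   # path of the source file
        except (psutil.NoSuchProcess, psutil.AccessDenied):
            pass
    return owners

# Example: which executables currently have a connection to 203.0.113.7:80?
print(files_connecting_to("203.0.113.7", 80))
```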
[0037] In 314, the source computer file is rated according to a reputation rating that can be based on the received reputation data of the target URL.
[0038] In 316, a reputation database of the client computer is updated with the reputation rating given to the source computer file and further action is taken based on the reputation rating. If the client computer determines that the source computer file cannot be trusted based on the reputation rating, the file may be disabled, isolated, quarantined or deleted. Depending on the settings of the device, the user may not even be able to access the suspect URL, or an indication may be shown on the display of the device that warns the user about the URL, the source computer file or both. In an embodiment, prevalence logic can be used when calculating the reputation rating for the computer file to reduce the risk of false positives. The prevalence logic applies the action only to rare, unknown files based on the reputation server data.
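A minimal sketch of such a prevalence gate is shown below, assuming the reputation server can report how many installations have seen a given file hash; the cut-off value is an invented example:

```python
# Hypothetical prevalence gate for paragraph [0038]: act only on rare, unknown
# files. 'prevalence_lookup' is assumed to return how many installations have
# reported the file hash; the cut-off of 50 is an example value only.

RARE_FILE_CUTOFF = 50

def should_take_action(file_hash: str, url_is_malicious: bool, prevalence_lookup) -> bool:
    if not url_is_malicious:
        return False
    seen_on = prevalence_lookup(file_hash)
    # Common, widely installed files are more likely false positives; skip them.
    return seen_on < RARE_FILE_CUTOFF

# Example with a stubbed lookup reporting the hash was seen on 3 installations:
print(should_take_action("3f2a9b", True, lambda h: 3))   # -> True
```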
[0039] In an embodiment, the source computer file's access to the URL may be allowed or blocked based on the reputation rating. In another embodiment, communication or operation of the source computer file may be allowed or blocked depending on the reputation rating. It is also possible that applications that are triggered by the source computer file are closed or monitored based on the reputation rating, or a warning message is displayed on the user interface of the client computer. It is also possible to allow only a specific type of communication with the source computer file or to delete or quarantine the source computer file depending on the reputation rating of the source computer file.
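These alternatives could, for instance, be driven by a simple rating-to-action table; the rating labels and actions below are illustrative assumptions, not a definitive mapping from the embodiment:

```python
# Illustrative mapping from the reputation rating of the source computer file
# to one of the further actions listed above. Labels and actions are examples.

def further_action(rating: str) -> str:
    actions = {
        "trusted":    "allow access and communication",
        "unknown":    "monitor triggered applications and warn the user",
        "suspicious": "allow only specific types of communication",
        "malicious":  "quarantine or delete the source computer file",
    }
    return actions.get(rating, "block access to the URL and warn the user")

print(further_action("suspicious"))   # -> allow only specific types of communication
```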
[0040] In 318, the reputation rating of the source computer file may also be sent to the reputation server.
[0041] In 320, the reputation server may use the received reputation rating of the source computer file to update its own databases.
[0042] Without limiting the scope, interpretation, or application of the claims appearing below, the technical effects of one or more of the example embodiments disclosed here enable controlling computer-file-based security threats in an efficient manner. The embodiments of the invention enable the identifying of unknown computer files that are trying to access known malicious URLs. By using URL reputation as a basis for determining a reputation for a computer file, the embodiments can block unwanted or malicious computer files even if those computer files are not yet known as malicious by the computer system, a security application or a reputation server. Further, embodiments of the invention enable informing the reputation server of the suspicious behavior of a particular file so that the logic behind the reputation server can decide whether or not to update the reputation server with this new information.
[0043] Malicious files do access safe URLs, but a safe computer file should not access malicious URLs unless the file or the URL has been compromised, which effectively makes the computer file no longer safe. Most malware connects to some location on the Internet to get or send something. This kind of malware includes files that have never been seen before, but the reputation of the URLs that they access may be available. The embodiments of the invention make use of this realization in an effective way.
[0044] In many situations, polymorphic files or even files that belong to different malware families may access the same domains or URLs that have already been rated by security servers as malicious. Making use of the reputation information of the URLs improves the detection of unknown files that may be seen for the first time. The embodiments of the invention also enable creating dynamic detections for suspicious computer files on the client computer systems, sending this information to the cloud environment and propagating it to other client systems. Current reputation server systems store detection data for domains and IPs that commonly host malware. It is also possible to add Name Servers and ASNs to these reputation services. Since a lot of malware recycles subdomain hosting sites, redirection sites and dynamic DNS sites, new malicious entities can be detected with the same process as long as the domain is covered.
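To illustrate why domain-level coverage catches rehosted subdomains, a reputation query might fall back from the exact URL to the host and to an approximation of the registered domain; the two-label heuristic in this sketch is a simplification (real products typically use a public-suffix list):

```python
# Sketch: derive coarser lookup keys from a URL so that a new subdomain of an
# already-rated domain is still matched. The two-label "registered domain"
# heuristic is a simplification made for this example.
from urllib.parse import urlparse

def lookup_keys(url: str) -> list:
    host = urlparse(url).hostname or ""
    parts = host.split(".")
    registered = ".".join(parts[-2:]) if len(parts) >= 2 else host
    return [url, host, registered]

print(lookup_keys("http://abc123.bad-dyndns.example/dl/payload.bin"))
# ['http://abc123.bad-dyndns.example/dl/payload.bin',
#  'abc123.bad-dyndns.example', 'bad-dyndns.example']
```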
[0045] The steps, points, signaling messages and related functions described above in relation to Figures 2 and 3 are in no absolute chronological order, and some of the steps may be performed simultaneously or in a different order. Other functions may also be executed between the steps or within the steps, and other signaling messages may be sent between the illustrated ones. Some of the steps can also be left out or replaced by a corresponding step. The system functions illustrate a procedure that may be implemented in one or more physical or logical entities.
[0046] The techniques described here may be implemented by various means. An apparatus or system that implements one or more of the described functions with an embodiment comprises not only existing means, but also means for implementing one or more functions of a corresponding apparatus that is described with an embodiment. An apparatus or system may also comprise separate means for each separate function. These techniques may be implemented in hardware (one or more modules) or combinations thereof. For software, implementation can be through modules, for example, procedures and functions that perform the functions described here. The software code may be stored in any suitable data storage medium that is readable by processors, computers, memory unit(s) or article(s) of manufacture, and may be executed by one or more processors or computers. The data storage medium or memory unit may be implemented within the processor or computer, or as an external part of the processor or computer, in which case it can be connected to the processor or computer via various means known in the field.
[0047] The programming, such as executable code or instructions, electronic data, databases or other digital information can be stored into memories and may include a processor-usable medium. A processor-usable medium may be embodied in any computer program product or article of manufacture which can contain, store, or maintain programming, data or digital information for use by or in connection with an instruction execution system, including the processor 3, 26 in the exemplary embodiments.
[0048] An embodiment provides a computer program product that comprises a computer-readable medium bearing computer program code embodied therein for use with a computer. The computer program code comprises: a code for detecting an access request from a source computer file to a target uniform resource locator (URL), a code for obtaining reputation data for the target URL from a reputation server, a code for identifying the source computer file requesting access to the target URL, and a code for taking further action on the source computer file on the basis of the obtained reputation data relating to the target URL.
[0049] Embodiments of the present invention may be implemented in software, hardware, application logic or a combination of these. In an example of an embodiment, the application logic, software or a set of instructions is maintained on any conventional computer-readable media. In the context of this document, a "computer-readable medium" may be any media or means that can contain, store, communicate, propagate or transport the instructions for use by or in connection with an instruction execution system, apparatus, or device, such as a computer. A computer-readable medium may comprise a computer-readable storage medium that may be any media or means that can contain or store the instructions for use by or in connection with an instruction execution system, apparatus, or device, such as a computer.
[0050] The various aspects of the invention are not limited to the combinations that are explicitly set out in the independent claims. Other aspects of the invention may comprise combinations of features from the described embodiments, the dependent claims and the independent claims.
[0051] It is obvious to a person skilled in the field that, as the technology advances, the inventive concept can be implemented in various ways. The invention and its embodiments are not limited to the examples described above but may vary within the scope of the claims.
Claims (13)
- WHAT IS CLAIMED IS: 1. A method for detecting computer-file-based security threats, comprising: detecting an access request from a source computer file to a target uniform resource locator (URL); obtaining reputation data for the target URL from a reputation server or via a local reputation scanner; identifying the source computer file requesting access to the target URL; and on the basis of the obtained reputation data relating to the target URL, taking further action on the source computer file.
- 2. The method of claim 1, further comprising determining the target URL by accessing a packet header included in a packet request issued by the source computer file when requesting access to the target URL.
- 3. The method of claim 1 or 2, further comprising identifying the source computer file on the basis of a socket connection launched by the source computer file targeting the target URL by monitoring network connections or hooking socket application programming interface (API) functionalities.
- 4. The method of any one of claims 1 to 3, wherein the further action comprises rating the source computer file with a reputation rating and providing the reputation rating of the source computer file with a reputation database.
- 5. The method of any one of the preceding claims, wherein the obtained reputation data comprises an indication of whether the URL is trusted, suspicious or unknown.
- 6. The method of any one of the preceding claims, wherein the further action comprises any of the following: allowing or blocking the source computer file accessing the URL, allowing or blocking communication of the source computer file, closing applications triggered by the source computer file, displaying a warning message, allowing only specific type of communication with the source computer file, deleting or quarantining the source computer file.
- 7. An apparatus, comprising: at least one processor; and at least one memory including computer program code, the at least one memory and the computer program code configured to, with the at least one processor, cause the apparatus to perform at least the following: detect an access request from a source computer file to a target uniform resource locator (URL); obtain reputation data for the target URL from a reputation server or via a local reputation scanner; identify the source computer file requesting access to the target URL; and on the basis of the obtained reputation data relating to the target URL, take further action on the source computer file.
- 8. The apparatus of claim 7, wherein the apparatus is further configured to: determine the target URL by accessing a packet header included in a packet request issued by the source computer file when requesting access to the target URL.
- 9. The apparatus of claim 7 or 8, wherein the apparatus is further configured to identify the source computer file on the basis of a socket connection launched by the source computer file targeting the target URL by monitoring network connections or hooking socket application programming interface (API) functionalities.
- 10. The apparatus of any one of claims 7 to 9, wherein the further action comprises rating the source computer file with a reputation rating and providing the reputation rating of the source computer file with a reputation database.
- 11. The apparatus of any one of claims 7 to 10, wherein the obtained reputation data comprises an indication of whether the URL is trusted, suspicious or unknown.
- 12. The apparatus of any one of claims 7 to 11, wherein the further action comprises any of the following: allowing or blocking the source computer file accessing the URL, allowing or blocking communication of the source computer file, closing applications triggered by the source computer file, displaying a warning message, allowing only specific type of communication with the source computer file, deleting or quarantining the source computer file.
- 13. A computer program product embodied on a distribution medium readable by a computer and comprising program instructions which, when loaded into an apparatus, execute the method according to any one of claims 1 to 6.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
GB1217613.7A GB2506605A (en) | 2012-10-02 | 2012-10-02 | Identifying computer file based security threats by analysis of communication requests from files to recognise requests sent to untrustworthy domains |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
GB1217613.7A GB2506605A (en) | 2012-10-02 | 2012-10-02 | Identifying computer file based security threats by analysis of communication requests from files to recognise requests sent to untrustworthy domains |
Publications (2)
Publication Number | Publication Date |
---|---|
GB201217613D0 GB201217613D0 (en) | 2012-11-14 |
GB2506605A true GB2506605A (en) | 2014-04-09 |
Family
ID=47225550
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
GB1217613.7A Withdrawn GB2506605A (en) | 2012-10-02 | 2012-10-02 | Identifying computer file based security threats by analysis of communication requests from files to recognise requests sent to untrustworthy domains |
Country Status (1)
Country | Link |
---|---|
GB (1) | GB2506605A (en) |
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2007136665A2 (en) * | 2006-05-19 | 2007-11-29 | Cisco Ironport Systems Llc | Method and apparatus for controlling access to network resources based on reputation |
US20110067101A1 (en) * | 2009-09-15 | 2011-03-17 | Symantec Corporation | Individualized Time-to-Live for Reputation Scores of Computer Files |
US20110185423A1 (en) * | 2010-01-27 | 2011-07-28 | Mcafee, Inc. | Method and system for detection of malware that connect to network destinations through cloud scanning and web reputation |
US20110185428A1 (en) * | 2010-01-27 | 2011-07-28 | Mcafee, Inc. | Method and system for protection against unknown malicious activities observed by applications downloaded from pre-classified domains |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP3416085A1 (en) * | 2017-06-16 | 2018-12-19 | AO Kaspersky Lab | System and method of detecting malicious files with the use of elements of static analysis |
US10867038B2 (en) | 2017-06-16 | 2020-12-15 | AO Kaspersky Lab | System and method of detecting malicious files with the use of elements of static analysis |
Also Published As
Publication number | Publication date |
---|---|
GB201217613D0 (en) | 2012-11-14 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10164993B2 (en) | Distributed split browser content inspection and analysis | |
JP6147309B2 (en) | Computer program, system, method and apparatus | |
US10523609B1 (en) | Multi-vector malware detection and analysis | |
US11044270B2 (en) | Using private threat intelligence in public cloud | |
Virvilis et al. | Security Busters: Web browser security vs. rogue sites | |
US8677493B2 (en) | Dynamic cleaning for malware using cloud technology | |
US9215242B2 (en) | Methods and systems for preventing unauthorized acquisition of user information | |
US8850584B2 (en) | Systems and methods for malware detection | |
EP2755157B1 (en) | Detecting undesirable content | |
US11861008B2 (en) | Using browser context in evasive web-based malware detection | |
US9215209B2 (en) | Source request monitoring | |
US9147067B2 (en) | Security method and apparatus | |
US20140259168A1 (en) | Malware identification using a hybrid host and network based approach | |
US20120222117A1 (en) | Method and system for preventing transmission of malicious contents | |
CN112703496B (en) | Content policy based notification to application users regarding malicious browser plug-ins | |
US11627146B2 (en) | Detection and prevention of hostile network traffic flow appropriation and validation of firmware updates | |
US11799876B2 (en) | Web crawler systems and methods to efficiently detect malicious sites | |
US20230179631A1 (en) | System and method for detection of malicious interactions in a computer network | |
WO2014114127A1 (en) | Method, apparatus and system for webpage access control | |
GB2505398A (en) | Social network protection system | |
US20140208385A1 (en) | Method, apparatus and system for webpage access control | |
US20170070460A1 (en) | Controlling Access to Web Resources | |
US8640242B2 (en) | Preventing and detecting print-provider startup malware | |
GB2506605A (en) | Identifying computer file based security threats by analysis of communication requests from files to recognise requests sent to untrustworthy domains | |
EP3999985A1 (en) | Inline malware detection |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WAP | Application withdrawn, taken to be withdrawn or refused ** after publication under section 16(1) |