WO2017003580A1 - Mitigation of malware - Google Patents

Mitigation of malware Download PDF

Info

Publication number
WO2017003580A1
WO2017003580A1 (application PCT/US2016/033846)
Authority
WO
WIPO (PCT)
Prior art keywords
malware
tasks
electronic device
behavior
detection
Prior art date
Application number
PCT/US2016/033846
Other languages
French (fr)
Inventor
Ashish Mishra
Rahul Mohandas
Sakthikumar Subramanian
Kumaraguru A. VELMURUGAN
Arun SATYARTH
Anadi Madhukar
Lixin Lu
Original Assignee
Mcafee, Inc.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Mcafee, Inc. filed Critical Mcafee, Inc.
Priority to EP16818395.2A priority Critical patent/EP3314509A4/en
Priority to CN201680037878.XA priority patent/CN108064384A/en
Priority to JP2017567410A priority patent/JP6668390B2/en
Publication of WO2017003580A1 publication Critical patent/WO2017003580A1/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/50Monitoring users, programs or devices to maintain the integrity of platforms, e.g. of processors, firmware or operating systems
    • G06F21/55Detecting local intrusion or implementing counter-measures
    • G06F21/56Computer malware detection or handling, e.g. anti-virus arrangements
    • G06F21/566Dynamic detection, i.e. detection performed at run-time, e.g. emulation, suspicious activities
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/50Monitoring users, programs or devices to maintain the integrity of platforms, e.g. of processors, firmware or operating systems
    • G06F21/55Detecting local intrusion or implementing counter-measures
    • G06F21/56Computer malware detection or handling, e.g. anti-virus arrangements
    • G06F21/568Computer malware detection or handling, e.g. anti-virus arrangements eliminating virus, restoring damaged files
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/50Monitoring users, programs or devices to maintain the integrity of platforms, e.g. of processors, firmware or operating systems
    • G06F21/57Certifying or maintaining trusted computer platforms, e.g. secure boots or power-downs, version controls, system software checks, secure updates or assessing vulnerabilities
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L9/00Cryptographic mechanisms or cryptographic arrangements for secret or secure communications; Network security protocols
    • H04L9/40Network security protocols
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2221/00Indexing scheme relating to security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F2221/21Indexing scheme relating to G06F21/00 and subgroups addressing additional information or applications relating to security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F2221/2101Auditing as a secondary aspect

Definitions

  • the elements of sample behavior, family behavior, and generic malware behavior can be combined within malware detection module 120 and compared with analysis log 126 to identify malware.
  • Analysis log 126 can be a log of activities on a system suspected of being infected with malware.
  • the elements of sample behavior, family behavior, and generic malware behavior can be combined within malware mitigation module 122 to generate detection tasks that can be configured to be executed to gather relevant environment details, file system and registry information, and indicators of infection and evasion within an electronic device.
  • a feedback loop to malware mitigation module 122 can be used to analyze the results of the detection task to generate further specific tasks for detection and repair of infection on an infected electronic device. The results of these tasks can be again fed back to malware mitigation module 122 which can generate further tasks for execution. This sequence of actions can iterate until malware mitigation module 122 determines that the electronic device is clean of the infection as indicated by the sample behavior and the family behavior.
  • packet refers to a unit of data that can be routed between a source node and a destination node on a packet switched network.
  • a packet includes a source network address and a destination network address. These network addresses can be Internet Protocol (IP) addresses in a TCP/IP messaging protocol.
  • IP Internet Protocol
  • data refers to any type of binary, numeric, voice, video, textual, or script data, or any type of source or object code, or any other suitable information in any appropriate format that may be communicated from one point to another in electronic devices and/or networks. Additionally, messages, requests, responses, and queries are forms of network traffic, and therefore, may comprise packets, frames, signals, data, etc.
  • any of the memory items discussed herein should be construed as being encompassed within the broad term 'memory element.'
  • the information being used, tracked, sent, or received in communication systems 100a-100c could be provided in any database, register, queue, table, cache, control list, or other storage structure, all of which can be referenced at any suitable timeframe. Any such storage options may also be included within the broad term 'memory element' as used herein.
  • Electronic devices 102a-102c can each be a network element and include, for example, desktop computers, laptop computers, mobile devices, personal digital assistants, smartphones, tablets, or other similar devices.
  • Cloud services 104 is configured to provide cloud services to electronic devices 102a-102c. Cloud services may generally be defined as the use of computing resources that are delivered as a service over a network, such as the Internet. Typically, compute, storage, and network resources are offered in a cloud infrastructure, effectively shifting the workload from a local network to the cloud network.
  • Server 106 can be a network element such as a server or virtual server and can be associated with clients, customers, endpoints, or end users wishing to initiate a communication in communication systems 100a-100c via some network (e.g., network 108).
  • Program emulation may be used to let a malware sample execute in an emulated environment, study that environment for changes made by the malware sample, and identify evasion techniques used by the malware. For example, if malware uses a Windows® API hooking mechanism to hide from a list of running processes, the same behavioral information is recorded and made available as an evasion technique.
  • Such information, consisting of family-specific behavior, is fed to malware detection module 120 and malware mitigation module 122, which can use this information to generate detection/mitigation tasks. For example, for samples of families that change the name of the file but keep the folder name and MD5 checksum the same, an identification task can be generated to look for a specific MD5 in a specific folder within the electronic device.
  • FIGURE 7 is a simplified block diagram associated with an example ARM ecosystem SOC 700 of the present disclosure.
  • At least one example implementation of the present disclosure can include the mitigation of malware features discussed herein and an ARM component.
  • the example of FIGURE 7 can be associated with any ARM core (e.g., A-7, A-15, etc.).
  • the architecture can be part of any type of tablet, smartphone (inclusive of Android™ phones, iPhones™), iPad™, Google Nexus™, Microsoft Surface™, personal computer, server, video processing components, laptop computer (inclusive of any type of notebook), Ultrabook™ system, any type of touch-enabled input device, etc.
  • a processor may include other elements on a chip with processor core 800, at least some of which were shown and described herein with reference to FIGURE 6.
  • a processor may include memory control logic along with processor core 800.
  • the processor may include I/O control logic and/or may include I/O control logic integrated with memory control logic.
  • Example C1 is at least one machine readable storage medium having one or more instructions that when executed by at least one processor cause the at least one processor to allow malware to execute in a system, record changes to the system caused by the execution of the malware, and create detection tasks for the detection of the malware in an electronic device, wherein the detection tasks are at least partially based on the changes to the system caused by the execution of the malware.
  • the subject matter of Example C1 can optionally include where the detection tasks are created using one or more of pattern matching, global reputation analysis, program emulation, static analysis, and dynamic analysis of the malware.
  • an apparatus can include a pattern behavior generation module, where the pattern behavior generation module is configured to allow malware to execute in a system, record changes to the system caused by the execution of the malware, and create detection tasks for the detection of the malware in an electronic device, wherein the detection tasks are at least partially based on the changes to the system caused by the execution of the malware.
  • the subject matter of Example A1 can optionally include where the detection tasks are created using one or more of pattern matching, global reputation analysis, program emulation, static analysis, and dynamic analysis of the malware.
  • Example S1 is a system for remediation of malware, the system including a pattern behavior generation module configured to allow malware to execute in a system, record changes to the system caused by the execution of the malware, create detection tasks for the detection of the malware in an electronic device, where the detection tasks are at least partially based on the changes to the system caused by the execution of the malware and a security module configured to identify an infected electronic device using the created detection tasks and create mitigation tasks that mitigate the changes to the infected electronic device caused by the malware.
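The folder/MD5 identification task described above can be sketched as a small script (a hedged illustration only; the patent prescribes no implementation, and the file names below are hypothetical). It walks a suspect folder and flags any file whose MD5 checksum matches the family's known checksum, regardless of what the file has been renamed to:

```python
import hashlib
import os

def md5_of_file(path, chunk_size=8192):
    """Compute the MD5 checksum of a file, reading in chunks."""
    digest = hashlib.md5()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

def find_md5_in_folder(folder, target_md5):
    """Return paths of files in `folder` whose MD5 equals `target_md5`.

    Matches by checksum alone, since the family is assumed to rename
    the file but keep the folder and checksum the same.
    """
    matches = []
    for name in sorted(os.listdir(folder)):
        path = os.path.join(folder, name)
        if os.path.isfile(path) and md5_of_file(path) == target_md5:
            matches.append(path)
    return matches
```

In practice such a task would be generated from the family behavior record and pushed to the electronic device for execution, with its results fed back to the mitigation module.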

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Security & Cryptography (AREA)
  • Computer Hardware Design (AREA)
  • General Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Health & Medical Sciences (AREA)
  • Virology (AREA)
  • General Health & Medical Sciences (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Debugging And Monitoring (AREA)

Abstract

Particular embodiments described herein provide for an electronic device that can be configured to allow malware to execute in a system, record changes to the system caused by the execution of the malware, and create detection tasks for the detection of the malware in an electronic device, where the detection tasks are at least partially based on the changes to the system caused by the execution of the malware. The detection tasks can be used to identify an infected electronic device and to create mitigation tasks that mitigate the changes to the infected electronic device caused by the malware.

Description

MITIGATION OF MALWARE
CROSS-REFERENCE TO RELATED APPLICATION
[0001] This application claims the benefit of and priority to Indian Patent Application No. 3247/CHE/2015 filed 27 June 2015 entitled "MITIGATION OF MALWARE", which is incorporated herein by reference in its entirety.
TECHNICAL FIELD
[0002] This disclosure relates in general to the field of information security, and more particularly, to the mitigation of malware.
BACKGROUND
[0003] The field of network security has become increasingly important in today's society. The Internet has enabled interconnection of different computer networks all over the world. In particular, the Internet provides a medium for exchanging data between different users connected to different computer networks via various types of client devices. While the use of the Internet has transformed business and personal communications, it has also been used as a vehicle for malicious operators to gain unauthorized access to computers and computer networks and for intentional or inadvertent disclosure of sensitive information.
[0004] Malicious software ("malware") that infects a host computer may be able to perform any number of malicious actions, such as stealing sensitive information from a business or individual associated with the host computer, propagating to other host computers, and/or assisting with distributed denial of service attacks, sending out spam or malicious emails from the host computer, etc. Hence, significant administrative challenges remain for protecting computers and computer networks from malicious and inadvertent exploitation by malicious software.
BRIEF DESCRIPTION OF THE DRAWINGS
[0005] To provide a more complete understanding of the present disclosure and features and advantages thereof, reference is made to the following description, taken in conjunction with the accompanying figures, wherein like reference numerals represent like parts, in which:
[0006] FIGURE 1A is a simplified block diagram of a communication system for the mitigation of malware in accordance with an embodiment of the present disclosure;
[0007] FIGURE IB is a simplified block diagram of a communication system for the mitigation of malware in accordance with an embodiment of the present disclosure;
[0008] FIGURE 1C is a simplified block diagram of a communication system for the mitigation of malware in accordance with an embodiment of the present disclosure;
[0009] FIGURE 2 is a simplified block diagram of a portion of a communication system for the mitigation of malware in accordance with an embodiment of the present disclosure;
[0010] FIGURE 3 is a simplified diagram of example details of a communication system for the mitigation of malware in accordance with an embodiment of the present disclosure;
[0011] FIGURE 4 is a simplified flowchart illustrating potential operations that may be associated with the communication system in accordance with an embodiment;
[0012] FIGURE 5 is a simplified flowchart illustrating potential operations that may be associated with the communication system in accordance with an embodiment;
[0013] FIGURE 6 is a block diagram illustrating an example computing system that is arranged in a point-to-point configuration in accordance with an embodiment;
[0014] FIGURE 7 is a simplified block diagram associated with an example ARM ecosystem system on chip (SOC) of the present disclosure; and
[0015] FIGURE 8 is a block diagram illustrating an example processor core in accordance with an embodiment.
[0016] The FIGURES of the drawings are not necessarily drawn to scale, as their dimensions can be varied considerably without departing from the scope of the present disclosure.
DETAILED DESCRIPTION OF EXAMPLE EMBODIMENTS
EXAMPLE EMBODIMENTS
[0017] FIGURE 1A is a simplified block diagram of a communication system 100a for the mitigation of malware in accordance with an embodiment of the present disclosure. Communication system 100a can include an electronic device 102a, cloud services 104, and a server 106. Electronic device 102a can include a processor 110, memory 112, an operating system 114, a sandbox 116, and a security module 118. Security module 118 can include a malware detection module 120, a malware mitigation module 122, and malware pattern behavior 124. Malware detection module 120 can include an analysis log 126. Malware mitigation module 122 can include reverse malware behavior actions 128. Cloud services 104 and server 106 can each include a network security module 130. Network security module 130 can include a pattern behavior generation module 132 and malware pattern behavior 124. Electronic device 102a, cloud service 104, and server 106 can each be in communication using network 108.
[0018] Turning to FIGURE IB, FIGURE IB is a simplified block diagram of a communication system 100b for the mitigation of malware in accordance with an embodiment of the present disclosure. Communication system 100b can include an electronic device 102b, cloud services 104, and server 106. Electronic device 102b can include processor 110, memory 112, operating system 114, security module 118, and a recovery environment 136. Electronic device 102b, cloud service 104, and server 106 can each be in communication using network 108.
[0019] Turning to FIGURE 1C, FIGURE 1C is a simplified block diagram of a communication system 100c for the mitigation of malware in accordance with an embodiment of the present disclosure. Communication system 100c can include an electronic device 102c, cloud services 104, and server 106. Electronic device 102c can include processor 110, memory 112, operating system 114, security module 118, and a secondary operating system 138. Electronic device 102c, cloud service 104, and server 106 can each be in communication using network 108.
[0020] In example embodiments, communication systems 100a-100c can be configured to study a malicious application's behavior in detail. The knowledge of the malicious application's behavior can then be used to identify an infected electronic device and remediate or mitigate the effects of the malware on the electronic device without the need to re-image the electronic device. For example, communication systems 100a-100c can be configured to use dynamic analysis of a malware sample to remediate a machine that has been infected by the malware sample. The remediation or mitigation can involve using an iterative multistage approach. In an example, a recovery environment may be used to bypass evasion techniques used by the malware during mitigation. In another example, behavioral knowledge about the malware, a family of malware that the malware belongs to, as well as sample and generic malware behavior may be used to detect the malware and remediate a machine infected with the malware. Using the dynamic analysis of the malware sample can allow for the ability to detect and remediate malware which may not be detected by traditional antivirus software. In addition, the malware may be remediated without having to re-image an infected electronic device, preventing some of the data loss that typically arises out of re-imaging.
[0021] Elements of FIGURES 1A-1C may be coupled to one another through one or more interfaces employing any suitable connections (wired or wireless), which provide viable pathways for network (e.g., network 108) communications. Additionally, any one or more of these elements of FIGURES 1A-1C may be combined or removed from the architecture based on particular configuration needs. Communication systems 100a-100c may each include a configuration capable of transmission control protocol/Internet protocol (TCP/IP) communications for the transmission or reception of packets in a network. Communication systems 100a-100c may also operate in conjunction with a user datagram protocol/IP (UDP/IP) or any other suitable protocol where appropriate and based on particular needs.
[0022] For purposes of illustrating certain example techniques of communication systems 100a-100c, it is important to understand the communications that may be traversing the network environment. The following foundational information may be viewed as a basis from which the present disclosure may be properly explained.
[0023] Most enterprises remediate an electronic device infected with malware by re-formatting the entire hard disk and re-imaging the electronic device. This is an inconvenient process and not only renders the electronic device unproductive but also causes data loss on the electronic device by removing data that was not backed-up prior to re-imaging. Currently, re-imaging is one of a very few reliable techniques of malware removal because malware uses various techniques to evade detection and remediation. Using currently prevalent detection technologies, it is not only difficult to remediate the presence of malicious files on an infected electronic device, it is also very difficult to identify if a host is infected at all. What is needed is a system and method that can mitigate or remediate malware on an infected electronic device without having to re-image the electronic device.
[0024] A communication system for the mitigation of malware, as outlined in FIGURES 1A-1C, can resolve these issues (and others). Communication systems 100a-100c may be configured to analyze a malware suspect sample using behavioral analysis techniques. For example, pattern behavior generation module 132 can be configured to analyze a malware suspect sample using behavioral analysis techniques. The techniques may involve use of a combination of one or more of pattern matching, global reputation, program emulation, static analysis, dynamic analysis, or some other behavioral analysis technique. Once the malware suspect sample has been analyzed, the system can be configured to generate malware pattern behavior (e.g., malware pattern behavior 124) of the malware based on the analysis. The malware pattern behavior can include various evasion and obfuscation techniques used by the malware as captured by the analysis. The malware pattern behavior can be an indicator of the specific malware sample behavior.
[0025] Based on the sample behavior, a behavioral knowledge of typical malware families can be used to identify a malware family associated with the sample behavior. As a result of research and analysis of known and new malware families, a set of behavior patterns specific to the identified malware family can be created as family behavior. Most malware engages in behavior patterns that are not exhibited by benevolent or benign software. As a result of research and analysis, a set of behavior patterns that are shown by malware in general can be prepared as generic malware behavior. The specific malware sample behavior, family behavior, and generic malware behavior can be combined into malware pattern behavior 124.
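One way to picture malware pattern behavior 124 is as the union of the three behavior sets described above. The sketch below uses hypothetical indicator strings (the patent does not fix any format for behavior patterns):

```python
# Hypothetical indicator strings; the patent does not specify a format.
sample_behavior = {
    "file-write:%TEMP%\\svch0st.exe",         # observed for this specific sample
    "registry:HKCU\\Software\\Run\\svch0st",  # persistence key the sample created
}
family_behavior = {
    "api-hook:hide-from-process-list",        # evasion shared by the identified family
}
generic_malware_behavior = {
    "persistence:autorun-entry",              # pattern typical of malware in general
}

# Malware pattern behavior as the combination of all three behavior sets.
malware_pattern_behavior = (
    sample_behavior | family_behavior | generic_malware_behavior
)
```

A detection module could then compare an analysis log against this combined set to decide whether the logged activity indicates an infection.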
[0026] The elements of sample behavior, family behavior, and generic malware behavior can be combined within malware detection module 120 and compared with analysis log 126 to identify malware. Analysis log 126 can be a log of activities on a system suspected of being infected with malware. In addition, the elements of sample behavior, family behavior, and generic malware behavior can be combined within malware mitigation module 122 to generate detection tasks that can be configured to be executed to gather relevant environment details, file system and registry information, and indicators of infection and evasion within an electronic device. A feedback loop to malware mitigation module 122 can be used to analyze the results of the detection task to generate further specific tasks for detection and repair of infection on an infected electronic device. The results of these tasks can be again fed back to malware mitigation module 122 which can generate further tasks for execution. This sequence of actions can iterate until malware mitigation module 122 determines that the electronic device is clean of the infection as indicated by the sample behavior and the family behavior.
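The iterative feedback loop can be sketched as follows. The set-based device state and the "remove the indicator" mitigation step are illustrative assumptions, not the patent's implementation; they stand in for the richer detection and repair tasks the modules would actually generate:

```python
def run_detection_tasks(device_state, pattern_behavior):
    """Detection task: report which known behavior indicators
    (files, registry keys, hooks) are present on the device."""
    return pattern_behavior & device_state

def build_mitigation_tasks(findings):
    """Generate one mitigation task per finding; here each task is
    simply 'remove this indicator' (an illustrative simplification)."""
    return list(findings)

def mitigate(device_state, pattern_behavior, max_rounds=10):
    """Feedback loop: detect, mitigate, feed results back, and repeat
    until the device shows none of the sample/family indicators."""
    for _ in range(max_rounds):
        findings = run_detection_tasks(device_state, pattern_behavior)
        if not findings:
            return True  # device is clean of the known indicators
        for indicator in build_mitigation_tasks(findings):
            device_state.discard(indicator)  # undo the recorded change
    return False  # indicators persist after max_rounds iterations
```

Each pass corresponds to one trip through the loop: detection results flow back to the mitigation module, which generates further tasks until no indicators of the infection remain.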
[0027] To mitigate the malware's ability to evade detection, the tasks may be executed on top of different platforms. For example, the tasks may be executed in a live operating system, which may be infected by malware and can interfere with the accuracy of detection or mitigation tasks. The tasks may be executed in recovery environment 136 (e.g., a Windows® recovery environment (RE)) which is less likely to be infected by malware that might interfere with the detection or mitigation tasks. The tasks may be executed using secondary operating system 138 on the electronic device. The secondary operating system may be pushed on the electronic device using a file which can be mounted as a boot disk. The malware is least likely to interfere with the mitigation tasks executed on the secondary operating system.
[0028] Turning to the infrastructure of FIGURES 1A-1C, communication systems 100a-100c in accordance with an example embodiment are shown. Generally, communication systems 100a-100c can be implemented in any type or topology of networks. Network 108 represents a series of points or nodes of interconnected communication paths for receiving and transmitting packets of information that propagate through communication systems 100a-100c. Network 108 offers a communicative interface between nodes, and may be configured as any local area network (LAN), virtual local area network (VLAN), wide area network (WAN), wireless local area network (WLAN), metropolitan area network (MAN), Intranet, Extranet, virtual private network (VPN), and any other appropriate architecture or system that facilitates communications in a network environment, or any suitable combination thereof, including wired and/or wireless communication.
[0029] In communication systems 100a-100c, network traffic, which is inclusive of packets, frames, signals, data, etc., can be sent and received according to any suitable communication messaging protocols. Suitable communication messaging protocols can include a multi-layered scheme such as the Open Systems Interconnection (OSI) model, or any derivations or variants thereof (e.g., Transmission Control Protocol/Internet Protocol (TCP/IP), user datagram protocol/IP (UDP/IP)). Additionally, radio signal communications over a cellular network may also be provided in communication systems 100a-100c. Suitable interfaces and infrastructure may be provided to enable communication with the cellular network.
[0030] The term "packet" as used herein, refers to a unit of data that can be routed between a source node and a destination node on a packet switched network. A packet includes a source network address and a destination network address. These network addresses can be Internet Protocol (IP) addresses in a TCP/IP messaging protocol. The term "data" as used herein, refers to any type of binary, numeric, voice, video, textual, or script data, or any type of source or object code, or any other suitable information in any appropriate format that may be communicated from one point to another in electronic devices and/or networks. Additionally, messages, requests, responses, and queries are forms of network traffic, and therefore, may comprise packets, frames, signals, data, etc.
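To make the packet definition concrete, the sketch below packs a minimal 20-byte IPv4 header carrying a source and a destination address, then parses the two addresses back out. This is illustrative only (real traffic carries valid checksums, options, and a payload that this example omits), and the addresses are hypothetical:

```python
import socket
import struct

def parse_ipv4_addresses(header):
    """Extract (source, destination) dotted-quad strings from a raw
    20-byte IPv4 header."""
    fields = struct.unpack("!BBHHHBBH4s4s", header[:20])
    return socket.inet_ntoa(fields[8]), socket.inet_ntoa(fields[9])

# Build a minimal IPv4 header: version 4, header length 5 words,
# protocol 6 (TCP), checksum left zero for illustration.
header = struct.pack(
    "!BBHHHBBH4s4s",
    0x45, 0,                           # version/IHL; type of service
    20, 1,                             # total length; identification
    0, 64,                             # flags/fragment offset; TTL
    6, 0,                              # protocol (TCP); checksum
    socket.inet_aton("192.168.0.1"),   # source network address
    socket.inet_aton("10.0.0.2"),      # destination network address
)

src, dst = parse_ipv4_addresses(header)
```

Routing on a packet-switched network uses exactly these two fields: intermediate nodes forward the packet toward `dst` while `src` identifies the originating node.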
[0031] In an example implementation, electronic devices 102a-102c, cloud services 104, and server 106 are network elements, which are meant to encompass network appliances, servers, routers, switches, gateways, bridges, load balancers, processors, modules, or any other suitable device, component, element, or object operable to exchange information in a network environment. Network elements may include any suitable hardware, software, components, modules, or objects that facilitate the operations thereof, as well as suitable interfaces for receiving, transmitting, and/or otherwise communicating data or information in a network environment. This may be inclusive of appropriate algorithms and communication protocols that allow for the effective exchange of data or information.
[0032] In regards to the internal structure associated with communication systems 100a-100c, each of electronic devices 102a-102c, cloud services 104, and server 106 can include memory elements for storing information to be used in the operations outlined herein. Each of electronic devices 102a-102c, cloud services 104, and server 106 may keep information in any suitable memory element (e.g., random access memory (RAM), read-only memory (ROM), erasable programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), application specific integrated circuit (ASIC), etc.), software, hardware, firmware, or in any other suitable component, device, element, or object where appropriate and based on particular needs. Any of the memory items discussed herein should be construed as being encompassed within the broad term 'memory element.' Moreover, the information being used, tracked, sent, or received in communication systems 100a-100c could be provided in any database, register, queue, table, cache, control list, or other storage structure, all of which can be referenced at any suitable timeframe. Any such storage options may also be included within the broad term 'memory element' as used herein.
[0033] In certain example implementations, the functions outlined herein may be implemented by logic encoded in one or more tangible media (e.g., embedded logic provided in an ASIC, digital signal processor (DSP) instructions, software (potentially inclusive of object code and source code) to be executed by a processor, or other similar machine, etc.), which may be inclusive of non-transitory computer-readable media. In some of these instances, memory elements can store data used for the operations described herein. This includes the memory elements being able to store software, logic, code, or processor instructions that are executed to carry out the activities described herein.
[0034] In an example implementation, network elements of communication systems 100a-100c, such as electronic devices 102a-102c, cloud services 104, and server 106 may include software modules (e.g., security module 118, malware detection module 120, malware mitigation module 122, network security module 130, and pattern behavior generation module 132) to achieve, or to foster, operations as outlined herein. These modules may be suitably combined in any appropriate manner, which may be based on particular configuration and/or provisioning needs. In example embodiments, such operations may be carried out by hardware, implemented externally to these elements, or included in some other network device to achieve the intended functionality. Furthermore, the modules can be implemented as software, hardware, firmware, or any suitable combination thereof. These elements may also include software (or reciprocating software) that can coordinate with other network elements in order to achieve the operations, as outlined herein.
[0035] Additionally, each of electronic devices 102a-102c, cloud services 104, and server 106 may include a processor that can execute software or an algorithm to perform activities as discussed herein. A processor can execute any type of instructions associated with the data to achieve the operations detailed herein. In one example, the processors could transform an element or an article (e.g., data) from one state or thing to another state or thing. In another example, the activities outlined herein may be implemented with fixed logic or programmable logic (e.g., software/computer instructions executed by a processor) and the elements identified herein could be some type of a programmable processor, programmable digital logic (e.g., a field programmable gate array (FPGA), an EPROM, an EEPROM) or an ASIC that includes digital logic, software, code, electronic instructions, or any suitable combination thereof. Any of the potential processing elements, modules, and machines described herein should be construed as being encompassed within the broad term 'processor.'
[0036] Electronic devices 102a-102c can each be a network element and include, for example, desktop computers, laptop computers, mobile devices, personal digital assistants, smartphones, tablets, or other similar devices. Cloud services 104 is configured to provide cloud services to electronic devices 102a-102c. Cloud services may generally be defined as the use of computing resources that are delivered as a service over a network, such as the Internet. Typically, compute, storage, and network resources are offered in a cloud infrastructure, effectively shifting the workload from a local network to the cloud network. Server 106 can be a network element such as a server or virtual server and can be associated with clients, customers, endpoints, or end users wishing to initiate a communication in communication systems 100a-100c via some network (e.g., network 108). The term 'server' is inclusive of devices used to serve the requests of clients and/or perform some computational task on behalf of clients within communication systems 100a-100c. Although security module 118 is represented in FIGURES 1A-1C as being located in electronic devices 102a-102c respectively, this is for illustrative purposes only. Security module 118 could be combined or separated in any suitable configuration. Furthermore, security module 118 could be integrated with or distributed in another network accessible by electronic devices 102a-102c such as cloud services 104 or server 106.
[0037] Turning to FIGURE 2, FIGURE 2 is a simplified block diagram of a portion of communication systems 100a-100c for the mitigation of malware in accordance with an embodiment of the present disclosure. Pattern behavior generation module 132 can be configured to use a combination of one or more of pattern matching, global reputation, static analysis, dynamic analysis, program emulation, or some other behavioral analysis technique to analyze a sample or application. The analyzed sample can be generalized and used to create malware pattern behavior 124. Malware pattern behavior 124 can include sample behavior 140 and family behavior 142. Sample behavior 140 can include specific malware sample behavior. Family behavior 142 can be created from a family of malware associated with or that includes the analyzed sample.
[0038] Because malware can use various techniques to evade detection, a combination of various techniques is used to analyze a sample and detect if it is indeed malicious. For example, pattern matching may be used to identify well known samples and the same knowledge can be used to identify what changes the sample makes to the system and what evasion techniques the malware uses. Similarly, a global reputation of a sample can be used to identify the malware's behavior based on input from a global database.
[0039] Program emulation may be used to let a malware sample execute in an emulated environment and study that environment for changes made by the malware sample and identify evasion techniques used by the malware. For example, if malware uses a Windows® API hooking mechanism to hide from a list of running processes, the same behavioral information is recorded and made available as an evasion technique.
[0040] Static analysis is another way pattern behavior generation module 132 can identify malware behavior. The technique uses pre-known patterns of assembly instructions to identify specific behavior within the malware executable image. Sometimes the malware executable is encrypted or obfuscated and is decrypted only when the malware sample actually executes on a machine. In such cases, the malware is allowed to execute on a machine and the static analysis is done on the extracted content. Patterns of code used by evasive techniques can be identified and made a part of pattern behavior generation module 132.
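By way of a non-limiting illustration, such a static pattern scan might be sketched as follows. The pattern names and byte sequences below are illustrative assumptions, not actual signatures; a real implementation would draw them from a curated pattern database.

```python
# Illustrative byte sequences standing in for pre-known patterns of assembly
# instructions used by evasive techniques; not real signatures.
EVASION_PATTERNS = {
    "vm_check_stub": bytes.fromhex("0fa2"),           # e.g., a CPUID-style probe
    "debugger_check": bytes.fromhex("64a130000000"),  # e.g., a PEB access stub
}

def scan_image(image: bytes) -> list:
    """Return names of known evasive code patterns found in an executable image."""
    return [name for name, pattern in EVASION_PATTERNS.items() if pattern in image]

# A toy "decrypted" image containing the debugger-check byte sequence.
image = b"\x90\x90" + bytes.fromhex("64a130000000") + b"\xc3"
print(scan_image(image))  # → ['debugger_check']
```

For encrypted or obfuscated samples, the same scan would be applied to the content extracted after the sample executes, as described above.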
[0041] Dynamic analysis can be used by pattern behavior generation module 132 to study the changes made to a live virtual machine after a malware sample has executed. Different techniques like API hooking can be used to detect the changes made to the system by the malware. Various evasion techniques can be identified by using the knowledge that the malware was actually executed on the machine and analyzing various system artifacts like running processes or loaded modules in that light. For example, if malware launches a process called malwareSample.exe and the process is not visible in the list of running processes on the system, this behavior is made available as an evasion technique indicating that the malware hides the processes it executes.
[0042] After the sample has been analyzed, reports/artifacts about the analysis of the malware at different stages can be generated. The reports are analyzed and a set of behavior logs can be generated. The logs analyze indications of malware behavior from different stages of analysis and combine them into a complete behavioral analysis log. The analysis log is the sample behavior artifact (e.g., sample behavior 140) which can be used by malware mitigation module 122.
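As a non-limiting sketch, combining per-stage reports into one behavioral analysis log might look like the following. The stage names and indicator strings are illustrative assumptions, not the actual report format.

```python
# Merge indicators from each analysis stage into a single behavior log
# (the "sample behavior" artifact), de-duplicating across stages.
def combine_reports(stage_reports: dict) -> dict:
    behavior_log = {"stages": [], "indicators": []}
    for stage, indicators in stage_reports.items():
        behavior_log["stages"].append(stage)
        for ind in indicators:
            if ind not in behavior_log["indicators"]:  # seen in an earlier stage
                behavior_log["indicators"].append(ind)
    return behavior_log

reports = {
    "static":  ["creates_mutex:Global\\mw1", "writes_file:a.tmp"],
    "dynamic": ["writes_file:a.tmp", "hides_process:malwareSample.exe"],
}
log = combine_reports(reports)
print(log["indicators"])
# → ['creates_mutex:Global\\mw1', 'writes_file:a.tmp', 'hides_process:malwareSample.exe']
```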
[0043] Examples of elements of 'sample behavior' can include names of files, registry entries, and kernel objects (like mutexes) created by the malware. Evasion techniques can also be indicated in sample behavior artifacts. Whether the malware creates hidden files, the malware depicts rootkit-like behavior, or processes of the malware can be enumerated may also be ascertained.
[0044] The sample behavior log can be analyzed and matched against the behavior of known malware families to identify a family of a specific malware sample. A database of malware family behavior 150 may be maintained for this purpose. Once the family is identified, common behavior indicators of all members of this particular malware family can be gathered from database of malware family behavior 150 and family behavior 142 for the malware sample can be generated.
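A non-limiting sketch of matching a sample behavior log against a database of family behaviors by indicator overlap follows. The family database contents and the 0.5 match threshold are illustrative assumptions.

```python
# Identify the family whose known indicators best overlap the sample's
# behavior log; return None if no family matches well enough.
def identify_family(sample_indicators: set, family_db: dict,
                    threshold: float = 0.5):
    best_family, best_score = None, 0.0
    for family, indicators in family_db.items():
        # Fraction of the family's known indicators observed in this sample.
        score = len(sample_indicators & indicators) / len(indicators)
        if score > best_score:
            best_family, best_score = family, score
    return best_family if best_score >= threshold else None

family_db = {
    "FamilyA": {"appdata_dropper", "run_key_persistence", "api_hooking"},
    "FamilyB": {"mbr_overwrite", "service_install"},
}
sample = {"appdata_dropper", "run_key_persistence", "hidden_file"}
print(identify_family(sample, family_db))  # → FamilyA
```

Once a family is identified in this way, its common behavior indicators can be gathered to generate family behavior 142 as described above.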
[0045] One example of family behavior can include information about polymorphic malware behavior. Some malware can create files in a user's AppData folder. When executed at different instances, the same sample creates files with different names, randomizing the file name so that the malware cannot be identified by the files it created. In such cases, the malware can be identified by patterns of commonality in randomized behavior. For example, some malware can create files with a different name but the same md5 hash, within the same sub-folder in the AppData folder. If the characteristic of creating files of different names but the same md5 checksum can be attributed to all samples of the malware family, this characteristic can be used as an indicator to identify the infection of all samples of that family. Such information consisting of family specific behavior is fed to malware detection module 120 and malware mitigation module 122, which can use this information to generate detection/mitigation tasks. For example, for samples of families that change the name of the file but keep the folder-name and md5 checksum the same, an identification task can be generated to look for a specific md5 in a specific folder within the electronic device.
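A non-limiting sketch of such an identification task follows: hash every file in a target folder and flag matches against the family's known bad MD5 checksums, regardless of randomized file names. The temporary folder and file names below are illustrative stand-ins for the AppData sub-folder behavior described above.

```python
import hashlib
import os
import tempfile
from pathlib import Path

def find_known_bad(folder: str, bad_md5s: set) -> list:
    """Return names of files in `folder` whose MD5 matches a known bad checksum."""
    hits = []
    for name in os.listdir(folder):
        path = os.path.join(folder, name)
        if os.path.isfile(path):
            digest = hashlib.md5(Path(path).read_bytes()).hexdigest()
            if digest in bad_md5s:  # same content regardless of randomized name
                hits.append(name)
    return sorted(hits)

payload = b"malicious payload"
bad = {hashlib.md5(payload).hexdigest()}
with tempfile.TemporaryDirectory() as d:
    for name in ("qxz1.dat", "ab9f.dat"):  # same content, randomized names
        Path(d, name).write_bytes(payload)
    Path(d, "benign.txt").write_bytes(b"harmless")
    print(find_known_bad(d, bad))  # → ['ab9f.dat', 'qxz1.dat']
```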
[0046] The identification of generic or common malware-like behavior can involve establishing behavioral parameters that are common to known malware and are almost never exhibited by benign software. This involves identifying common patterns like creation of specific registry entries, installation in the startup folder of unsigned or unverified binaries, etc. For example, it is known that unsigned programs present in the startup folder are commonly malicious, especially if they are not accompanied by corresponding entries in a list of installed programs or the binaries of such components are not signed by a verified publisher. Malware detection module 120 and malware mitigation module 122 might generate detection tasks (e.g., analysis log 126) and mitigation tasks (e.g., reverse malware behavior actions 128) to look for such binaries in the system and compare their checksum with known malware or family behavior to identify possible infection and remediate accordingly.
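By way of illustration, the generic-behavior check described above might be sketched as follows. The entry records and field names are illustrative assumptions about how startup entries could be represented.

```python
# Flag startup entries that match a known-bad checksum, or that are unsigned
# AND lack a corresponding entry in the list of installed programs.
def suspicious_startup_entries(startup_entries, installed_programs, known_bad_md5s):
    flagged = []
    for entry in startup_entries:
        unsigned = not entry.get("signed_by_verified_publisher", False)
        orphaned = entry["program"] not in installed_programs
        known_bad = entry["md5"] in known_bad_md5s
        if known_bad or (unsigned and orphaned):
            flagged.append(entry["program"])
    return flagged

startup = [
    {"program": "updater.exe", "signed_by_verified_publisher": True,  "md5": "aaa"},
    {"program": "svch0st.exe", "signed_by_verified_publisher": False, "md5": "bbb"},
]
print(suspicious_startup_entries(startup, {"updater.exe"}, {"ccc"}))
# → ['svch0st.exe']
```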
[0047] Turning to FIGURE 3, FIGURE 3 is a simplified diagram of a portion of communication systems 100a-100c for the mitigation of malware in accordance with an embodiment of the present disclosure. Security module 118 can be configured to communicate malware pattern behavior 124 to malware detection module 120. Malware detection module 120 can be configured to compile the various indicators of malware behavior (e.g., sample behavior 140, family behavior 142, generic malware behavior 144, etc.) to identify if electronic device 102a-102c is infected and, if infected, malware mitigation module 122 can be used to remediate electronic device 102a-102c of the infection. In an example, malware mitigation module 122 may analyze all the indicators and generate tasks to identify the indicators of infection in electronic device 102a-102c. Based on specific sample, family, and generic behavior (e.g., sample behavior 140, family behavior 142, and generic malware behavior 144), the tasks may be executed in an execution environment in which the malware infection has possibly taken place and a machine restart has not yet occurred. The malware may have infected the memory, registry, and file system of the machine. Some indicators of infection, like file system and registry changes made by the malware, may be difficult to find in this environment because the malware may have subverted the operating system to hide the indicators of its presence. Some other indicators can only be found in this environment because they are available in operating system live memory and will be lost on a machine restart. These include mutexes, events, API hooks, etc., installed on electronic device 102a-102c.
[0048] In another example, the tasks may be executed in an environment that executes out of a write-protected disk image file present in the operating system. This can be useful for recovering operating system files in case of infection or corruption of OS files. The same environment can be used to bypass the malware's ability to evade detection and eliminate boot-persistence of the malware sample to rid the machine of infection.
[0049] For malware that evades detection in Windows® RE, the only way to identify and remediate or mitigate the malware may be to boot the machine in a secondary OS and use an NTFS driver to iterate through the files of the machine and look for file system based indicators of infection. The choice of a secondary OS that supports iterating through the NTFS file system can help with the detection and removal of evasive file system artifacts. The tasks may be generated in line with the behavior indicators from the specific sample, family, and generic malware behavior. For example, if a sample was found to create registry entries, such as to launch an executable program as part of the Windows startup process, a detection task may be generated to look for that specific registry key.
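A non-limiting sketch of generating detection tasks from behavior indicators, along the lines of the registry-key example above, follows. The indicator records, task fields, and target environments are illustrative assumptions.

```python
# Translate behavior indicators into concrete detection tasks, each tagged
# with the environment in which it should run.
def generate_detection_tasks(indicators: list) -> list:
    tasks = []
    for ind in indicators:
        if ind["type"] == "registry_run_key":
            tasks.append({"action": "check_registry_key",
                          "key": ind["key"],
                          "environment": "recovery"})
        elif ind["type"] == "dropped_file":
            tasks.append({"action": "check_file_md5",
                          "folder": ind["folder"],
                          "md5": ind["md5"],
                          "environment": "secondary_os"})
    return tasks

indicators = [
    {"type": "registry_run_key",
     "key": r"HKCU\Software\Microsoft\Windows\CurrentVersion\Run\mw"},
    {"type": "dropped_file", "folder": r"%APPDATA%\mw", "md5": "deadbeef"},
]
for task in generate_detection_tasks(indicators):
    print(task["action"], "->", task["environment"])
# → check_registry_key -> recovery
# → check_file_md5 -> secondary_os
```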
[0050] The results of the execution of detection and mitigation tasks can be fed back to malware mitigation module 122 via a feedback loop 146. Malware mitigation module 122 may determine the next action based on analyzing the results from the tasks. In another example, malware detection module 120 can analyze the results of the tasks. If the results indicate that the electronic device is infected, malware mitigation module 122 can generate mitigation tasks to be executed in one of the three environments mentioned above (e.g., sandbox 116, recovery environment 136, or secondary operating system 138). If the results indicate no infection, malware mitigation module 122 may decide to check again in another environment and generate tasks accordingly. Similarly, if malware mitigation module 122 can confirm that there are not any known indicators of infection, malware mitigation module 122 may declare the electronic device as clean or benign.
[0051] Feedback loop 146 may be executed re-iteratively to completely de-infect the electronic device. For example, if malware detection module 120 determines that a malware sample is found to infect a file system and registry but the family behavior indicates that the file name and md5 are randomized from one instance to another, malware mitigation module 122 may generate a task to look for the malware sample in a live environment. If the malware sample is not found, malware mitigation module 122 may again generate a task to look for the malware sample in a recovery environment. If evidence of the malware sample is not found, a task may be generated to look for a specific folder. On finding the specific folder, a task may be generated to calculate the md5 of all files in the folder. If no md5 match is found from known bad md5 checksums of the particular family, a task may be generated to look for registry entries which are modified to execute a program on machine startup. If a suspicious program is found registered as a startup program, a task may be generated to look for an executable file associated with the program, and the md5 of the malware sample may be found to be amongst the known bad md5 checksums. A task may be generated to delete this malware sample and registry entry, re-boot the machine in the live Windows OS, and look for in-memory indicators like mutexes and events. Finally, as a result of several iterations, malware mitigation module 122 can conclude if the machine is completely rid of the particular malware sample.
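The iterative feedback loop described above might be sketched, in a non-limiting way, as follows. The environment names, task strings, and the simulated device state are illustrative stand-ins for the live, recovery, and secondary-OS environments and the real task runners.

```python
# Run detection tasks across successive environments; when indicators are
# found, run mitigation tasks in the same environment, then re-check until
# no environment reports any known indicator of infection.
ENVIRONMENTS = ["live", "recovery", "secondary_os"]

def remediation_loop(run_task, detect_tasks, mitigate_tasks, max_rounds=5):
    for _ in range(max_rounds):
        infected = False
        for env in ENVIRONMENTS:
            results = [run_task(t, env) for t in detect_tasks]
            if any(results):          # indicators found in this environment
                infected = True
                for t in mitigate_tasks:
                    run_task(t, env)  # mitigate where the indicator was seen
                break
        if not infected:
            return "clean"            # no known indicators in any environment
    return "needs_manual_review"

# Simulated device state: one registry indicator visible only in recovery.
state = {"recovery": {"run_key"}}

def run_task(task, env):
    kind, artifact = task.split(":", 1)
    if kind == "detect":
        return artifact in state.get(env, set())
    state.get(env, set()).discard(artifact)  # mitigate: remove the artifact
    return True

print(remediation_loop(run_task, ["detect:run_key"], ["mitigate:run_key"]))
# → clean
```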
[0052] Turning to FIGURE 4, FIGURE 4 is an example flowchart illustrating possible operations of a flow 400 that may be associated with the mitigation of malware, in accordance with an embodiment. In an embodiment, one or more operations of flow 400 may be performed by security module 118, malware detection module 120, malware mitigation module 122, network security module 130, and pattern behavior generation module 132. At 402, malware is allowed to run. At 404, actions performed by the malware are observed and changes to the system are recorded. For example, pattern behavior generation module 132 may observe actions performed by malware and create malware pattern behavior 124. At 406, undo actions that reverse the changes to the system caused by the malware are created.
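As a non-limiting sketch of this record-and-undo flow, the system state before and after the malware runs can be diffed, and undo actions derived from the diff. The flat dictionary standing in for files and registry entries is an illustrative simplification.

```python
# Diff two snapshots of system state and derive undo actions that reverse
# each change the malware made.
def diff_and_undo(before: dict, after: dict) -> list:
    undo = []
    for key in after:
        if key not in before:
            undo.append(("delete", key))                 # malware created it
        elif after[key] != before[key]:
            undo.append(("restore", key, before[key]))   # modified: restore value
    for key in before:
        if key not in after:
            undo.append(("recreate", key, before[key]))  # deleted: put it back
    return undo

before = {"hosts": "127.0.0.1 localhost", "run_key": ""}
after  = {"hosts": "127.0.0.1 evil.example", "run_key": "", "mw.exe": "<binary>"}
print(diff_and_undo(before, after))
# → [('restore', 'hosts', '127.0.0.1 localhost'), ('delete', 'mw.exe')]
```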
[0053] Turning to FIGURE 5, FIGURE 5 is an example flowchart illustrating possible operations of a flow 500 that may be associated with the mitigation of malware, in accordance with an embodiment. In an embodiment, one or more operations of flow 500 may be performed by security module 118, malware detection module 120, malware mitigation module 122, network security module 130, and pattern behavior generation module 132. At 502, malware has changed a system. For example, malware detection module 120 may have determined that malware changed or altered electronic device 102a. At 504, the malware is identified. For example, using analysis log 126, the malware may be identified. Analysis log 126 can be created by malware detection module 120 by logging changes to electronic device 102a. At 506, malware pattern behavior is determined. At 508, reverse malware behavior actions are determined. For example, reverse malware behavior actions 128 are determined. At 510, the reverse malware behavior actions are performed to undo the changes the malware made to the system. For example, malware mitigation module 122 may execute reverse malware behavior actions 128 to undo the changes the malware made to electronic device 102a.
[0054] FIGURE 6 illustrates a computing system 600 that is arranged in a point-to-point (PtP) configuration according to an embodiment. In particular, FIGURE 6 shows a system where processors, memory, and input/output devices are interconnected by a number of point-to-point interfaces. Generally, one or more of the network elements of communication systems 100a-100c may be configured in the same or similar manner as computing system 600.
[0055] As illustrated in FIGURE 6, system 600 may include several processors, of which only two, processors 670 and 680, are shown for clarity. While two processors 670 and 680 are shown, it is to be understood that an embodiment of system 600 may also include only one such processor. Processors 670 and 680 may each include a set of cores (i.e., processor cores 674A and 674B and processor cores 684A and 684B) to execute multiple threads of a program. The cores may be configured to execute instruction code in a manner similar to that discussed above with reference to FIGURES 1-5. Each processor 670, 680 may include at least one shared cache 671, 681. Shared caches 671, 681 may store data (e.g., instructions) that are utilized by one or more components of processors 670, 680, such as processor cores 674 and 684.
[0056] Processors 670 and 680 may also each include integrated memory controller logic (MC) 672 and 682 to communicate with memory elements 632 and 634. Memory elements 632 and/or 634 may store various data used by processors 670 and 680. In alternative embodiments, memory controller logic 672 and 682 may be discrete logic separate from processors 670 and 680.
[0057] Processors 670 and 680 may be any type of processor and may exchange data via a point-to-point (PtP) interface 650 using point-to-point interface circuits 678 and 688, respectively. Processors 670 and 680 may each exchange data with a chipset 690 via individual point-to-point interfaces 652 and 654 using point-to-point interface circuits 676, 686, 694, and 698. Chipset 690 may also exchange data with a high-performance graphics circuit 638 via a high-performance graphics interface 639, using an interface circuit 692, which could be a PtP interface circuit. In alternative embodiments, any or all of the PtP links illustrated in FIGURE 6 could be implemented as a multi-drop bus rather than a PtP link.
[0058] Chipset 690 may be in communication with a bus 620 via an interface circuit 696. Bus 620 may have one or more devices that communicate over it, such as a bus bridge 618 and I/O devices 616. Via a bus 610, bus bridge 618 may be in communication with other devices such as a keyboard/mouse 612 (or other input devices such as a touch screen, trackball, etc.), communication devices 626 (such as modems, network interface devices, or other types of communication devices that may communicate through a computer network 660), audio I/O devices 614, and/or a data storage device 628. Data storage device 628 may store code 630, which may be executed by processors 670 and/or 680. In alternative embodiments, any portions of the bus architectures could be implemented with one or more PtP links.
[0059] The computer system depicted in FIGURE 6 is a schematic illustration of an embodiment of a computing system that may be utilized to implement various embodiments discussed herein. It will be appreciated that various components of the system depicted in FIGURE 6 may be combined in a system-on-a-chip (SoC) architecture or in any other suitable configuration. For example, embodiments disclosed herein can be incorporated into systems including mobile devices such as smart cellular telephones, tablet computers, personal digital assistants, portable gaming devices, etc. It will be appreciated that these mobile devices may be provided with SoC architectures in at least some embodiments.
[0060] Turning to FIGURE 7, FIGURE 7 is a simplified block diagram associated with an example ARM ecosystem SOC 700 of the present disclosure. At least one example implementation of the present disclosure can include the mitigation of malware features discussed herein and an ARM component. For example, the example of FIGURE 7 can be associated with any ARM core (e.g., A-7, A-15, etc.). Further, the architecture can be part of any type of tablet, smartphone (inclusive of Android™ phones, iPhones™), iPad™, Google Nexus™, Microsoft Surface™, personal computer, server, video processing components, laptop computer (inclusive of any type of notebook), Ultrabook™ system, any type of touch-enabled input device, etc.
[0061] In this example of FIGURE 7, ARM ecosystem SOC 700 may include multiple cores 706-707, an L2 cache control 708, a bus interface unit 709, an L2 cache 710, a graphics processing unit (GPU) 715, an interconnect 702, a video codec 720, and a liquid crystal display (LCD) I/F 725, which may be associated with mobile industry processor interface (MIPI)/high-definition multimedia interface (HDMI) links that couple to an LCD.
[0062] ARM ecosystem SOC 700 may also include a subscriber identity module (SIM) I/F 730, a boot read-only memory (ROM) 735, a synchronous dynamic random access memory (SDRAM) controller 740, a flash controller 745, a serial peripheral interface (SPI) master 750, a suitable power control 755, a dynamic RAM (DRAM) 760, and flash 765. In addition, one or more example embodiments include one or more communication capabilities, interfaces, and features such as instances of Bluetooth™ 770, a 3G modem 775, a global positioning system (GPS) 780, and an 802.11 Wi-Fi 785.
[0063] In operation, the example of FIGURE 7 can offer processing capabilities, along with relatively low power consumption to enable computing of various types (e.g., mobile computing, high-end digital home, servers, wireless infrastructure, etc.). In addition, such an architecture can enable any number of software applications (e.g., Android™, Adobe® Flash® Player, Java Platform Standard Edition (Java SE), JavaFX, Linux, Microsoft Windows Embedded, Symbian and Ubuntu, etc.). In at least one example embodiment, the core processor may implement an out-of-order superscalar pipeline with a coupled low-latency level-2 cache.
[0064] FIGURE 8 illustrates a processor core 800 according to an embodiment. Processor core 800 may be the core for any type of processor, such as a micro-processor, an embedded processor, a digital signal processor (DSP), a network processor, or other device to execute code. Although only one processor core 800 is illustrated in FIGURE 8, a processor may alternatively include more than one of the processor core 800 illustrated in FIGURE 8. For example, processor core 800 represents one example embodiment of processor cores 674A, 674B, 684A, and 684B shown and described with reference to processors 670 and 680 of FIGURE 6. Processor core 800 may be a single-threaded core or, for at least one embodiment, processor core 800 may be multithreaded in that it may include more than one hardware thread context (or "logical processor") per core.
[0065] FIGURE 8 also illustrates a memory 802 coupled to processor core 800 in accordance with an embodiment. Memory 802 may be any of a wide variety of memories (including various layers of memory hierarchy) as are known or otherwise available to those of skill in the art. Memory 802 may include code 804, which may be one or more instructions, to be executed by processor core 800. Processor core 800 can follow a program sequence of instructions indicated by code 804. Each instruction enters a front-end logic 806 and is processed by one or more decoders 808. The decoder may generate, as its output, a micro operation such as a fixed width micro operation in a predefined format, or may generate other instructions, microinstructions, or control signals that reflect the original code instruction. Front-end logic 806 also includes register renaming logic 810 and scheduling logic 812, which generally allocate resources and queue the operation corresponding to the instruction for execution.
[0066] Processor core 800 can also include execution logic 814 having a set of execution units 816-1 through 816-N. Some embodiments may include a number of execution units dedicated to specific functions or sets of functions. Other embodiments may include only one execution unit or one execution unit that can perform a particular function. Execution logic 814 performs the operations specified by code instructions. [0067] After completion of execution of the operations specified by the code instructions, back-end logic 818 can retire the instructions of code 804. In one embodiment, processor core 800 allows out of order execution but requires in order retirement of instructions. Retirement logic 820 may take a variety of known forms (e.g., re-order buffers or the like). In this manner, processor core 800 is transformed during execution of code 804, at least in terms of the output generated by the decoder, hardware registers and tables utilized by register renaming logic 810, and any registers (not shown) modified by execution logic 814.
[0068] Although not illustrated in FIGURE 8, a processor may include other elements on a chip with processor core 800, at least some of which were shown and described herein with reference to FIGURE 6. For example, as shown in FIGURE 6, a processor may include memory control logic along with processor core 800. The processor may include I/O control logic and/or may include I/O control logic integrated with memory control logic.
[0069] Note that with the examples provided herein, interaction may be described in terms of two, three, or more network elements. However, this has been done for purposes of clarity and example only. In certain cases, it may be easier to describe one or more of the functionalities of a given set of flows by only referencing a limited number of network elements. It should be appreciated that communication systems 100a-100c and their teachings are readily scalable and can accommodate a large number of components, as well as more complicated/sophisticated arrangements and configurations. Accordingly, the examples provided should not limit the scope or inhibit the broad teachings of communication systems 100a-100c as potentially applied to a myriad of other architectures.
[0070] It is also important to note that the operations in the preceding flow diagrams (i.e., FIGURES 4 and 5) illustrate only some of the possible correlating scenarios and patterns that may be executed by, or within, communication systems 100a-100c. Some of these operations may be deleted or removed where appropriate, or these operations may be modified or changed considerably without departing from the scope of the present disclosure. In addition, a number of these operations have been described as being executed concurrently with, or in parallel to, one or more additional operations. However, the timing of these operations may be altered considerably. The preceding operational flows have been offered for purposes of example and discussion. Substantial flexibility is provided by communication systems 100a-100c in that any suitable arrangements, chronologies, configurations, and timing mechanisms may be provided without departing from the teachings of the present disclosure.
[0071] Although the present disclosure has been described in detail with reference to particular arrangements and configurations, these example configurations and arrangements may be changed significantly without departing from the scope of the present disclosure. Moreover, certain components may be combined, separated, eliminated, or added based on particular needs and implementations. Additionally, although communication systems 100a-100c have been illustrated with reference to particular elements and operations that facilitate the communication process, these elements and operations may be replaced by any suitable architecture, protocols, and/or processes that achieve the intended functionality of communication systems 100a-100c.
[0072] Numerous other changes, substitutions, variations, alterations, and modifications may be ascertained to one skilled in the art and it is intended that the present disclosure encompass all such changes, substitutions, variations, alterations, and modifications as falling within the scope of the appended claims. In order to assist the United States Patent and Trademark Office (USPTO) and, additionally, any readers of any patent issued on this application in interpreting the claims appended hereto, Applicant wishes to note that the Applicant: (a) does not intend any of the appended claims to invoke paragraph six (6) of 35 U.S.C. section 112 as it exists on the date of the filing hereof unless the words "means for" or "step for" are specifically used in the particular claims; and (b) does not intend, by any statement in the specification, to limit this disclosure in any way that is not otherwise reflected in the appended claims.
OTHER NOTES AND EXAMPLES
[0073] Example C1 is at least one machine readable storage medium having one or more instructions that when executed by at least one processor cause the at least one processor to allow malware to execute in a system, record changes to the system caused by the execution of the malware, and create detection tasks for the detection of the malware in an electronic device, wherein the detection tasks are at least partially based on the changes to the system caused by the execution of the malware. [0074] In Example C2, the subject matter of Example C1 can optionally include where the detection tasks are created using one or more of pattern matching, global reputation analysis, program emulation, static analysis, and dynamic analysis of the malware.
[0075] In Example C3, the subject matter of any one of Examples C1-C2 can optionally include where the instructions, when executed by the at least one processor, further cause the at least one processor to identify a malware family associated with the malware, wherein the malware family includes family behavior and the detection tasks are partially based on the family behavior.
[0076] In Example C4, the subject matter of any one of Examples C1-C3 can optionally include where the detection tasks are partially based on generic malware behavior.
[0077] In Example C5, the subject matter of any one of Examples C1-C4 can optionally include where the instructions, when executed by the at least one processor, further cause the at least one processor to identify an infected electronic device using the created detection tasks and create mitigation tasks that mitigate the changes to the infected electronic device caused by the malware.
[0078] In Example C6, the subject matter of any one of Examples C1-C5 can optionally include where the mitigation tasks are executed in a recovery environment.
[0079] In Example C7, the subject matter of any one of Examples C1-C6 can optionally include where the mitigation tasks are executed using a secondary operating system on the infected electronic device.
[0080] In Example C8, the subject matter of any one of Examples C1-C7 can optionally include where the secondary operating system is pushed on the infected electronic device as a boot disk.
[0081] In Example A1, an apparatus can include a pattern behavior generation module, where the pattern behavior generation module is configured to allow malware to execute in a system, record changes to the system caused by the execution of the malware, and create detection tasks for the detection of the malware in an electronic device, wherein the detection tasks are at least partially based on the changes to the system caused by the execution of the malware. [0082] In Example A2, the subject matter of Example A1 can optionally include where the detection tasks are created using one or more of pattern matching, global reputation analysis, program emulation, static analysis, and dynamic analysis of the malware.
[0083] In Example A3, the subject matter of any one of Examples A1-A2 can optionally include where the pattern behavior generation module is further configured to identify a malware family associated with the malware, wherein the malware family includes family behavior and the detection tasks are partially based on the family behavior.
[0084] In Example A4, the subject matter of any one of Examples A1-A3 can optionally include where the detection tasks are partially based on generic malware behavior.
[0085] In Example A5, the subject matter of any one of Examples A1-A4 can optionally include where the pattern behavior generation module is further configured to communicate the detection tasks to a security module, where the security module is configured to identify an infected electronic device using the created detection tasks and create mitigation tasks that mitigate the changes to the infected electronic device caused by the malware.
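The security module's role in Example A5 — apply the detection tasks to a device's observed state and, for each hit, emit a task that reverses the corresponding change — can be sketched like this. The function names and task dictionaries are hypothetical, and a flat path-to-hash map stands in for real device state:

```python
def identify_infected(device_state, detection_tasks):
    """Run detection tasks against a device's observed state; the device
    is flagged as infected if any task matches."""
    hits = []
    for task in detection_tasks:
        if task["type"] == "match_hash":
            if device_state.get(task["path"]) == task["hash"]:
                hits.append(task)
        elif task["type"] == "check_missing":
            if task["path"] not in device_state:
                hits.append(task)
    return hits

def create_mitigation_tasks(hits, clean_state):
    """For each matched artifact, emit a task that reverses the change:
    delete files the malware created, restore files it modified or deleted."""
    tasks = []
    for hit in hits:
        path = hit["path"]
        if hit["type"] == "match_hash" and path not in clean_state:
            tasks.append({"action": "delete", "path": path})
        else:
            tasks.append({"action": "restore", "path": path})
    return tasks
```

The resulting mitigation tasks could then be executed in a recovery environment or from a secondary operating system, as in Examples A6-A8, so the infected primary operating system never has to be trusted to clean itself.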
[0086] In Example A6, the subject matter of any one of Examples A1-A5 can optionally include where the mitigation tasks are executed in a recovery environment.
[0087] In Example A7, the subject matter of any one of Examples A1-A6 can optionally include where the mitigation tasks are executed using a secondary operating system on the infected electronic device.
[0088] In Example A8, the subject matter of any one of Examples A1-A7 can optionally include where the secondary operating system is pushed on the infected electronic device as a boot disk.
[0089] Example M1 is a method including allowing malware to execute in a system, recording changes to the system caused by the execution of the malware, and creating detection tasks for the detection of the malware in an electronic device, wherein the detection tasks are at least partially based on the changes to the system caused by the execution of the malware.
[0090] In Example M2, the subject matter of Example M1 can optionally include where the detection tasks are created using one or more of pattern matching, global reputation analysis, program emulation, static analysis, and dynamic analysis of the malware.
[0091] In Example M3, the subject matter of any one of the Examples M1-M2 can optionally include identifying a malware family associated with the malware, wherein the malware family includes family behavior and the detection tasks are partially based on the family behavior.
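One illustrative way to realize the family identification of Example M3 is to compare the set of paths changed by the observed sample against known family behavior profiles using a set-similarity score. This is a hedged sketch under assumed data shapes: `identify_malware_family`, the profile dictionary, and the Jaccard threshold are all hypothetical choices, not taken from the specification:

```python
def identify_malware_family(observed_paths, family_profiles, threshold=0.5):
    """Match the observed set of changed paths against known family
    behavior profiles (family name -> list of characteristic paths).
    Returns the best-matching family whose Jaccard similarity meets the
    threshold, or None if no family matches well enough."""
    observed = set(observed_paths)
    best_family, best_score = None, 0.0
    for family, profile in family_profiles.items():
        profile = set(profile)
        union = observed | profile
        score = len(observed & profile) / len(union) if union else 0.0
        if score > best_score:
            best_family, best_score = family, score
    return best_family if best_score >= threshold else None
```

Once a family is identified, detection tasks for that family's known behavior can be added alongside the tasks derived from the observed sample, so the tasks are "partially based on the family behavior" as the example describes.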
[0092] In Example M4, the subject matter of any one of the Examples M1-M3 can optionally include the detection tasks are partially based on generic malware behavior.
[0093] In Example M5, the subject matter of any one of the Examples M1-M4 can optionally include identifying an infected electronic device using the created detection tasks and creating mitigation tasks that mitigate the changes to the infected electronic device caused by the malware.
[0094] In Example M6, the subject matter of any one of the Examples M1-M5 can optionally include where the mitigation tasks are executed in a recovery environment.
[0095] In Example M7, the subject matter of any one of the Examples M1-M6 can optionally include where the mitigation tasks are executed using a secondary operating system on the infected electronic device.
[0096] Example S1 is a system for remediation of malware, the system including a pattern behavior generation module configured to allow malware to execute in a system, record changes to the system caused by the execution of the malware, and create detection tasks for the detection of the malware in an electronic device, where the detection tasks are at least partially based on the changes to the system caused by the execution of the malware, and a security module configured to identify an infected electronic device using the created detection tasks and create mitigation tasks that mitigate the changes to the infected electronic device caused by the malware.
[0097] In Example S2, the subject matter of Example S1 can optionally include where the detection tasks are created using one or more of pattern matching, global reputation analysis, program emulation, static analysis, and dynamic analysis of the malware.
[0098] Example X1 is a machine-readable storage medium including machine-readable instructions to implement a method or realize an apparatus as in any one of the Examples A1-A8 or M1-M7. Example Y1 is an apparatus comprising means for performing any of the Example methods M1-M7. In Example Y2, the subject matter of Example Y1 can optionally include the means for performing the method comprising a processor and a memory. In Example Y3, the subject matter of Example Y2 can optionally include the memory comprising machine-readable instructions.

Claims

CLAIMS:
1. At least one computer-readable medium comprising one or more instructions that when executed by at least one processor, cause the at least one processor to:
allow malware to execute in a system;
record changes to the system caused by the execution of the malware; and
create detection tasks for the detection of the malware in an electronic device, wherein the detection tasks are at least partially based on the changes to the system caused by the execution of the malware.
2. The at least one computer-readable medium of Claim 1, wherein the detection tasks are created using one or more of pattern matching, global reputation analysis, program emulation, static analysis, and dynamic analysis of the malware.
3. The at least one computer-readable medium of any of Claims 1 and 2, further comprising one or more instructions that when executed by the at least one processor, cause the at least one processor to:
identify a malware family associated with the malware, wherein the malware family includes family behavior and the detection tasks are partially based on the family behavior.
4. The at least one computer-readable medium of any of Claims 1-3, wherein the detection tasks are partially based on generic malware behavior.
5. The at least one computer-readable medium of any of Claims 1-4, further comprising one or more instructions that when executed by the at least one processor, cause the at least one processor to:
identify an infected electronic device using the created detection tasks; and
create mitigation tasks that mitigate the changes to the infected electronic device caused by the malware.
6. The at least one computer-readable medium of any of Claims 1-5, wherein the mitigation tasks are executed in a recovery environment.
7. The at least one computer-readable medium of any of Claims 1-6, wherein the mitigation tasks are executed using a secondary operating system on the infected electronic device.
8. The at least one computer-readable medium of any of Claims 1-7, wherein the secondary operating system is pushed on the infected electronic device as a boot disk.
9. An apparatus comprising:
a pattern behavior generation module configured to:
allow malware to execute in a system;
record changes to the system caused by the execution of the malware; and
create detection tasks for the detection of the malware in an electronic device, wherein the detection tasks are at least partially based on the changes to the system caused by the execution of the malware.
10. The apparatus of Claim 9, wherein the detection tasks are created using one or more of pattern matching, global reputation analysis, program emulation, static analysis, and dynamic analysis of the malware.
11. The apparatus of any of Claims 9 and 10, wherein the pattern behavior generation module is further configured to:
identify a malware family associated with the malware, wherein the malware family includes family behavior and the detection tasks are partially based on the family behavior.
12. The apparatus of any of Claims 9-11, wherein the detection tasks are partially based on generic malware behavior.
13. The apparatus of any of Claims 9-12, wherein the pattern behavior generation module is further configured to:
communicate the detection tasks to a security module, wherein the security module is configured to:
identify an infected electronic device using the created detection tasks; and
create mitigation tasks that mitigate the changes to the infected electronic device caused by the malware.
14. The apparatus of any of Claims 9-13, wherein the mitigation tasks are executed in a recovery environment.
15. The apparatus of any of Claims 9-14, wherein the mitigation tasks are executed using a secondary operating system on the infected electronic device.
16. The apparatus of any of Claims 9-15, wherein the secondary operating system is pushed on the infected electronic device as a boot disk.
17. A method comprising:
executing malware in a system;
recording changes to the system caused by the execution of the malware; and
creating detection tasks for the detection of the malware in an electronic device, wherein the detection tasks are at least partially based on the changes to the system caused by the execution of the malware.
18. The method of Claim 17, wherein the detection tasks are created using one or more of pattern matching, global reputation analysis, program emulation, static analysis, and dynamic analysis of the malware.
19. The method of any of Claims 17 and 18, further comprising:
identifying a malware family associated with the malware, wherein the malware family includes family behavior and the detection tasks are partially based on the family behavior.
20. The method of any of Claims 17-19, wherein the detection tasks are partially based on generic malware behavior.
21. The method of any of Claims 17-20, further comprising:
identifying an infected electronic device using the created detection tasks; and
creating mitigation tasks that mitigate the changes to the infected electronic device caused by the malware.
22. The method of any of Claims 17-21, wherein the mitigation tasks are executed in a recovery environment.
23. The method of any of Claims 17-22, wherein the mitigation tasks are executed using a secondary operating system on the infected electronic device.
24. A system for the remediation of malware, the system comprising:
a pattern behavior generation module configured to:
allow malware to execute in a system;
record changes to the system caused by the execution of the malware;
create detection tasks for the detection of the malware in an electronic device, wherein the detection tasks are at least partially based on the changes to the system caused by the execution of the malware; and
a security module configured to:
identify an infected electronic device using the created detection tasks; and
create mitigation tasks that mitigate the changes to the infected electronic device caused by the malware.
25. The system of Claim 24, wherein the detection tasks are created using one or more of pattern matching, global reputation analysis, program emulation, static analysis, and dynamic analysis of the malware.
PCT/US2016/033846 2015-06-27 2016-05-24 Mitigation of malware WO2017003580A1 (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
EP16818395.2A EP3314509A4 (en) 2015-06-27 2016-05-24 Mitigation of malware
CN201680037878.XA CN108064384A (en) 2015-06-27 2016-05-24 The mitigation of Malware
JP2017567410A JP6668390B2 (en) 2015-06-27 2016-05-24 Malware mitigation

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
IN3247/CHE/2015 2015-06-27
IN3247CH2015 2015-06-27

Publications (1)

Publication Number Publication Date
WO2017003580A1 (en) 2017-01-05

Family

ID=57608987

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2016/033846 WO2017003580A1 (en) 2015-06-27 2016-05-24 Mitigation of malware

Country Status (4)

Country Link
EP (1) EP3314509A4 (en)
JP (2) JP6668390B2 (en)
CN (1) CN108064384A (en)
WO (1) WO2017003580A1 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2022012821A1 (en) * 2020-07-15 2022-01-20 British Telecommunications Public Limited Company Computer-implemented automatic security methods and systems
WO2022012822A1 (en) * 2020-07-15 2022-01-20 British Telecommunications Public Limited Company Computer-implemented automatic security methods and systems

Families Citing this family (4)

Publication number Priority date Publication date Assignee Title
CN109492399B (en) * 2019-01-17 2022-02-01 腾讯科技(深圳)有限公司 Risk file detection method and device and computer equipment
KR102308477B1 (en) * 2020-12-07 2021-10-06 주식회사 샌즈랩 Method for Generating Information of Malware Which Describes the Attack Charateristics of the Malware
CN113722705B (en) * 2021-11-02 2022-02-08 北京微步在线科技有限公司 Malicious program clearing method and device
CN113722714A (en) * 2021-11-03 2021-11-30 北京微步在线科技有限公司 Network threat processing method and device

Citations (5)

Publication number Priority date Publication date Assignee Title
US20080256633A1 (en) * 2002-05-08 2008-10-16 International Business Machines Corporation Method and Apparatus for Determination of the Non-Replicative Behavior of a Malicious Program
US20110271341A1 (en) * 2010-04-28 2011-11-03 Symantec Corporation Behavioral signature generation using clustering
US20120144488A1 (en) * 2010-12-01 2012-06-07 Symantec Corporation Computer virus detection systems and methods
US20130333033A1 (en) * 2012-06-06 2013-12-12 Empire Technology Development Llc Software protection mechanism
US20140130157A1 (en) * 2010-01-27 2014-05-08 Ahmed Said Sallam Method and system for discrete stateful behavioral analysis

Family Cites Families (9)

Publication number Priority date Publication date Assignee Title
JP4755658B2 (en) * 2008-01-30 2011-08-24 日本電信電話株式会社 Analysis system, analysis method and analysis program
JP2010049627A (en) * 2008-08-25 2010-03-04 Hitachi Software Eng Co Ltd Computer virus detection system
US8667583B2 (en) * 2008-09-22 2014-03-04 Microsoft Corporation Collecting and analyzing malware data
US8479286B2 (en) * 2009-12-15 2013-07-02 Mcafee, Inc. Systems and methods for behavioral sandboxing
CN102314561B (en) * 2010-07-01 2014-07-23 电子科技大学 Automatic analysis method and system of malicious codes based on API (application program interface) HOOK
US8756693B2 (en) * 2011-04-05 2014-06-17 The United States Of America As Represented By The Secretary Of The Air Force Malware target recognition
US8677493B2 (en) * 2011-09-07 2014-03-18 Mcafee, Inc. Dynamic cleaning for malware using cloud technology
CN103186740B (en) * 2011-12-27 2015-09-23 北京大学 A kind of automated detection method of Android malware
US9591003B2 (en) * 2013-08-28 2017-03-07 Amazon Technologies, Inc. Dynamic application security verification

Patent Citations (5)

Publication number Priority date Publication date Assignee Title
US20080256633A1 (en) * 2002-05-08 2008-10-16 International Business Machines Corporation Method and Apparatus for Determination of the Non-Replicative Behavior of a Malicious Program
US20140130157A1 (en) * 2010-01-27 2014-05-08 Ahmed Said Sallam Method and system for discrete stateful behavioral analysis
US20110271341A1 (en) * 2010-04-28 2011-11-03 Symantec Corporation Behavioral signature generation using clustering
US20120144488A1 (en) * 2010-12-01 2012-06-07 Symantec Corporation Computer virus detection systems and methods
US20130333033A1 (en) * 2012-06-06 2013-12-12 Empire Technology Development Llc Software protection mechanism

Non-Patent Citations (1)

Title
See also references of EP3314509A4 *


Also Published As

Publication number Publication date
CN108064384A (en) 2018-05-22
JP2018524720A (en) 2018-08-30
JP6668390B2 (en) 2020-03-18
JP2020113290A (en) 2020-07-27
EP3314509A1 (en) 2018-05-02
EP3314509A4 (en) 2018-12-05

Similar Documents

Publication Publication Date Title
US20210019411A1 (en) Mitigation of ransomware
US11870793B2 (en) Determining a reputation for a process
US11328063B2 (en) Identification of malicious execution of a process
JP6668390B2 (en) Malware mitigation
US20200272733A1 (en) Malware detection using a digital certificate
US9961102B2 (en) Detection of stack pivoting
WO2017052888A1 (en) Enforcement of file characteristics
US10366228B2 (en) Detection and mitigation of malicious invocation of sensitive code
US20160381051A1 (en) Detection of malware
US9984230B2 (en) Profiling event based exploit detection
US20150379268A1 (en) System and method for the tracing and detection of malware
US10129291B2 (en) Anomaly detection to identify malware
US11627145B2 (en) Determining a reputation of data using a data visa including information indicating a reputation
US11386205B2 (en) Detection of malicious polyglot files
US10574672B2 (en) System and method to detect bypass of a sandbox application

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16818395

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2017567410

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

WWE Wipo information: entry into national phase

Ref document number: 2016818395

Country of ref document: EP