WO2023001380A1 - Generation of a security policy for a software program - Google Patents

Generation of a security policy for a software program

Info

Publication number
WO2023001380A1
WO2023001380A1 (PCT/EP2021/070667)
Authority
WO
WIPO (PCT)
Prior art keywords
security policy
access
software program
blocked
processing unit
Prior art date
Application number
PCT/EP2021/070667
Other languages
French (fr)
Inventor
Irena BEREZOVSKY
Dima KUZNETSOV
Natan BROSZTEIN
Omer ANSON
Original Assignee
Huawei Cloud Computing Technologies Co., Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huawei Cloud Computing Technologies Co., Ltd. filed Critical Huawei Cloud Computing Technologies Co., Ltd.
Priority to CN202180100566.XA priority Critical patent/CN117730322A/en
Priority to PCT/EP2021/070667 priority patent/WO2023001380A1/en
Publication of WO2023001380A1 publication Critical patent/WO2023001380A1/en

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 21/00 Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F 21/60 Protecting data
    • G06F 21/62 Protecting access to data via a platform, e.g. using keys or access control rules
    • G06F 21/6218 Protecting access to data via a platform, e.g. using keys or access control rules, to a system of files or objects, e.g. local or distributed file system or database
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 21/00 Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F 21/50 Monitoring users, programs or devices to maintain the integrity of platforms, e.g. of processors, firmware or operating systems
    • G06F 21/57 Certifying or maintaining trusted computer platforms, e.g. secure boots or power-downs, version controls, system software checks, secure updates or assessing vulnerabilities
    • G06F 21/577 Assessing vulnerabilities and evaluating computer system security
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L 63/00 Network architectures or network communication protocols for network security
    • H04L 63/20 Network architectures or network communication protocols for network security for managing network security; network security policies in general

Definitions

  • In software development it is recommended to design security as part of the development process of a software program; however, in practice this is frequently not done. Without automatic processes for generating a security policy during software development, a security expert typically specifies a security policy for the software program manually, after development is completed. Even when working in cooperation with a software developer, such a manually specified security policy is error prone and tends to be too lenient, allowing access to a greater amount of computer resources than is actually needed for correct operation of the software program. This leniency is often a result of the difficulty of achieving fine-grained identification of the computer resources needed by the software program. For example, a software program may require access to an identified number of ports of a digital communication network.
  • a manually generated security policy may be accurate and comprise explicit permission for the software program to access each of the ports.
  • Alternatively, a manually generated security policy may comprise permission for the software program to access a range of ports, some of which are not necessary for the operation of the software program, thus increasing a security risk in a system configured to execute the manually generated security policy. This is an example of an inaccurate security policy.
  • a system is configured for executing an identified security policy blocking access by a software program to a plurality of blocked computer resources, and execution of the software program by the system is monitored to produce data describing a plurality of accesses of the software program to a plurality of computer resources of the system.
  • Monitoring execution of the software program by the system comprises for example monitoring of a log of the system, for example a Security-Enhanced Linux (SELinux) audit-log, for example for a virtual machine executing a flavor of the Linux operating system.
  • monitoring execution of the software program comprises executing a command, for example for the purpose of collecting system information.
  • the present disclosure proposes, in some embodiments, identifying in the plurality of accesses at least one blocked access by the software program to one or more of the plurality of blocked resources, and computing a modified security policy to unblock access by the software program to the one or more blocked resources.
  • Computing the modified security policy optionally comprises modifying the identified security policy to unblock access by the software program to the one or more blocked resources.
  • Executing the software program in a system configured to execute a security policy blocking access by the software program to a plurality of blocked computer resources allows identifying one or more blocked computer resources required by the software program while access to other blocked computer resources of the plurality of blocked computer resources remains blocked.
  • Computing the modified security policy to unblock access by the software program to the one or more blocked computer resources allows unblocking access to the one or more blocked resources that the software program accessed while still blocking access by the software program to the other blocked computer resources.
  • Modifying a security policy to block access by the software program to the other blocked computer resources while allowing access to the one or more blocked resources the software program accessed increases accuracy of the modified security policy, thus increasing security of a system executing the modified security policy.
  • the present disclosure proposes repeating the steps described above iteratively, such that in each of a sequence of policy building iterations the system is configured to execute the modified security policy as the identified security policy in a next iteration of the sequence of policy building iterations.
  • Configuring the system to execute the modified security policy as the identified security policy in a next iteration facilitates an incremental generation of a security policy for the software program, as in the next iteration access to the one or more blocked resources is allowed and thus access to the one or more blocked resources will not be identified as a blocked access.
  • Such incremental generation of the security policy increases accuracy of the security policy, as the incremental generation allows fine-grained identification of the blocked resources to which the software program needs access, compared to a manual examination of the code or relying on communication between a software developer and a security expert to identify blocked resources required by the software program.
  • Fine-grained identification of the blocked resources to which the software program needs access increases accuracy of a modified security policy, increasing the amount of blocked resources that the software program needs access to and that are unblocked in the modified security policy, while reducing the amount of other blocked resources that the software program did not try to access and that are unblocked in the modified security policy.
  • the plurality of blocked resources is the plurality of computer resources of the system, such that the identified security policy blocks access to the plurality of computer resources of the system.
  • a security policy that blocks access to the plurality of computer resources of the system increases accuracy of a modified security policy as this reduces a likelihood of the modified security policy unblocking access to a blocked resource not needed by the software program.
  • when no blocked access is identified in an iteration, the identified security policy used in the iteration is determined to be a preferred security policy, and may be provided to one or more users of the system, for example a security expert of the system or a software developer of the software program.
  • the preferred security policy may be stored in a non-volatile digital storage of the system.
  • providing the preferred security policy comprises adding a log entry, for example to a log of the system or to a log file.
  • Other optional methods of providing the preferred security policy include sending a message via a digital communication network, and displaying a message on a display device of the system.
  • Some examples of a message are an electronic-mail (email) message and a message on an instant messaging service, some examples being Slack and Signal.
  • a preferred security policy may be added to a software product of the software program.
  • the present disclosure proposes identifying in the monitoring data one or more unexpected outcomes of executing the software program.
  • Such an unexpected outcome may be unrelated to the plurality of accesses, that is, the unexpected outcome cannot be explained by a blocked access to a blocked resource, or it occurs when there are no blocked accesses.
  • Such an unexpected outcome may indicate an error in the implementation of the software program (colloquially known as a bug).
  • a notification of the one or more unexpected outcomes is provided to another user of the system, for example a quality assurance professional.
  • Embodiments may be a system, a method, and/or a computer program product.
  • the computer program product may include a computer readable storage medium (or media) having computer readable program instructions thereon for causing a processor to carry out aspects of the embodiments.
  • the computer readable storage medium can be a tangible device that can retain and store instructions for use by an instruction execution device.
  • the computer readable storage medium may be, for example, but is not limited to, an electronic storage device, a magnetic storage device, an optical storage device, an electromagnetic storage device, a semiconductor storage device, or any suitable combination of the foregoing.
  • a non-exhaustive list of more specific examples of the computer readable storage medium includes the following: a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or Flash memory), a static random access memory (SRAM), a portable compact disc read-only memory (CD-ROM), a digital versatile disk (DVD), a memory stick, a floppy disk, and any suitable combination of the foregoing.
  • a computer readable storage medium is not to be construed as being transitory signals per se, such as radio waves or other freely propagating electromagnetic waves, electromagnetic waves propagating through a waveguide or other transmission media (e.g., light pulses passing through a fiber-optic cable), or electrical signals transmitted through a wire.
  • Computer readable program instructions described herein can be downloaded to respective computing/processing devices from a computer readable storage medium or to an external computer or external storage device via a network, for example, the Internet, a local area network, a wide area network and/or a wireless network.
  • the network may comprise copper transmission cables, optical transmission fibers, wireless transmission, routers, firewalls, switches, gateway computers and/or edge servers.
  • a network adapter card or network interface in each computing/processing device receives computer readable program instructions from the network and forwards the computer readable program instructions for storage in a computer readable storage medium within the respective computing/processing device.
  • Computer readable program instructions for carrying out operations of embodiments may be assembler instructions, instruction-set-architecture (ISA) instructions, machine instructions, machine dependent instructions, microcode, firmware instructions, state-setting data, or either source code or object code, natively compiled or compiled just-in-time (JIT), written in any combination of one or more programming languages, including an object oriented programming language such as Smalltalk, C++, Java, Object-Oriented Fortran or the like, an interpreted programming language such as JavaScript, Python or the like, and conventional procedural programming languages, such as the "C" programming language, Fortran, or similar programming languages.
  • the computer readable program instructions may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer or entirely on the remote computer or server.
  • the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet Service Provider).
  • electronic circuitry including, for example, programmable logic circuitry, field-programmable gate arrays (FPGA), or programmable logic arrays (PLA) may execute the computer readable program instructions by utilizing state information of the computer readable program instructions to personalize the electronic circuitry, in order to perform aspects of embodiments.
  • These computer readable program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.
  • These computer readable program instructions may also be stored in a computer readable storage medium that can direct a computer, a programmable data processing apparatus, and/or other devices to function in a particular manner, such that the computer readable storage medium having instructions stored therein comprises an article of manufacture including instructions which implement aspects of the function/act specified in the flowchart and/or block diagram block or blocks.
  • the computer readable program instructions may also be loaded onto a computer, other programmable data processing apparatus, or other device to cause a series of operational steps to be performed on the computer, other programmable apparatus or other device to produce a computer implemented process, such that the instructions which execute on the computer, other programmable apparatus, or other device implement the functions/acts specified in the flowchart and/or block diagram block or blocks.
  • each block in the flowchart or block diagrams may represent a module, segment, or portion of instructions, which comprises one or more executable instructions for implementing the specified logical function(s).
  • the functions noted in the block may occur out of the order noted in the figures.
  • two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved.
  • In 101, a system is configured to execute an identified security policy governing access to a plurality of computer resources of the system.
  • the identified security policy allows a software program access to a plurality of unblocked resources of the system’s plurality of computer resources.
  • the identified security policy blocks access by the software program to a plurality of blocked resources of the system’s plurality of computer resources.
  • the software program accesses some of the system’s plurality of computer resources.
  • the software program may access one or more of the plurality of unblocked resources, to which access is allowed (i.e., unblocked).
  • access by the software program to one or more blocked resources of the plurality of blocked resources is blocked.
  • execution of the software program by the system is monitored to produce in 111 monitoring data describing a plurality of accesses by the software program to the plurality of computer resources, for example in 102, 103 and 104.
  • the one or more accesses of 104 are identified, i.e. one or more blocked accesses to the one or more blocked resources.
  • at least some of the one or more blocked resources are added to the plurality of unblocked resources, and in 114 a modified security policy is computed allowing the software program access to the one or more blocked resources.
  • the modified security policy is a most restrictive security policy allowing the software program access to the plurality of unblocked resources including the one or more blocked resources added thereto.
  • method 100 is executed in each of a sequence of policy building iterations.
  • the modified security policy is used as the identified security policy, such that when 101 is executed in a next iteration of the sequence of iterations the system is configured to execute the modified security policy.
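  • As a purely illustrative sketch of method 100 (not part of this disclosure), the policy building iterations may be driven by a generic loop such as the following Python pseudocode, in which the callables passed in (apply_policy, run_and_monitor, find_blocked_accesses, unblock) are hypothetical placeholders for the operations described above:

      # Hedged sketch of the iterative flow of method 100; all callables are placeholders.
      def build_policy(initial_policy, apply_policy, run_and_monitor,
                       find_blocked_accesses, unblock, max_iterations=100):
          policy = initial_policy                     # identified security policy (e.g. block everything)
          for _ in range(max_iterations):             # sequence of policy building iterations
              apply_policy(policy)                    # 101: configure the system to execute the policy
              monitoring_data = run_and_monitor()     # 102-104, 111: execute the program, collect accesses
              blocked = find_blocked_accesses(monitoring_data)  # 112: identify blocked accesses
              if not blocked:
                  return policy                       # no blocked access: preferred security policy
              policy = unblock(policy, blocked)       # 113-114: compute the modified security policy
          return policy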
  • system 200 comprises a processing unit 201.
  • Processing unit 201 may be any kind of programmable or non-programmable circuitry that is configured to carry out the operations described in the present disclosure.
  • the processing unit may comprise hardware as well as software.
  • the processing unit may comprise one or more processors and a transitory or non-transitory memory that carries a program which causes the processing unit to perform the respective operations when the program is executed by the one or more processors.
  • processing unit 201 comprises memory 206.
  • processing unit 201 is connected to memory 206.
  • memory 206 is the memory that carries a program executed by processing unit 201.
  • memory 206 is a memory remote to processing unit 201, connected to another processing unit.
  • processing unit 201 is connected to one or more non-volatile digital storage 203.
  • Some examples of a non-volatile digital storage are a hard disk drive, a solid-state drive, a network-connected storage and a network storage.
  • processing unit 201 is connected to one or more digital communication network interface 202.
  • For brevity, the term “network interface” is used to mean “one or more digital communication network interface”.
  • network interface 202 is connected to a local area network (LAN), for example an Ethernet network or a Wi-Fi network.
  • network interface 202 is connected to a wide area network (WAN), for example a cellular network or the Internet.
  • digital storage 203 is connected to processing unit 201 via network interface 202.
  • processing unit 201 is connected to one or more display devices 204, for example a monitor or a flat-panel display.
  • processing unit 201 is connected to one or more devices 205.
  • one or more devices 205 comprise one or more display devices 204.
  • one or more devices 205 comprise network interface 202.
  • one or more devices 205 comprise one or more digital storage 203.
  • Some other examples of a device are a camera, a microphone, a speaker, a touchscreen, and a sensor.
  • system 200 executes the following optional method.
  • processing unit 201 executes a sequence of policy building iterations.
  • processing unit 201 configures system 200 to execute an identified security policy thereby blocking access by the software program to a plurality of blocked computer resources.
  • the plurality of blocked computer resources is at least some of a plurality of computer resources of the system.
  • a computer resource may be an area of memory 206, optionally identified by an address or by a range of addresses.
  • a computer resource is a file, for example a file stored on digital storage 203.
  • a computer resource is one of one or more devices 205.
  • the device is identified by a handle, for example a file descriptor or an identifier.
  • the device is network interface 202.
  • the device is one or more digital storage 203.
  • Another example of a computer resource is a digital communication network resource, for example a port number, a network address of another processing unit or a network socket.
  • Some other examples of a computer resource include, but are not limited to, an inter-process communication access point executed by processing unit 201, and a service provided by an operating system executed by processing unit 201.
  • the service may be a service that requires privileges, for example a service that creates or deletes a process executed by processing unit 201.
  • the identified security policy comprises a plurality of access entries, for example a plurality of access rules each blocking or unblocking access to at least some of the plurality of resources of the system.
  • Another example of an access entry is an iptables chain.
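  • Purely as an illustration (the field names are assumptions, not taken from this disclosure), a security policy comprising a plurality of access entries could be held in memory as a simple list of rules:

      # Hypothetical in-memory representation of a security policy as access entries.
      identified_security_policy = [
          {"resource": "tcp port 443", "action": "unblock"},         # whitelist entry
          {"resource": "/etc/app/config.yaml", "action": "unblock"},
          {"resource": "*", "action": "block"},                       # default entry: block everything else
      ]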
  • Optionally, in a first policy building iteration the plurality of blocked computer resources is the plurality of computer resources of the system. For example, when configuring the system comprises executing iptables, in the first iteration processing unit 201 may execute a command configuring iptables to drop all network traffic by default, as sketched below.
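  • A minimal sketch of such a block-everything first iteration, assuming standard iptables command-line syntax (the exact commands are an assumption, not a quotation from this disclosure):

      import subprocess

      # Assumed standard iptables invocations that set a default-deny (drop-all) policy
      # on the built-in chains for the first policy building iteration.
      for chain in ("INPUT", "OUTPUT", "FORWARD"):
          subprocess.run(["iptables", "-P", chain, "DROP"], check=True)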
  • processing unit 201 further executes the software program.
  • the software program is executed in one or more known scenarios such that executing the software program comprises providing the software program with an identified set of input values.
  • the identified set of input values optionally includes one or more of: a configuration value, a user input value, a provided file, a message sent to the software program, and a graphical user interface interaction.
  • execution of the software program comprises executing the software program in a software testing environment, for example within an integrated development environment (IDE) or within a testing platform.
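  • For example (hypothetical program path and input values), the software program may be launched with an identified set of input values so that its accesses are reproducible across policy building iterations:

      import os
      import subprocess

      # Hypothetical example: run the program under test with an identified set of input values.
      identified_inputs = {"APP_MODE": "batch", "APP_CONFIG": "/opt/app/test.conf"}
      subprocess.run(
          ["/opt/app/program_under_test", "--input", "test_case_1.json"],
          env={**os.environ, **identified_inputs},
          check=False)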
  • IDE integrated development environment
  • processing unit 201 monitors execution of the software program by system 200, producing monitoring data describing a plurality of accesses of the software program to the plurality of computer resources of the system.
  • processing unit 201 monitors a log of system 200, for example an SELinux audit-log.
  • processing unit 201 may configure SELinux to log violations to an identified violation-log file, monitored by processing unit 201 in 320.
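  • A hedged sketch of monitoring such a violation log: on many SELinux systems blocked accesses appear as “avc: denied” records in the audit log (the log path and string matching below are assumptions for illustration only):

      # Assumed default location of the SELinux audit log; SELinux may instead be
      # configured to log violations to a dedicated violation-log file as described above.
      AUDIT_LOG = "/var/log/audit/audit.log"

      def read_denials(log_path=AUDIT_LOG):
          """Return audit records describing blocked accesses (AVC denials)."""
          with open(log_path, "r", errors="replace") as log:
              return [line for line in log if "avc" in line and "denied" in line]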
  • processing unit 201 executes a command, for example a command to collect system data.
  • For example, when processing unit 201 executes a Linux operating system, processing unit 201 may execute the command dmesg.
  • processing unit 201 may create a new iptables chain that logs and drops all traffic, and configure iptables to forward to the new iptables chain all network traffic flows unmatched by other flows configured in iptables.
  • the new iptables chain logs to an identified network log.
  • Such a configuration facilitates identifying all network flows not explicitly addressed otherwise by iptables, and additionally blocks those flows by dropping them.
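  • One possible realization of such a logging catch-all chain, using standard iptables options (a sketch under the assumption that unmatched traffic on the INPUT chain should be logged and dropped):

      import subprocess

      def iptables(*args):
          subprocess.run(["iptables", *args], check=True)

      # Create a new chain that logs and then drops any traffic reaching it.
      iptables("-N", "LOG_AND_DROP")
      iptables("-A", "LOG_AND_DROP", "-j", "LOG", "--log-prefix", "policy-violation: ")
      iptables("-A", "LOG_AND_DROP", "-j", "DROP")
      # Send all traffic not matched by any earlier rule to the new chain.
      iptables("-A", "INPUT", "-j", "LOG_AND_DROP")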
  • processing unit 201 optionally identifies in the plurality of accesses one or more blocked accesses by the software program to one or more of the plurality of blocked resources.
  • processing unit 201 extracts a plurality of access violation entries from the monitoring data. For example, when system 200 implements SELinux, processing unit 201 may provide the identified violation-log file to the audit2allow utility. An output of executing the audit2allow utility on the identified violation-log file is optionally an SELinux policy rule. An access violation entry is optionally an SELinux policy rule produced by executing audit2allow.
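  • For illustration, assuming the standard audit2allow command-line options, the SELinux policy rules could be produced along these lines (a sketch, not the required implementation; the file name and example rule in the comment are hypothetical):

      import subprocess

      # Assumed standard audit2allow usage: read AVC denials from the identified
      # violation-log file and print the corresponding SELinux allow rules.
      result = subprocess.run(
          ["audit2allow", "-i", "violation.log"],
          capture_output=True, text=True, check=True)
      access_violation_rules = result.stdout  # e.g. "allow myapp_t http_port_t:tcp_socket name_connect;"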
  • processing unit 201 optionally analyzes the network log to identify one or more security violations.
  • processing unit 201 identifies in the plurality of access violations one or more access violation entries indicative of the one or more blocked accesses.
  • processing unit 201 optionally determines whether one or more blocked accesses were identified in 330. When one or more blocked accesses are identified, in 341 processing unit 201 optionally computes a modified security policy.
  • processing unit 201 extracts one or more access values from the one or more violation entries identified in 402. Some examples of an access value are a port number, a network address, a file name and a file descriptor name.
  • processing unit 201 optionally computes the modified security policy using the one or more access values. For example, processing unit 201 may use the one or more access values to identify one or more blocking rules in the identified security policy, blocking access by the software program to the one or more blocked resources.
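  • A minimal sketch of extracting access values from violation entries; the log field names and regular expressions shown are assumptions for illustration only:

      import re

      def extract_access_values(violation_entry):
          """Pull illustrative access values (a port number, a file path) from a violation entry."""
          values = {}
          port = re.search(r"DPT=(\d+)", violation_entry)        # assumed iptables LOG field
          if port:
              values["port"] = int(port.group(1))
          path = re.search(r'path="([^"]+)"', violation_entry)   # assumed SELinux AVC field
          if path:
              values["file"] = path.group(1)
          return values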
  • processing unit 201 deletes from the identified security policy one or more of the plurality of access entries, for example an access entry that blocks access to the one or more blocked resources.
  • processing unit 201 identifies the one or more access entries according to the one or more access values.
  • when computing the modified security policy processing unit 201 may modify one or more of the plurality of access entries that block access to the one or more blocked resources.
  • processing unit 201 computes one or more new access entries to allow access by the software program to the one or more blocked resources.
  • processing unit 201 uses the one or more access values to compute the one or more new access entries.
  • the one or more new access entries may be one or more new SELinux rules, for example one or more rules generated by audit2allow.
  • the one or more new access entries are one or more rows in a table, for example in iptables.
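  • Continuing the iptables illustration, a new access entry unblocking an identified port could be inserted ahead of the catch-all logging rule (the port number and rule placement are assumptions):

      import subprocess

      # Insert an ACCEPT rule for the identified port before the catch-all rule, so that
      # in the next iteration this access is no longer identified as a blocked access.
      identified_port = 8443  # hypothetical access value extracted from the monitoring data
      subprocess.run(
          ["iptables", "-I", "INPUT", "1", "-p", "tcp",
           "--dport", str(identified_port), "-j", "ACCEPT"],
          check=True)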
  • a new access entry is a blacklist entry, blocking access to one or more computer resources.
  • a new access entry is a whitelist entry, allowing access to one or more computer resources.
  • the one or more new access entries comprise a combination of whitelist and blacklist entries to allow access by the software program to the one or more blocked resources.
  • For example, when the identified security policy comprises a blacklist rule blocking access to a range of memory addresses and the software program requires access to an identified memory address within that range, processing unit 201 may modify the blacklist rule to block access to addresses in the range of memory addresses less than the identified memory address, and additionally processing unit 201 may add a new rule to block access to addresses in the range of memory addresses greater than the identified memory address.
  • Alternatively, processing unit 201 may delete the blacklist rule and add two new rules, one blocking access to addresses in the range of memory addresses less than the identified memory address and the other blocking access to addresses in the range of memory addresses greater than the identified memory address.
  • processing unit 201 adds the one or more new access entries to the identified security policy.
  • processing unit 201 optionally instructs configuration of system 200 to execute the modified security policy as the identified security policy in a next iteration of the sequence of policy building iterations.
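  • When the modified security policy is expressed as SELinux rules, instructing the configuration for the next iteration could amount to building and installing a policy module, for example with the standard audit2allow and semodule tools (one possible realization, with a hypothetical module name, not a requirement of this disclosure):

      import subprocess

      # Assumed standard tooling: build a loadable policy module from the logged denials
      # and install it, so the next policy building iteration executes the modified policy.
      subprocess.run(["audit2allow", "-i", "violation.log", "-M", "generated_policy"], check=True)
      subprocess.run(["semodule", "-i", "generated_policy.pp"], check=True)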
  • processing unit 201 determines a failure to identify the one or more blocked accesses in 330.
  • FIG. 5 shows a flowchart schematically representing an additional optional flow of operations 500 for generating a security policy, according to some embodiments.
  • processing unit 201 determines that the identified security policy is a preferred security policy.
  • an iteration in which processing unit 201 determines the preferred security policy is a last iteration of the sequence of policy building iterations.
  • processing unit 201 provides the preferred security policy to one or more users of the system, for example a security professional or a software developer.
  • processing unit 201 provides the preferred security policy by saving thereof to one or more digital storage 203.
  • processing unit 201 provides the preferred security policy by sending a message to at least one other processing unit via network interface 202.
  • Some examples of a message sent via a network interface are an email message and a message in an instant messaging service.
  • processing unit 201 provides the preferred security policy by displaying a message on one or more display device 204.
  • processing unit 201 provides the preferred security policy by adding a log entry to a log file or a log of system 200.
  • the log file is a log file of a policy generation process implementing method 100
  • processing unit 201 identifies one or more unexpected outcomes of executing the software program, for example when the software program is provided with an identified set of input values.
  • the one or more unexpected outcomes are identified when the software program is executed in a software testing environment.
  • processing unit 201 identifies the one or more unexpected outcomes when no blocked accesses are identified in 330.
  • processing unit 201 fails to identify an association between the one or more unexpected outcomes and the one or more blocked accesses identified in 330.
  • processing unit 201 optionally provides a notification of the one or more unexpected outcomes identified in 510 to one or more other users of the system, for example a quality assurance professional.
  • processing unit 201 determines the identified security policy is the preferred security policy subject to failing to identify the one or more unexpected outcomes in 510.
  • The terms “comprises”, “comprising”, “includes”, “including”, “having” and their conjugates mean “including but not limited to”. These terms encompass the terms “consisting of” and “consisting essentially of”.
  • “Consisting essentially of” means that the composition or method may include additional ingredients and/or steps, but only if the additional ingredients and/or steps do not materially alter the basic and novel characteristics of the claimed composition or method.
  • As used herein, the term “a compound” or “at least one compound” may include a plurality of compounds, including mixtures thereof.
  • Description in a range format is merely for convenience and brevity and should not be construed as an inflexible limitation on the scope of embodiments. Accordingly, the description of a range should be considered to have specifically disclosed all the possible subranges as well as individual numerical values within that range. For example, description of a range such as from 1 to 6 should be considered to have specifically disclosed subranges such as from 1 to 3, from 1 to 4, from 1 to 5, from 2 to 4, from 2 to 6, from 3 to 6 etc., as well as individual numbers within that range, for example, 1, 2, 3, 4, 5, and 6. This applies regardless of the breadth of the range.
  • Whenever a numerical range is indicated herein, it is meant to include any cited numeral (fractional or integral) within the indicated range.
  • The phrases “ranging/ranges between” a first indicated number and a second indicated number and “ranging/ranges from” a first indicated number “to” a second indicated number are used herein interchangeably and are meant to include the first and second indicated numbers and all the fractional and integral numerals therebetween.

Abstract

A system for generating a security policy for a software program, comprising a processing unit configured for executing a sequence of policy building iterations. In each of the sequence of policy building iterations, the processing unit is configured for: producing monitoring data describing a plurality of accesses of the software program to a plurality of computer resources of the system by monitoring execution of the software program when the system is configured for executing an identified security policy which blocks access by the software program to a plurality of blocked computer resources; identifying in the plurality of accesses a blocked access to one of the plurality of blocked computer resources; computing a modified security policy to unblock access to the blocked computer resource; and instructing configuration of the system to execute the modified security policy as the identified security policy in a next iteration of the sequence of policy building iterations.

Description

GENERATION OF A SECURITY POLICY FOR A SOFTWARE PROGRAM
BACKGROUND
Some embodiments described in the present disclosure relate to controlling access to computer resources and, more specifically, but not exclusively, to generation of a security policy for a software program.
For brevity, unless explicitly noted otherwise henceforth the term “system” is used to mean a computerized system and the terms are used interchangeably.
In a system executing a software program, the software program may require access to one or more computer resources in order to operate correctly. A computer resource may be an area of memory or a file. Another example of a computer resource is a digital communication network resource, for example an identified port number or a software network endpoint such as a network socket. Some other examples of a computer resource include, but are not limited to, a service provided by an operating system, an inter-process communication access point, and a device connected to a hardware processor executing at least part of the software program.
Henceforth the term “resources” is used to mean “computer resources”, and the terms are used interchangeably.
There is an increasing need to protect systems from malicious attacks. To do so, some systems regulate access to one or more of the system’s computer resources, when accessed by one or more programs executed by the system. A set of rules and practices that govern access to the system’s computer resources is also known as a security policy.
There exist various types of security policies. In one example, by default a security policy blocks access by all software programs to all of the system’s computer resources, unless access to one or more resources is explicitly unblocked for one or more software programs. Conversely, in another example a security policy allows access by all software programs to all of the system’s resources, unless access to one or more computer resources is explicitly blocked for one or more software programs. In some security policies, some of the system’s resources are blocked by default and some other resources are unblocked by default, and an access to one or more computer resources are explicitly unblocked or blocked, respectively. In order for a system executing a software program to operate correctly, there is a need to configure the system to execute a security policy such that the software program has access to all computer resources needed for the software program’s operation. However, to reduce a likelihood of the software program causing damage to the system, for example due to malicious code injected into the software program, it is desirable that the system execute a security policy that additionally blocks the software program from accessing computer resources that are not required for the software program’s operation.
SUMMARY
The present disclosure describes systems and methods for generating a security policy for a software program using a sequence of policy building iterations.
In some implementations of such a system, execution of a software program by the system is monitored to identify one or more blocked accesses where access by the software program to one or more blocked resources of the system’s plurality of computer resources is blocked by the system executing an identified security policy. In such implementations, when a blocked access is identified, a modified security policy is computed to unblock access to the one or more blocked resources. In such implementations these steps are repeated in each of a sequence of policy building iterations, where the modified security policy of one iteration is used as the identified security policy of a next iteration of the plurality of policy building iterations.
The foregoing and other objects are achieved by the features of the independent claims. Further implementation forms are apparent from the dependent claims, the description and the figures.
According to a first aspect, a system for generating a security policy for a software program comprises a processing unit configured for in each of a sequence of policy building iterations: monitoring execution of a software program by the system where the system is configured for executing an identified security policy, thereby blocking access by the software program to a plurality of blocked computer resources, to produce monitoring data describing a plurality of accesses of the software program to a plurality of computer resources of the system; identifying in the plurality of accesses at least one blocked access by the software program to at least one of the plurality of blocked computer resources; computing a modified security policy to unblock access by the software program to the at least one blocked computer resource; and instructing configuration of the system to execute the modified security policy as the identified security policy in a next iteration of the sequence of policy building iterations. Modifying an identified security policy to unblock access by the software program to one or more blocked computer resources according to one or more blocked access identified when executing the software program by a system configured to execute the identified security policy increases accuracy of the modified security policy, reducing an amount of unblocked resources not required by the software program, and additionally or alternatively reducing an amount of blocked resources required by the software program.
According to a second aspect, a method for generating a security policy for a software program comprises in each of a sequence of policy building iterations: monitoring execution of a software program by a system where the system is configured for executing an identified security policy, thereby blocking access by the software program to a plurality of blocked computer resources, to produce monitoring data describing a plurality of accesses of the software program to a plurality of computer resources of the system; identifying in the plurality of accesses at least one blocked access by the software program to at least one of the plurality of blocked computer resources; computing a modified security policy to unblock access by the software program to the at least one blocked computer resource; and instructing configuration of the system to execute the modified security policy as the identified security policy in a next iteration of the sequence of policy building iterations.
According to a third aspect, a software program product for generating a security policy for a software program comprises: a non-transitory computer readable storage medium; first program instructions for monitoring execution of a software program by a system where the system is configured for executing an identified security policy, thereby blocking access by the software program to a plurality of blocked computer resources, to produce monitoring data describing a plurality of accesses of the software program to a plurality of computer resources of the system; second program instructions for identifying in the plurality of accesses at least one blocked access by the software program to at least one of the plurality of blocked computer resources; third program instructions for computing a modified security policy to unblock access by the software program to the at least one blocked computer resource; and fourth program instructions for instructing configuration of the system to execute the modified security policy as the identified security policy in a next iteration of the sequence of policy building iterations. The first, second, third, and fourth program instructions are executed by at least one computerized processor from the non-transitory computer readable storage medium.
In a possible implementation form of the apparatus according to the first aspect or the method according to the second aspect, at least one of the plurality of blocked computer resources is selected from the list of computer resources comprising at least one of: an area of a memory, a file, a device connected to the processing unit, an inter-process communication access point, a digital communication network resource, and a service provided by an operating system executed by the processing unit. Optionally, at least one of the plurality of blocked computer resources is selected from the group consisting of: an area of a memory, a file, a device connected to the processing unit, an inter-process communication access point, a digital communication network resource, and a service provided by an operating system executed by the processing unit. Optionally, in a first policy building iteration of the sequence of policy building iterations the plurality of blocked computer resources is the plurality of computer resources of the system. Blocking access to the plurality of computer resources of the system in the first policy building iteration increases accuracy of the preferred security policy, reducing an amount of unblocked resources not needed by the software program.
In a possible implementation form of the apparatus according to the first aspect or the method according to the second aspect, the processing unit is further configured for: determining the identified security policy is a preferred security policy subject to failing to identify the at least one blocked access; and providing the preferred security policy to at least one user of the system. Determining the identified security policy is a preferred security policy subject to failing to identify the at least one blocked access increases accuracy of the preferred security policy, reducing an amount of unblocked resources not needed by the software program. Optionally, providing the preferred security policy comprises at least one of: storing the preferred security policy in a non-volatile digital storage connected to the processing unit; adding a log entry to one or more of: a log file, a log of the system; sending a message to at least one other processing unit via a digital communication network interface connected to the processing unit; and displaying a message on a display device connected to the processing unit.
In a possible implementation form of the apparatus according to the first aspect or the method according to the second aspect, monitoring execution of the software program comprises at least one of: monitoring a log of the system, executing a command, and capturing digital communication network traffic. In a possible implementation form of the apparatus according to the first aspect or the method according to the second aspect, the identified security policy comprises a plurality of access entries. Optionally, computing the modified security policy comprises: computing at least one new access entry to allow access by the software program to the at least one blocked computer resource; and adding the at least one new access entry to the identified security policy. Optionally, computing the modified security policy comprises modifying at least one of the plurality of access entries to allow access by the software program to the at least one blocked computer resource. Optionally, computing the modified security policy comprises deleting from the identified security policy at least one of the plurality of access entries. Optionally, identifying the at least one blocked access comprises: extracting from the monitoring data a plurality of access violation entries; and identifying in the plurality of access violation entries at least one access violation entry indicative of the at least one blocked access. Optionally, computing the modified security policy comprises: extracting at least one access value from the at least one access violation entry; and computing the modified security policy using the at least one access value. Using at least one access value extracted from the monitoring data increases accuracy of the modified security policy.
In a possible implementation form of the apparatus according to the first aspect or the method according to the second aspect, execution of the software program comprises providing the software program with an identified set of input values. Optionally, execution of the software program comprises executing the software program in a software testing environment. Optionally, the processing unit is further configured for: identifying in the monitoring data at least one unexpected outcome of executing the software program; and providing a notification of the at least one unexpected outcome to another user of the system subject to failing to identify the at least one blocked access or failing to identify an association between the at least one blocked access and the at least one unexpected outcome. Providing the software program with an identified set of input values, and additionally or alternatively executing the software program in a software testing environment, allows identifying one or more unexpected outcomes when executing the software program and notifying a software developer of the software program thereof, facilitating increasing accuracy of the software program.
Other systems, methods, features, and advantages of the present disclosure will be or become apparent to one with skill in the art upon examination of the following drawings and detailed description. It is intended that all such additional systems, methods, features, and advantages be included within this description, be within the scope of the present disclosure, and be protected by the accompanying claims.
Unless otherwise defined, all technical and/or scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which embodiments pertain. Although methods and materials similar or equivalent to those described herein can be used in the practice or testing of embodiments, exemplary methods and/or materials are described below. In case of conflict, the patent specification, including definitions, will control. In addition, the materials, methods, and examples are illustrative only and are not intended to be necessarily limiting.
BRIEF DESCRIPTION OF THE DRAWINGS
The present invention will now be described by way of example with reference to the accompanying drawings. In the drawings:
FIG. 1 is a flowchart schematically representing an optional flow of operations;
FIG. 2 is a schematic block diagram of an exemplary system;
FIG. 3 is a flowchart schematically representing an optional flow of operations for generating a security policy;
FIG. 4 is a flowchart schematically representing an optional flow of operations for identifying a blocked access; and
FIG. 5 is a flowchart schematically representing an additional optional flow of operations for generating a security policy.
DETAILED DESCRIPTION OF THE DRAWINGS
In the following description, reference is made to the accompanying drawings, which form part of the disclosure, and which show, by way of illustration, specific aspects of embodiments of the invention or specific aspects in which embodiments of the present invention may be used. It is understood that embodiments of the invention may be used in other aspects and comprise structural or logical changes not depicted in the figures. The following detailed description, therefore, is not to be taken in a limiting sense, and the scope of the present invention is defined by the appended claims.
For instance, it is understood that a disclosure in connection with a described method may also hold true for a corresponding apparatus or system configured to perform the method and vice versa. For example, if one or a plurality of specific method steps are described, a corresponding device may include one or a plurality of units, e.g. functional units, to perform the described one or plurality of method steps (e.g. one unit performing the one or plurality of steps, or a plurality of units each performing one or more of the plurality of steps), even if such one or more units are not explicitly described or illustrated in the figures. On the other hand, for example, if a specific apparatus is described based on one or a plurality of units, e.g. functional units, a corresponding method may include one step to perform the functionality of the one or plurality of units (e.g. one step performing the functionality of the one or plurality of units, or a plurality of steps each performing the functionality of one or more of the plurality of units), even if such one or plurality of steps are not explicitly described or illustrated in the figures. Further, it is understood that the features of the various exemplary embodiments and/or aspects described herein may be combined with each other, unless specifically noted otherwise.
As described above, there is a need to configure a system to execute a security policy such that a software program executed by the system has access to all computer resources needed for the software program’s operation, while access by the software program to other computer resources is blocked.
In software development it is recommended to design security as part of the development process of a software program; however, in practice this is frequently not done. Without automatic processes for generating a security policy during software development, a security expert typically specifies a security policy for the software program manually, after development is completed. Even when working in cooperation with a software developer, such a manually specified security policy is error-prone and tends to be too lenient, allowing access to a greater amount of computer resources than is actually needed for correct operation of the software program. This leniency is often a result of the difficulty of achieving fine-grained identification of the computer resources needed by the software program. For example, a software program may require access to an identified amount of ports of a digital communication network. For a small amount of ports, a manually generated security policy may be accurate and comprise explicit permission for the software program to access each of the ports. However, as these ports may change over time as the software program is developed, and when the amount of ports exceeds some threshold, a manually generated security policy may comprise permission for the software program to access a range of ports, some of which are not necessary for the operation of the software program, thus increasing a security risk in a system configured to execute the manually generated security policy. This is an example of an inaccurate security policy.
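As a non-limiting illustration of such leniency (a sketch only, using standard iptables syntax; the port numbers are hypothetical), a lenient manually written rule might open an entire range of ports, for example
    iptables -A INPUT -p tcp --dport 8000:8100 -j ACCEPT
whereas a fine-grained policy would contain explicit permissions only for the ports actually used by the software program, for example
    iptables -A INPUT -p tcp --dport 8080 -j ACCEPT
    iptables -A INPUT -p tcp --dport 8081 -j ACCEPT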
There exist methods for reducing security risks in software containers, for example in Linux containers, however such methods apply to containers and do not apply to software programs that are executed as native programs by a processing circuitry. In addition, such methods do not block software programs executing within a container; according to such methods a software program executing in a container is allowed to access any computer resource within the boundaries of its container. Such access may be broader than is needed for correct operation of the software program, creating a potential security risk.
There is a need to increase accuracy of a security policy for a software program, reducing an amount of unblocked computer resources not used by the software program, while reducing an amount of blocked computer resources needed for correct operation of the software program.
To increase accuracy of a security policy for a software program, some embodiments described in the present application propose an iterative method of automatically generating a security policy. In such embodiments, a system is configured for executing an identified security policy blocking access by a software program to a plurality of blocked computer resources, and execution of the software program by the system is monitored to produce data describing a plurality of accesses of the software program to a plurality of computer resources of the system. Monitoring execution of the software program by the system comprises for example monitoring of a log of the system, for example a Security-Enhanced Linux (SELinux) audit-log, for example for a virtual machine executing a flavor of the Linux operating system. Optionally, monitoring execution of the software program comprises executing a command, for example for the purpose of collecting system information. The present disclosure proposes, in some embodiments, identifying in the plurality of accesses at least one blocked access by the software program to one or more of the plurality of blocked resources, and computing a modified security policy to unblock access by the software program to the one or more blocked resources. Computing the modified security policy optionally comprises modifying the identified security policy to unblock access by the software program to the one or more blocked resources. Executing the software program in a system configured to execute a security policy blocking access by the software program to a plurality of blocked computer resources allows identifying one or more blocked computer resources required by the software program while access to other blocked computer resources of the plurality of blocked computer resources remains blocked. Computing the modified security policy to unblock access by the software program to the one or more blocked computer resources allows unblocking access to the one or more blocked resources that the software program accessed while still blocking access by the software program to the other blocked computer resources. Modifying a security policy to block access by the software program to the other blocked computer resources while allowing access to the one or more blocked resources the software program accessed increases accuracy of the modified security policy, thus increasing security of a system executing the modified security policy.
In addition, in some embodiments the present disclosure proposes repeating the steps described above iteratively, such that in each of a sequence of policy building iterations the system is configured to execute the modified security policy as the identified security policy in a next iteration of the sequence of policy building iterations. Configuring the system to execute the modified security policy as the identified security policy in a next iteration facilitates an incremental generation of a security policy for the software program, as in the next iteration access to the one or more blocked resources is allowed and thus access to the one or more blocked resources will not be identified as a blocked access. Such incremental generation of the security policy increases accuracy of the security policy as the incremental generation allows fine-grained identification of the blocked resources to which the software program needs access, compared to a manual examination of the code or relying on communication between a software developer and a security expert to identify blocked resources required by the software program. Fine-grained identification of the blocked resources to which the software program needs access increases accuracy of a modified security policy, increasing an amount of blocked resources that the software program needs access to and are unblocked in the modified security policy, while reducing an amount of other blocked resources that the software program did not try to access and are unblocked in the modified security policy.
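The following sketch illustrates one possible realization of such a sequence of policy building iterations on an SELinux-based Linux system. It is illustrative only; it assumes that the auditd, audit2allow and semodule tools are available, and the test harness run_program_tests.sh, the file names denials.txt and all_denials.txt, and the module name generated_policy are hypothetical.
    while true; do
        run_program_tests.sh                                # execute the software program with an identified set of input values
        ausearch -m avc -ts recent > denials.txt            # collect monitoring data describing blocked accesses
        if ! grep -q "denied" denials.txt; then
            break                                           # no blocked access identified: the current policy is the preferred policy
        fi
        cat denials.txt >> all_denials.txt                  # accumulate access violation entries over the iterations
        audit2allow -i all_denials.txt -M generated_policy  # compute access entries unblocking the violated accesses
        semodule -i generated_policy.pp                     # the modified policy becomes the identified policy of the next iteration
    done
In this sketch each pass of the loop corresponds to one policy building iteration, and the loop terminates when the monitoring data contains no further denials.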
Optionally, in a first iteration of the sequence of policy building iterations, the plurality of blocked resources is the plurality of computer resources of the system, such that the identified security policy blocks access to the plurality of computer resources of the system. Starting with a security policy that blocks access to the plurality of computer resources of the system increases accuracy of a modified security policy as this reduces a likelihood of the modified security policy unblocking access to a blocked resource not needed by the software program.
Optionally, when no blocked access is identified in the plurality of accesses of an iteration, the identified security policy used in the iteration is determined to be a preferred security policy, and may be provided to one or more users of the system, for example a security expert of the system or a software developer of the software program. To provide the preferred security policy to the one or more users the preferred security policy may be stored in a non-volatile digital storage of the system. Additionally, or alternatively, providing the preferred security policy comprises adding a log entry, for example to a log of the system or to a log file. Other optional methods of providing the preferred security policy include sending a message via a digital communication network, and displaying a message on a display device of the system. Some examples of a message are an electronic-mail (email) message and a message on an instant messaging service, some examples being Slack and Signal. A preferred security policy may be added to a software product of the software program.
In addition, in some embodiments the present disclosure proposes identifying in the monitoring data one or more unexpected outcomes of executing the software program. Such an unexpected outcome may be unrelated to the plurality of accesses, that is, the unexpected outcome cannot be explained by a blocked access to a blocked resource, or it occurs when there are no blocked accesses. Such an unexpected outcome may indicate an error in implementation of the software program (colloquially known as a bug). Optionally, in such embodiments a notification of the one or more unexpected outcomes is provided to another user of the system, for example a quality assurance professional.
Before explaining at least one embodiment in detail, it is to be understood that embodiments are not necessarily limited in their application to the details of construction and the arrangement of the components and/or methods set forth in the following description and/or illustrated in the drawings and/or the Examples. Implementations described herein are capable of other embodiments or of being practiced or carried out in various ways.
Embodiments may be a system, a method, and/or a computer program product. The computer program product may include a computer readable storage medium (or media) having computer readable program instructions thereon for causing a processor to carry out aspects of the embodiments.
The computer readable storage medium can be a tangible device that can retain and store instructions for use by an instruction execution device. The computer readable storage medium may be, for example, but is not limited to, an electronic storage device, a magnetic storage device, an optical storage device, an electromagnetic storage device, a semiconductor storage device, or any suitable combination of the foregoing. A non-exhaustive list of more specific examples of the computer readable storage medium includes the following: a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or Flash memory), a static random access memory (SRAM), a portable compact disc read-only memory (CD-ROM), a digital versatile disk (DVD), a memory stick, a floppy disk, and any suitable combination of the foregoing. A computer readable storage medium, as used herein, is not to be construed as being transitory signals per se, such as radio waves or other freely propagating electromagnetic waves, electromagnetic waves propagating through a waveguide or other transmission media (e.g., light pulses passing through a fiber-optic cable), or electrical signals transmitted through a wire.
Computer readable program instructions described herein can be downloaded to respective computing/processing devices from a computer readable storage medium or to an external computer or external storage device via a network, for example, the Internet, a local area network, a wide area network and/or a wireless network. The network may comprise copper transmission cables, optical transmission fibers, wireless transmission, routers, firewalls, switches, gateway computers and/or edge servers. A network adapter card or network interface in each computing/processing device receives computer readable program instructions from the network and forwards the computer readable program instructions for storage in a computer readable storage medium within the respective computing/processing device.
Computer readable program instructions for carrying out operations of embodiments may be assembler instructions, instruction-set-architecture (ISA) instructions, machine instructions, machine dependent instructions, microcode, firmware instructions, state-setting data, or either source code or object code, natively compiled or compiled just-in-time (JIT), written in any combination of one or more programming languages, including an object oriented programming language such as Smalltalk, C++, Java, Object-Oriented Fortran or the like, an interpreted programming language such as JavaScript, Python or the like, and conventional procedural programming languages, such as the "C" programming language, Fortran, or similar programming languages. The computer readable program instructions may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer or entirely on the remote computer or server. In the latter scenario, the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet Service Provider). In some embodiments, electronic circuitry including, for example, programmable logic circuitry, field-programmable gate arrays (FPGA), or programmable logic arrays (PLA) may execute the computer readable program instructions by utilizing state information of the computer readable program instructions to personalize the electronic circuitry, in order to perform aspects of embodiments.
Aspects of embodiments are described herein with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer readable program instructions.
These computer readable program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks. These computer readable program instructions may also be stored in a computer readable storage medium that can direct a computer, a programmable data processing apparatus, and/or other devices to function in a particular manner, such that the computer readable storage medium having instructions stored therein comprises an article of manufacture including instructions which implement aspects of the function/act specified in the flowchart and/or block diagram block or blocks.
The computer readable program instructions may also be loaded onto a computer, other programmable data processing apparatus, or other device to cause a series of operational steps to be performed on the computer, other programmable apparatus or other device to produce a computer implemented process, such that the instructions which execute on the computer, other programmable apparatus, or other device implement the functions/acts specified in the flowchart and/or block diagram block or blocks.
The flowchart and block diagrams in the Figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods, and computer program products according to various embodiments. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of instructions, which comprises one or more executable instructions for implementing the specified logical function(s). In some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems that perform the specified functions or acts or carry out combinations of special purpose hardware and computer instructions.
Reference is now made to FIG. 1, showing a flowchart schematically representing an optional flow of operations 100, according to some embodiments. In such embodiments, in 101 a system is configured to execute an identified security policy, governing access to a plurality of computer resources of the system. Optionally, the identified security policy allows a software program access to a plurality of unblocked resources of the system’s plurality of computer resources. Optionally, the identified security policy blocks access by the software program to a plurality of blocked resources of the system’s plurality of computer resources. When the system executes the software program, in 102 the software program accesses some of the system’s plurality of computer resources. For example, in 103 the software program may access one or more of the plurality of unblocked resources, to which access is allowed (i.e., unblocked). Optionally, in 104 access by the software program to one or more blocked resources of the plurality of blocked resources is blocked.
Optionally, execution of the software program by the system is monitored to produce in 111 monitoring data describing a plurality of accesses by the software program to the plurality of computer resources, for example in 102, 103 and 104. In 112, the one or more accesses of 104 are identified, i.e. one or more blocked accesses to the one or more blocked resources. Optionally, in 113 at least some of the one or more blocked resources are added to the plurality of unblocked resources and in 114 a modified security policy is computed allowing the software program access to the one or more blocked resources. Optionally, the modified security policy is a most restrictive security policy allowing the software program access to the plurality of unblocked resources including the one or more blocked resources added thereto. Optionally, method 100 is executed in each of a sequence of policy building iterations. Thus, in 115 the modified security policy is used as the identified security policy, such that when 101 is executed in a next iteration of the sequence of iterations the system is configured to execute the modified security policy.
Reference is now made also to FIG. 2, showing a schematic block diagram of an exemplary system 200, according to some embodiments. In such embodiments, system 200 comprises a processing unit 201. Processing unit 201 may be any kind of programmable or non-programmable circuitry that is configured to carry out the operations described in the present disclosure. The processing unit may comprise hardware as well as software. For example, the processing unit may comprise one or more processors and a transitory or non-transitory memory that carries a program which causes the processing unit to perform the respective operations when the program is executed by the one or more processors. Optionally, processing unit 201 comprises memory 206. Optionally, processing unit 201 is connected to memory 206. Optionally, memory 206 is the memory that carries a program executed by processing unit 201. Optionally, memory 206 is a memory remote to processing unit 201, connected to another processing unit.
Optionally, processing unit 201 is connected to one or more non-volatile digital storage 203. Some examples of a non-volatile digital storage are a hard disk drive, a solid state drive, a network-connected storage and a network storage. Optionally, processing unit 201 is connected to one or more digital communication network interface 202. For brevity, henceforth the term “network interface” is used to mean “one or more digital communication network interface”. Optionally, network interface 202 is connected to a local area network (LAN), for example an Ethernet network or a Wi-Fi network. Optionally, network interface 202 is connected to a wide area network (WAN), for example a cellular network or the Internet. Optionally, digital storage 203 is connected to processing unit 201 via network interface 202.
Optionally, processing unit 201 is connected to one or more display devices 204, for example a monitor or a flat-panel display. Optionally, processing unit 201 is connected to one or more devices 205. Optionally, one or more devices 205 comprise one or more display devices 204. Optionally, one or more devices 205 comprise network interface 202. Optionally, one or more devices 205 comprise one or more digital storage 203. Some other examples of a device are a camera, a microphone, a speaker, a touchscreen, and a sensor.
To generate a security policy for a software program, in some embodiments system 200 executes the following optional method.
Reference is now made also to FIG. 3, showing a flowchart schematically representing an optional flow of operations 300 for generating a security policy for a software program, according to some embodiments. In such embodiments, processing unit 201 executes a sequence of policy building iterations. Optionally, in each of the sequence of policy building iterations, in 301 processing unit 201 configures system 200 to execute an identified security policy thereby blocking access by the software program to a plurality of blocked computer resources. Optionally, the plurality of blocked computer resources is at least some of a plurality of computer resources of the system. A computer resource may be an area of memory 206, optionally identified by an address or by a range of addresses. Optionally, a computer resource is a file, for example a file stored on digital storage 203. Optionally, a computer resource is one of one or more devices 205. Optionally, the device is identified by a handle, for example a file descriptor or an identifier. Optionally the device is network interface 202. Optionally the device is one or more digital storage 203. Another example of a computer resource is a digital communication network resource, for example a port number, a network address of another processing unit or a network socket. Some other examples of a computer resource include, but are not limited to, an inter-process communication access point executed by processing unit 201, and a service provided by an operating system executed by processing unit 201. The service may be a service that requires privileges, for example a service that creates or deletes a process executed by processing unit 201.
Optionally, the identified security policy comprises a plurality of access entries, for example a plurality of access rules each blocking or unblocking access to at least some of the plurality of resources of the system. Another example of an access entry is an iptables chain. Optionally, in a first iteration of the sequence of policy building iterations, the plurality of blocked computer resources is the plurality of computer resources of the system. For example, when configuring the system comprises executing iptables, in the first iteration processing unit 201 may execute the command “iptables -drop-all”. When configuring the system comprises executing a policy based on one or more allow rules each explicitly allowing access to one or more computer resources, for example SELinux, in the first iteration the policy may be an empty policy, having no access entries, or access rules. Optionally, in 310 processing unit 201 further executes the software program. Optionally, the software program is executed in one or more known scenarios such that executing the software program comprises providing the software program with an identified set of input values. The identified set of input values optionally includes one or more of: a configuration value, a user input value, a provided file, a message sent to the software program, and a graphical user interface interaction. Optionally, execution of the software program comprises executing the software program in a software testing environment, for example within an integrated development environment (IDE) or within a testing platform.
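By way of a non-limiting sketch, a drop-everything starting point for the first policy building iteration as described above may, in an iptables-based implementation, be expressed by setting the default policy of each built-in chain to DROP (exact command syntax may vary between iptables versions):
    iptables -P INPUT DROP      # block all incoming traffic by default
    iptables -P OUTPUT DROP     # block all outgoing traffic by default
    iptables -P FORWARD DROP    # block all forwarded traffic by default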
Optionally, in 320 processing unit 201 monitors execution of the software program by system 200, producing monitoring data describing a plurality of accesses of the software program to the plurality of computer resources of the system. Optionally, in 320 processing unit 201 monitors a log of system 200, for example an SELinux audit-log. When system 200 implements SELinux, processing unit 201 may configure SELinux to log violations to an identified violation-log file, monitored by processing unit 201 in 320. Optionally, in 320 processing unit 201 executes a command, for example a command to collect system data. For example, when processing unit 201 executes a Linux operating system, in 320 processing unit 201 may execute the command dmesg. In another example, when executing 301 comprises executing the command “iptables”, processing unit 201 may create a new iptables chain that logs and drops all traffic, and configure iptables to forward to the new iptables chain all network traffic flows unmatched by other rules configured in iptables. Optionally, the new iptables chain logs to an identified network log. Such a configuration facilitates identifying all network flows not explicitly addressed elsewhere in iptables and additionally blocking those flows by dropping them.
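A possible realization of such a log-and-drop chain is the following sketch, in which the chain name CATCH_ALL and the log prefix are hypothetical:
    iptables -N CATCH_ALL                                   # create the new chain
    iptables -A CATCH_ALL -j LOG --log-prefix "blocked: "   # add an entry to the identified network log
    iptables -A CATCH_ALL -j DROP                           # drop the unmatched traffic
    iptables -A INPUT -j CATCH_ALL                          # send traffic unmatched by earlier rules to the chain
Because the rule appending CATCH_ALL to the INPUT chain is evaluated last, only traffic that no earlier rule has matched reaches the chain, where it is logged and then dropped.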
In 330, processing unit 201 optionally identifies in the plurality of accesses one or more blocked accesses by the software program to one or more of the plurality of blocked resources.
Reference is now made also to FIG. 4, showing a flowchart schematically representing an optional flow of operations 400 for identifying a blocked access, according to some embodiments. In such embodiments, in 401 processing unit 201 extracts a plurality of access violation entries from the monitoring data. For example, when system 200 implements SELinux, processing unit 201 may provide the identified violation-log file to the audit2allow utility. An output of executing the audit2allow utility on the identified violation-log file is optionally an SELinux policy rule. An access violation entry is optionally an SELinux policy rule produced by executing audit2allow. In another example, when system 200 executes iptables, in 401 processing unit 201 optionally analyzes the network log to identify one or more security violations. Optionally, in 402 processing unit 201 identifies in the plurality of access violation entries one or more access violation entries indicative of the one or more blocked accesses.
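As a non-limiting sketch (the file name violation.log is hypothetical), the violation-log file may be provided to the audit2allow utility as follows:
    audit2allow -i violation.log    # print SELinux policy rules derived from the logged denials
and a resulting policy rule may take the general form
    allow myprog_t some_file_t:file { read open };
where the type names myprog_t and some_file_t stand for the domain of the software program and the type of the accessed file, respectively, and are hypothetical.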
Reference is now made again to FIG. 3. In 340 processing unit 201 optionally determines whether one or more blocked accesses were identified in 330. When one or more blocked accesses are identified, in 341 processing unit 201 optionally computes a modified security policy.
Reference is now made again to FIG. 4. Optionally, in 410 processing unit 201 extracts one or more access values from the one or more violation entries identified in 402. Some examples of an access value are a port number, a network address, a file name and a file descriptor name. In 412, processing unit 201 optionally computes the modified security policy using the one or more access values. For example, processing unit 201 may use the one or more access values to identify one or more blocking rules in the identified security policy, blocking access by the software program to the one or more blocked resources. Optionally, when computing the modified security policy, processing unit 201 deletes from the identified security policy one or more of the plurality of access entries, for example an access entry that blocks access to the one or more blocked resources. Optionally, processing unit 201 identifies the one or more access entries according to the one or more access values. In another example, when computing the modified security policy processing unit 201 may modify one or more of the plurality of access entries that block access to the one or more blocked resources.
Optionally, when computing the modified security policy, processing unit 201 computes one or more new access entries to allow access by the software program to the one or more blocked resources. Optionally, processing unit 201 uses the one or more access values to compute the one or more new access entries. For example, when system 200 implements SELinux, the one or more new access entries may be one or more new SELinux rules, for example one or more rules generated by audit2allow. Optionally, the one or more new access entries are one or more rows in a table, for example in iptables. Optionally, a new access entry is a blacklist entry, blocking access to one or more computer resources. Optionally, a new access entry is a whitelist entry, allowing access to one or more computer resources. Optionally, the one or more new access entries comprise a combination of whitelist and blacklist entries to allow access by the software program to the one or more blocked resources. For example, when the identified security policy comprises a blacklist rule blocking access to a range of memory addresses and the one or more blocked resources comprises an identified memory address in the range of memory addresses, processing unit 201 may modify the blacklist rule to block access to addresses in the range of memory addresses less than the identified memory address, and additionally processing unit 201 may add a new rule to block access to addresses in the range of memory addresses greater than the identified memory address. Alternatively, in this example processing unit 201 may delete the blacklist rule and add two new rules, one blocking access to addresses in the range of memory addresses less than the identified memory address and the other blocking access to addresses in the range of memory addresses greater than the identified memory address.
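As a further non-limiting sketch in an iptables-based implementation (the port numbers are hypothetical), suppose the identified security policy contains a blacklist entry blocking a range of ports and the one or more blocked resources comprise port 8080 within that range; a new whitelist entry may then be inserted ahead of the existing blocking rule so that only the accessed port is unblocked:
    iptables -I INPUT -p tcp --dport 8080 -j ACCEPT    # new whitelist entry for the blocked resource
Because the -I option inserts the new rule at the head of the chain, the new whitelist entry is evaluated before the pre-existing blacklist entry, which remains in place and continues to block the other ports in the range.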
Optionally, processing unit 201 adds the one or more new access entries to the identified security policy.
Reference is now made again to FIG. 3. In 344, processing unit 201 optionally instructs configuration of system 200 to execute the modified security policy as the identified security policy in a next iteration of the sequence of policy building iterations.
Optionally, in 340 processing unit 201 determines a failure to identify the one or more blocked accesses in 330. Reference is now made to FIG. 5, showing a flowchart schematically representing an additional optional flow of operations 500 for generating a security policy, according to some embodiments. In such embodiments, subject to failing to identify the one or more blocked accesses, in 501 processing unit 201 determines that the identified security policy is a preferred security policy. Optionally, an iteration in which processing unit 201 determines the preferred security policy is a last iteration of the sequence of policy building iterations. Optionally, in 502 processing unit 201 provides the preferred security policy to one or more users of the system, for example a security professional or a software developer. Optionally, processing unit 201 provides the preferred security policy by saving thereof to one or more digital storage 203. Optionally, processing unit 201 provides the preferred security policy by sending a message to at least one other processing unit via network interface 202. Some examples of a message sent via a network interface are an email message and a message in an instant messaging service. Optionally, processing unit 201 provides the preferred security policy by displaying a message on one or more display device 204. Optionally, processing unit 201 provides the preferred security policy by adding a log entry to a log file or a log of system 200. Optionally, the log file is a log file of a policy generation process implementing method 100.
Optionally, in 510 processing unit 201 identifies one or more unexpected outcomes of executing the software program, for example when the software program is provided with an identified set of input values. Optionally the one or more unexpected outcomes are identified when the software program is executed in a software testing environment. Optionally, processing unit 201 identifies the one or more unexpected outcomes when no blocked accesses are identified in 330. Optionally, processing unit 201 fails to identify an association between the one or more unexpected outcomes and the one or more blocked accesses identified in 330. Subject to failing to identify the one or more blocked accesses or failing to identify the association, in 511 processing unit 201 optionally provides a notification of the one or more unexpected outcomes identified in 510 to one or more other users of the system, for example a quality assurance professional. Optionally, processing unit 201 determines the identified security policy is the preferred security policy subject to failing to identify the one or more unexpected outcomes in 510.
The descriptions of the various embodiments have been presented for purposes of illustration, but are not intended to be exhaustive or limited to the embodiments disclosed. Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the described embodiments. The terminology used herein was chosen to best explain the principles of the embodiments, the practical application or technical improvement over technologies found in the marketplace, or to enable others of ordinary skill in the art to understand the embodiments disclosed herein.
It is expected that during the life of a patent maturing from this application many relevant monitoring data and computer resources will be developed and the scope of the terms “monitoring data” and “computer resource” is intended to include all such new technologies a priori.
As used herein the term “about” refers to ± 10 %.
The terms "comprises", "comprising", "includes", "including", “having” and their conjugates mean "including but not limited to". This term encompasses the terms "consisting of and "consisting essentially of'. The phrase "consisting essentially of' means that the composition or method may include additional ingredients and/or steps, but only if the additional ingredients and/or steps do not materially alter the basic and novel characteristics of the claimed composition or method.
As used herein, the singular form "a", "an" and "the" include plural references unless the context clearly dictates otherwise. For example, the term "a compound" or "at least one compound" may include a plurality of compounds, including mixtures thereof.
The word “exemplary” is used herein to mean “serving as an example, instance or illustration”. Any embodiment described as “exemplary” is not necessarily to be construed as preferred or advantageous over other embodiments and/or to exclude the incorporation of features from other embodiments.
The word “optionally” is used herein to mean “is provided in some embodiments and not provided in other embodiments”. Any particular embodiment may include a plurality of “optional” features unless such features conflict.
Throughout this application, various embodiments may be presented in a range format. It should be understood that the description in range format is merely for convenience and brevity and should not be construed as an inflexible limitation on the scope of embodiments. Accordingly, the description of a range should be considered to have specifically disclosed all the possible subranges as well as individual numerical values within that range. For example, description of a range such as from 1 to 6 should be considered to have specifically disclosed subranges such as from 1 to 3, from 1 to 4, from 1 to 5, from 2 to 4, from 2 to 6, from 3 to 6 etc., as well as individual numbers within that range, for example, 1, 2, 3, 4, 5, and 6. This applies regardless of the breadth of the range.
Whenever a numerical range is indicated herein, it is meant to include any cited numeral (fractional or integral) within the indicated range. The phrases “ranging/ranges between” a first indicated number and a second indicated number and “ranging/ranges from” a first indicated number “to” a second indicated number are used herein interchangeably and are meant to include the first and second indicated numbers and all the fractional and integral numerals therebetween.
It is appreciated that certain features of embodiments, which are, for clarity, described in the context of separate embodiments, may also be provided in combination in a single embodiment. Conversely, various features of embodiments, which are, for brevity, described in the context of a single embodiment, may also be provided separately or in any suitable subcombination or as suitable in any other described embodiment. Certain features described in the context of various embodiments are not to be considered essential features of those embodiments, unless the embodiment is inoperative without those elements.
Although embodiments have been described in conjunction with specific embodiments thereof, it is evident that many alternatives, modifications and variations will be apparent to those skilled in the art. Accordingly, it is intended to embrace all such alternatives, modifications and variations that fall within the spirit and broad scope of the appended claims. It is the intent of the applicant(s) that all publications, patents and patent applications referred to in this specification are to be incorporated in their entirety by reference into the specification, as if each individual publication, patent or patent application was specifically and individually noted when referenced that it is to be incorporated herein by reference. In addition, citation or identification of any reference in this application shall not be construed as an admission that such reference is available as prior art to the present disclosure. To the extent that section headings are used, they should not be construed as necessarily limiting. In addition, any priority document(s) of this application is/are hereby incorporated herein by reference in its/their entirety.

Claims

CLAIMS:
1. A system for generating a security policy for a software program, comprising a processing unit configured for in each of a sequence of policy building iterations: monitoring execution of a software program by the system where the system is configured for executing an identified security policy, thereby blocking access by the software program to a plurality of blocked computer resources, to produce monitoring data describing a plurality of accesses of the software program to a plurality of computer resources of the system; identifying in the plurality of accesses at least one blocked access by the software program to at least one of the plurality of blocked computer resources; computing a modified security policy to unblock access by the software program to the at least one blocked computer resource; and instructing configuration of the system to execute the modified security policy as the identified security policy in a next iteration of the sequence of policy building iterations.
2. The system of claim 1, wherein at least one of the plurality of blocked computer resources is selected from the list of computer resources comprising at least one of: an area of a memory, a file, a device connected to the processing unit, an inter-process communication access point, a digital communication network resource, and a service provided by an operating system executed by the processing unit.
3. The system of any of claims 1 and 2, wherein in a first policy building iteration of the sequence of policy building iterations the plurality of blocked computer resources is the plurality of computer resources of the system.
4. The system of any of claims 1-3, wherein the processing unit is further configured for: determining the identified security policy is a preferred security policy subject to failing to identify the at least one blocked access; and providing the preferred security policy to at least one user of the system.
5. The system of claim 4, wherein providing the preferred security policy comprises at least one of: storing the preferred security policy in a non-volatile digital storage connected to the processing unit; adding a log entry to one or more of: a log file, a log of the system; sending a message to at least one other processing unit via a digital communication network interface connected to the processing unit; and displaying a message on a display device connected to the processing unit.
6. The system of any of claims 1-5, wherein monitoring execution of the software program comprises at least one of: monitoring a log of the system, executing a command, and capturing digital communication network traffic.
7. The system of any of claims 1-6, wherein the identified security policy comprises a plurality of access entries; and wherein computing the modified security policy comprises: computing at least one new access entry to allow access by the software program to the at least one blocked computer resource; and adding the at least one new access entry to the identified security policy.
8. The system of any of claims 1-6, wherein the identified security policy comprises a plurality of access entries; and wherein computing the modified security policy comprises: modifying at least one of the plurality of access entries to allow access by the software program to the at least one blocked computer resource.
9. The system of any of claims 1-6, wherein the identified security policy comprises a plurality of access entries; and wherein computing the modified security policy comprises: deleting from the identified security policy at least one of the plurality of access entries.
10. The system of any of claims 1-9, wherein identifying the at least one blocked access comprises: extracting from the monitoring data a plurality of access violation entries; and identifying in the plurality of access violation entries at least one access violation entry indicative of the at least one blocked access; and wherein computing the modified security policy comprises: extracting at least one access value from the at least one access violation entry; and computing the modified security policy using the at least one access value.
11. The system of any of claims 1-10, wherein execution of the software program comprises providing the software program with an identified set of input values.
12. The system of any of claims 1-11, wherein execution of the software program comprises executing the software program in a software testing environment.
13. The system of any of claims 1-12, wherein the processing unit is further configured for: identifying in the monitoring data at least one unexpected outcome of executing the software program; and providing a notification of the at least one unexpected outcome to another user of the system subject to failing to identify the at least one blocked access or failing to identify an association between the at least one blocked access and the at least one unexpected outcome.
14. A method for generating a security policy for a software program, comprising in each of a sequence of policy building iterations: monitoring execution of a software program by a system where the system is configured for executing an identified security policy, thereby blocking access by the software program to a plurality of blocked computer resources, to produce monitoring data describing a plurality of accesses of the software program to a plurality of computer resources of the system; identifying in the plurality of accesses at least one blocked access by the software program to at least one of the plurality of blocked computer resources; computing a modified security policy to unblock access by the software program to the at least one blocked computer resource; and instructing configuration of the system to execute the modified security policy as the identified security policy in a next iteration of the sequence of policy building iterations.
15. A software program product for generating a security policy for a software program, comprising: a non-transitory computer readable storage medium; first program instructions for monitoring execution of a software program by a system where the system is configured for executing an identified security policy, thereby blocking access by the software program to a plurality of blocked computer resources, to produce monitoring data describing a plurality of accesses of the software program to a plurality of computer resources of the system; second program instructions for identifying in the plurality of accesses at least one blocked access by the software program to at least one of the plurality of blocked computer resources; third program instructions for computing a modified security policy to unblock access by the software program to the at least one blocked computer resource; and fourth program instructions for instructing configuration of the system to execute the modified security policy as the identified security policy in a next iteration of the sequence of policy building iterations; wherein the first, second, third, and fourth program instructions are executed by at least one computerized processor from the non-transitory computer readable storage medium.
PCT/EP2021/070667 2021-07-23 2021-07-23 Generation of a security policy for a software program WO2023001380A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202180100566.XA CN117730322A (en) 2021-07-23 2021-07-23 Generation of security policies for software programs
PCT/EP2021/070667 WO2023001380A1 (en) 2021-07-23 2021-07-23 Generation of a security policy for a software program

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/EP2021/070667 WO2023001380A1 (en) 2021-07-23 2021-07-23 Generation of a security policy for a software program

Publications (1)

Publication Number Publication Date
WO2023001380A1 true WO2023001380A1 (en) 2023-01-26

Family

ID=77168234

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2021/070667 WO2023001380A1 (en) 2021-07-23 2021-07-23 Generation of a security policy for a software program

Country Status (2)

Country Link
CN (1) CN117730322A (en)
WO (1) WO2023001380A1 (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140040638A1 (en) * 2011-10-11 2014-02-06 Citrix Systems, Inc. Policy-Based Application Management
US20190318100A1 (en) * 2018-04-17 2019-10-17 Oracle International Corporation High granularity application and data security in cloud environments
WO2020106973A1 (en) * 2018-11-21 2020-05-28 Araali Networks, Inc. Systems and methods for securing a workload

Also Published As

Publication number Publication date
CN117730322A (en) 2024-03-19

Similar Documents

Publication Publication Date Title
US11372997B2 (en) Automatic audit logging of events in software applications performing regulatory workloads
US20180330103A1 (en) Automatic Generation of Data-Centric Attack Graphs
US9973472B2 (en) Methods and systems for orchestrating physical and virtual switches to enforce security boundaries
US20200036725A1 (en) Graphical user interface privacy, security and anonymization
US9762439B2 (en) Configuration command template creation assistant using cross-model analysis to identify common syntax and semantics
US9202063B1 (en) Monitoring network-based printing for data loss prevention (DLP)
US10929568B2 (en) Application control
US11470119B2 (en) Native tag-based configuration for workloads in a virtual computing environment
US11687655B2 (en) Secure execution guest owner environmental controls
US20160378987A1 (en) Self-repair and distributed-repair of applications
WO2020205619A1 (en) Intent-based governance service
EP3057282A1 (en) Network flow control device, and security strategy configuration method and device thereof
US9749346B2 (en) Security with respect to managing a shared pool of configurable computing resources
US20160077859A1 (en) Expediting host maintenance mode in cloud computing environments
US20140208320A1 (en) Creating a virtual resource package
US9781013B2 (en) Homogenizing tooling for a heterogeneous cloud environment
CN108450033B (en) Cross-platform streaming data streams
WO2023275665A1 (en) Managing application security vulnerabilities
US10671325B2 (en) Selective secure deletion of data in distributed systems and cloud
US8793783B2 (en) Dynamic allocation of network security credentials for alert notification recipients
WO2023001380A1 (en) Generation of a security policy for a software program
US10296737B2 (en) Security enforcement in the presence of dynamic code loading
EP3139298B1 (en) Information processing system, control method, and control program
EP3188071B1 (en) Application accessing control method and device
CN116097259A (en) Computer file metadata segmentation security system

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21749152

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE