WO2023240428A1 - Mitigation of production secrets in development environments - Google Patents

Mitigation of production secrets in development environments

Info

Publication number
WO2023240428A1
Authority
WO
WIPO (PCT)
Prior art keywords
certificate
production
computing device
cryptographic
development
Prior art date
Application number
PCT/CN2022/098561
Other languages
French (fr)
Inventor
Ziyu Guo
Xiaobo Kang
Kevin Alexander LO
Daiqian HU
Yi Zeng
Xiangzhong WU
Liechuan OU
Original Assignee
Microsoft Technology Licensing, LLC
Priority date
Filing date
Publication date
Application filed by Microsoft Technology Licensing, LLC
Priority to CN202280066598.7A (CN118056379A)
Priority to PCT/CN2022/098561 (WO2023240428A1)
Publication of WO2023240428A1

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04L: TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L 63/00: Network architectures or network communication protocols for network security
    • H04L 63/08: Network architectures or network communication protocols for network security for authentication of entities
    • H04L 63/0823: Network architectures or network communication protocols for network security for authentication of entities using certificates
    • H04L 63/10: Network architectures or network communication protocols for network security for controlling access to devices or network resources
    • H04L 63/105: Multiple levels of security
    • H04L 63/20: Network architectures or network communication protocols for network security for managing network security; network security policies in general

Definitions

  • Embodiments described herein generally relate to computer security and, in some embodiments, more specifically to detection and mitigation of production computing environment cryptographic secrets found in use in a development computing environment.
  • a development computing environment may be used to develop, deploy, or operate production services.
  • the development computing environment may include a variety of computing systems that provide developers with resources to develop software applications and services that may eventually be placed in a production computing environment.
  • the production environment and the development environment may have differing security protocols in place to allow developers more flexibility to design and test features.
  • an engineer in the development environment may interact with production services or production data and may accidentally leak production secrets or production data to a party with nefarious intents.
  • Production infrastructures or services may also have dependencies on systems in the development environment.
  • the potential for production secrets to be present in the development data may pose a security risk. For example, an attacker may initiate a brute force attack to obtain a production secret from the development environment. The attacker may use the production secret obtained from a compromised development system to compromise other development and production systems.
  • FIG. 1 is a block diagram of an example of propagation of an attack using a compromised production secret.
  • FIG. 2 is a block diagram of an example of an attack compromising a production secret in a development environment that poses a risk to a production environment.
  • FIG. 3 is a block diagram of an example of a system for mitigation of production secrets in development environments, according to an embodiment.
  • FIG. 4 is a block diagram of an example of agent deployment for mitigation of production secrets in development environments, according to an embodiment.
  • FIG. 5 illustrates a flow diagram of an example of a process for mitigation of production secrets in development environments, according to an embodiment.
  • FIG. 6 illustrates an example of a method for mitigation of production secrets in development environments, according to an embodiment.
  • FIG. 7 is a block diagram illustrating an example of a machine upon which one or more embodiments may be implemented.
  • Development computing environments may be used by engineering and development teams to develop and test new computing systems and configurations.
  • a developer may include a production secret in the configuration.
  • a production secret refers to security data that may be used in production systems such as, by way of example and not limitation, cryptographic keys.
  • the presence of a production secret in the development environment may compromise security of the production network if a nefarious user intentionally or inadvertently gains access to the production secret from the development environment.
  • the development environment may not employ security measures that are as stringent as the production environment because confidential data may not be expected to be present in the development environment and more lenient security policies may be in place to provide flexibility to developers that may be testing new features for which security policies may not be defined or that are incompatible with production security policies.
  • a production secret may be inadvertently included in the development environment when a development instance of a production system is created in the development environment or may be used in configuration of a development system or by a user of a development system through a misunderstanding of the security risks of using a production secret in the development environment.
  • the systems and techniques discussed herein address risks of the use of production secrets for development system services by identifying production secrets present in the development environment, treating the presence of the production secret in the development environment as a breach, triggering automatic protection, and automatically applying security management to development machine dependencies. For example, development computing systems and services are monitored for breaches to identify attacker actions related to the production services, the development environment is scanned for risk exposure to identify risks to production services, and datacenter access may be restricted for at-risk development computing systems.
  • the technological solution discussed herein secures production services by reducing service compromise risks by proactively monitoring the development environment for production secrets, removing the production secrets from the development environment, and rotating production secrets leaked by development computing systems.
  • the systems and techniques discussed herein provide capability for monitoring for and protecting against production secret compromise in development environments by combining production secret definitions, production secret collection for development and production environments, production secret compromise detection, and auto remediation that targets the source of a compromised production secret.
  • FIG. 1 is a block diagram of an example of propagation of an attack 100 using a compromised production secret.
  • an attacker 105 may target several computing devices in a development environment 110 to attempt to discover secrets (e.g., cryptographic keys, authentication credentials, etc. ) .
  • the attacker 105 may be able to use the discovered secret to breach a security boundary 115 between the development environment 110 and a production environment 120.
  • the attacker 105 may use the discovered secret to perform a variety of cyberattacks including ransomware attacks, unauthorized data exfiltration, etc.
  • FIG. 2 is a block diagram of an example of an attack 200 compromising a production secret in a development environment 215 that poses a risk to a production environment 210.
  • an attacker 205 may connect to the development environment 215 via a network 220 (e.g., the internet, etc. ) to compromise a reverse proxy 220 that may be implemented between outside users and inside resources to process incoming resource requests.
  • the attacker 205 may use a brute force attack (e.g., trying a variety of passwords and authentication schemes to gain access to resources, etc. ) to discover an authentication password for an account with administrative level resource access.
  • the success of the attack may be the result of use of a weak password associated with the administrator account.
  • the discovered authentication credentials may be used to propagate the attack laterally to a computing device 225 inside the development environment.
  • the attacker 205 may collect a set of credentials including secrets using the discovered administrator credentials.
  • the set of credentials may include production secrets.
  • the attacker 205 may be able to use a collected production secret to compromise a production reverse proxy 230 and may be able to gain unauthorized access to production computing systems behind the production reverse proxy 230.
  • FIG. 3 is a block diagram of an example of a system 300 for mitigation of production secrets in development environments, according to an embodiment.
  • the system 300 addresses the attacks described in FIGS. 1 and 2 by removing production certificates from the development environment to prevent certificates retrieved from compromise of the development environment from being used to compromise the production environment.
  • the system 300 addresses the risk of compromised production secrets present in a development environment.
  • the system 300 may include a variety of components including a production certificate inventory 305, a client computing device 330 that may include a key manager client and intrusion detection system client, a development certificate inventory, a matching engine 340, a key performance indicator (KPI) manager 345, a detection engine 350, an automatic remediation agent 355, and a scan controller 360.
  • the production certificate inventory 305 may include a certificate inventory database 320 that is populated with production certificate definitions generated by reconciling key vault metadata 310 and key manager store metadata 315 with certificate mapping data 325.
  • the production certificate definitions in the certificate inventory database 320 define, by way of example and not limitation, certificates that are in-use in a production environment, systems using the certificates, a certificate owner, an owner of production systems using a certificate, etc.
  • the key vault metadata 310 and the key store metadata 315 may provide identification of certificates in use in the production environment.
  • the certificate mapping data 325 may include data that describes systems that are using certificates.
  • the content of the certificate inventory database 320 is generated and updated based on intersections between the certificate identities (e.g., from the key vault metadata 310 and the key manager store metadata 315) and the usage data in the certificate mapping data 325.
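To make the inventory-building step concrete, the following Python sketch shows one way the certificate identities from the key vault and key manager metadata could be intersected with the usage mapping data to populate inventory records. The data shapes (thumbprint, owner, and system fields) and function names are illustrative assumptions, not structures defined by the patent.

```python
from dataclasses import dataclass, field

@dataclass
class InventoryEntry:
    thumbprint: str                     # identity of the certificate
    certificate_owner: str              # owner of the certificate itself
    systems: list = field(default_factory=list)        # production systems using it
    system_owners: list = field(default_factory=list)  # owners of those systems

def build_inventory(key_vault_meta, key_manager_meta, mapping_data):
    """Reconcile certificate identities (310, 315) with usage mapping data (325)."""
    identities = {}
    for record in list(key_vault_meta) + list(key_manager_meta):
        identities[record["thumbprint"]] = record.get("owner", "unknown")

    inventory = {}
    for mapping in mapping_data:                 # which systems use which certificate
        thumbprint = mapping["thumbprint"]
        if thumbprint not in identities:         # keep only the intersection
            continue
        entry = inventory.setdefault(
            thumbprint, InventoryEntry(thumbprint, identities[thumbprint])
        )
        entry.systems.append(mapping["system"])
        entry.system_owners.append(mapping["system_owner"])
    return inventory                             # would populate database 320
```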
  • the client computing device 330 may operate in a development environment and may include a key manager client that manages keys that are used by the client computing device 330 to access network resources.
  • the client computing device 330 may include an intrusion detection system client that detects and reports attempted (or successful) intrusions (e.g., unauthorized access, attacks, etc. ) into the client computing device 330.
  • the key manager client may upload its certificate store into the development certificate inventory 335.
  • the development certificate inventory may include the uploaded certificate data from the client devices (e.g., including the client computing device 330) to maintain an inventory of certificates in use in the development environment.
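Below is a minimal, hypothetical sketch of what the key manager client's collect-and-upload step could look like. The store location, the SHA-1 derived identifier, and the inventory endpoint are assumptions for illustration; a real client would use the platform's certificate store APIs rather than hashing PEM files directly.

```python
import hashlib
import json
import urllib.request
from pathlib import Path

def collect_local_certificates(store_dir: str) -> list:
    """Gather thumbprint-style identifiers for certificates found in a local store."""
    records = []
    for cert_file in Path(store_dir).glob("*.pem"):
        data = cert_file.read_bytes()
        records.append({
            "thumbprint": hashlib.sha1(data).hexdigest(),  # illustrative identity only
            "path": str(cert_file),
        })
    return records

def upload_to_inventory(records: list, device_id: str, inventory_url: str) -> int:
    """Report the collected certificate data to the development certificate inventory."""
    payload = json.dumps({"device": device_id, "certificates": records}).encode()
    request = urllib.request.Request(
        inventory_url,
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(request) as response:
        return response.status
```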
  • the matching engine 340 may compare certificates in the development certificate inventory 335 to certificates in the production certificate inventory datastore 320 (e.g., comparing attributes, a bit level comparison, comparing hash values, etc. ) to identify production certificates in use in the development environment.
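One simple way to realize the hash-style comparison mentioned above is a thumbprint set intersection, sketched here using the illustrative record shapes from the earlier snippets; the field names are assumptions.

```python
def match_production_certificates(dev_records, prod_inventory):
    """Yield development records whose certificate also appears in production."""
    production_thumbprints = set(prod_inventory)      # inventory keyed by thumbprint
    for record in dev_records:
        if record["thumbprint"] in production_thumbprints:
            yield {
                "thumbprint": record["thumbprint"],
                "development_device": record.get("device", "unknown"),
                "production_entry": prod_inventory[record["thumbprint"]],
            }
```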
  • the KPI manager 345 may receive output from the matching engine 340 that identifies the production certificates in use in the development environment.
  • the KPI manager may collect KPIs including, by way of example and not limitation, an identity of the certificate, an identity of a development computing system (e.g., the client computing device 330, etc. ) using the production certificate, an owner of the development computing system using the production certificate, etc. (e.g., from the certificate inventory datastore 320, the development certificate inventory, network configuration data (not shown) , etc. ) .
  • the KPI manager 345 may generate and transmit notifications to owners of systems using the production certificates including the KPIs to provide the owners with information that is used to remediate the risk posed by use of production certificates in the development environment. For example, the KPI manager 345 may calculate a remediation date that provides a deadline for remediation (e.g., certificate removal, rotation, etc. ) . If the production certificate continues to be detected in the development environment after the remediation date, the KPI manager 345 may generate and transmit an escalation message to the owner or a manager of the owner. The KPI manager 345 may transmit messages to an owner of the production certificate and owners of production systems using the production certificate that indicate that the production certificate needs to be rotated (e.g., a new certificate value generated, replace the certificate, etc. ) .
  • the indication provides the certificate manager with identification of the certificate and provides the production system owners with notice that the certificate will be rotated. Additional notifications may be transmitted to the owners as remediation activities are completed. For example, the certificate owner may schedule rotation of the certificate via the KPI manager 345 and the owners may be notified of the scheduled rotation so that production systems may be updated.
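The following sketch illustrates how a KPI manager might assemble KPIs, derive a remediation date, and escalate past-deadline detections. The 14-day window, the notify() helper, and the owner fields are invented for illustration and are not specified by the patent; the production entry is shaped like the earlier InventoryEntry sketch.

```python
from datetime import date, timedelta

REMEDIATION_WINDOW_DAYS = 14  # assumed default deadline, not specified by the patent

def notify(recipient: str, subject: str, kpis: dict) -> None:
    """Placeholder transport; a real system would raise tickets or send mail."""
    print(f"[{subject}] -> {recipient}: certificate {kpis['certificate']}")

def build_kpis(match: dict, device_owner: str, detected_on: date) -> dict:
    entry = match["production_entry"]          # shaped like the earlier InventoryEntry
    return {
        "certificate": match["thumbprint"],
        "development_device": match["development_device"],
        "device_owner": device_owner,
        "certificate_owner": entry.certificate_owner,
        "production_system_owners": list(entry.system_owners),
        "remediation_date": detected_on + timedelta(days=REMEDIATION_WINDOW_DAYS),
    }

def notify_owners(kpis: dict, still_detected: bool, today: date, escalation_contact: str) -> None:
    if still_detected and today > kpis["remediation_date"]:
        # Past the deadline: escalate to the owner and the owner's manager.
        notify(kpis["device_owner"], "ESCALATION", kpis)
        notify(escalation_contact, "ESCALATION", kpis)
    else:
        notify(kpis["device_owner"], "REMEDIATION REQUIRED", kpis)
    # The certificate owner and production system owners are told the certificate
    # will be rotated (a new certificate value generated).
    notify(kpis["certificate_owner"], "ROTATE CERTIFICATE", kpis)
    for owner in kpis["production_system_owners"]:
        notify(owner, "CERTIFICATE WILL BE ROTATED", kpis)
```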
  • the detection engine 350 may receive output from the matching engine 340 that identifies the production certificates in use in the development environment.
  • the detection engine 350 may initiate automatic remediation of the production certificate in the development environment by transmitting certificate data to the automatic remediation agent 355.
  • the automatic remediation agent 355 may work in conjunction with the scan controller 360 to scan for and remove the production certificate from a key store of the client computing device 330 via the key manager client.
  • the detection engine 350 may receive an indication from the KPI manager 345 that remediation of a production certificate has not been completed by the remediation date and the detection engine 350 may cause the auto remediation agent 355 to initiate removal of the production certificate from certificate stores of the computing devices of the development environment to limit risk of compromise of the production certificate.
  • the detection engine 350 may evaluate inputs received from the matching engine 340 along with input telemetry data received from systems operating within the development environment (e.g., the intrusion detection system, etc. ) to output a probability of a compromise of the production certificate.
  • the intrusion detection system client executing on the client computing device 330 may provide inputs to the detection engine 350 that indicate that a request from an external internet protocol (IP) address for a local certificate store that includes the production certificate was successful.
  • the inputs may be evaluated along with the KPI data for the production certificate to calculate a probability that the production certificate has been compromised. For example, the probability may increase if the external IP address is on a blocked list, unknown, etc.
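A toy example of folding intrusion-detection telemetry into a compromise probability is sketched below; the signal names and weights are assumptions chosen only so that a blocked external address reading the certificate store yields a high score.

```python
def compromise_probability(signals: dict) -> float:
    """Combine weighted risk signals into a bounded [0, 1] score."""
    score = 0.2 if signals.get("production_cert_in_dev") else 0.0
    if signals.get("external_request_for_cert_store"):
        score += 0.4                              # IDS saw the store read remotely
        reputation = signals.get("source_ip_reputation", "unknown")
        if reputation == "blocked":
            score += 0.3                          # known-bad address raises the score
        elif reputation == "unknown":
            score += 0.2
    return min(score, 1.0)

# A blocked external address reading the certificate store scores roughly 0.9.
print(compromise_probability({
    "production_cert_in_dev": True,
    "external_request_for_cert_store": True,
    "source_ip_reputation": "blocked",
}))
```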
  • the detection engine 350 may use criticality metrics of systems or certificates in determining remediation actions and instructions to trigger for the detection. For example, the detection engine 350 may provide criticality information to the KPI manager 345 and the remediation date may be adjusted to extend or reduce the period of time between the detection date and the remediation date based on the criticality information.
  • the criticality information may be based on an evaluation of telemetry data for the certificate or systems using the certificate that indicates how much a system is used, security controls in force for data accessible using the certificate, etc.
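One plausible way to fold criticality into the remediation deadline is to scale the remediation window, as in this hypothetical sketch; the scaling rule and default window are assumptions, not part of the patent.

```python
from datetime import date, timedelta

def remediation_deadline(detected_on: date, criticality: float,
                         base_days: int = 14, min_days: int = 1) -> date:
    """Higher criticality (0..1) shortens the window; lower criticality extends it."""
    scaled = int(round(base_days * (1.5 - criticality)))   # 1.5x of base down to 0.5x
    return detected_on + timedelta(days=max(min_days, scaled))

# A maximally critical certificate gets roughly half the default window (7 days).
print(remediation_deadline(date(2022, 6, 14), criticality=1.0))  # 2022-06-21
```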
  • the scan controller 360 (e.g., a scanning and assessment agent) may cause (e.g., periodically, on demand, etc.) the client computing device 330 to collect and upload the certificate information to the development certificate inventory 335.
  • the automatic remediation agent 355 and the KPI manager 345 (e.g., mitigation agents) remediate detections of production certificates in the development environment as previously described to prevent the production certificate from being compromised.
  • Mitigation agents may be implemented in the production network to remediate the detections by notifying owners that the production certificate was at risk and providing remediation instructions (or automatically facilitating remediation) to limit risk of attack using a potentially compromised production certificate.
  • FIG. 4 is a block diagram of an example of agent deployment 400 for mitigation of production secrets in development environments, according to an embodiment.
  • a development environment 405 may include a scanning and assessment agent 415 (e.g., the scan controller 360 as described in FIG. 3, etc.) that may cause a client computing device or a software application executing on a client computing device to collect and report contents of a local certificate store to monitor certificate use in the development environment 405 for early detection of security threats.
  • the mitigation agent (e.g., the KPI manager 345 as described in FIG. 3, the automatic remediation agent 355 as described in FIG. 3, etc.) may mitigate the production certificate detected in the development environment to provide early remediation before a threat may extend to a production environment 410.
  • the certificate may be removed from the development environment and owners of the certificate and systems utilizing the certificate may be notified with KPIs with descriptive information including detection metrics and remediation activities. This may be accomplished, in part, by implementing mitigation agents 425 in the production environment 410 to notify production personnel that the systems have been exposed to risk of a compromised production certificate to identify attackers and prevent lateral spread of an attack. In an example, detections that indicate a high probability (e.g., probability greater than 0.90, etc.) may cause automated remediation that may prevent systems from authenticating using the production certificate.
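The high-probability gate described above could look like the following sketch, where only scores above the 0.90 threshold trigger automated blocking of authentication with the certificate; the helper callables are stubs standing in for production-side mitigation agents and are not named by the patent.

```python
PROBABILITY_THRESHOLD = 0.90   # threshold from the text ("greater than 0.90")

def handle_detection(probability: float, thumbprint: str,
                     block_certificate, notify_owners) -> str:
    if probability > PROBABILITY_THRESHOLD:
        block_certificate(thumbprint)   # prevent systems from authenticating with it
        notify_owners(thumbprint, "certificate blocked pending rotation")
        return "blocked"
    notify_owners(thumbprint, "certificate at risk; remediation required")
    return "notified"

# Stub callables stand in for the production-side mitigation agents (425).
result = handle_detection(
    0.95,
    "ab12cd34",
    block_certificate=lambda t: print(f"blocking authentication for {t}"),
    notify_owners=lambda t, msg: print(f"{t}: {msg}"),
)
```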
  • FIG. 5 illustrates a flow diagram of an example of a process 500 for mitigation of production secrets in development environments, according to an embodiment.
  • the process 500 may provide features as described in FIGS. 3 and 4.
  • Development computing systems may be scanned to obtain certificate information (e.g., from a local certificate store, etc. ) for certificates that are used by the computing system (e.g., at operation 505) .
  • a scan controller may periodically (e.g., nightly, etc.) schedule a scan of the local certificate stores of development machines operating in a development environment via a security client executing on the development machines.
  • the security client may report, upload, or otherwise transmit the certificate information to a certificate datastore.
  • a development certificate inventory may be generated using the certificate information (e.g., at operation 510) .
  • certificate information from the development machines may be aggregated, deduplicated, etc. to generate (and update) the certificate inventory.
  • Production certificate metadata and certificate mapping data may be collected (e.g., at operation 515) to assemble a production certificate inventory (e.g., at operation 520) .
  • certificate metadata may be collected from a key vault datastore and from a key manager datastore.
  • the metadata may be evaluated with certificate mapping data from a system configuration datastore to identify certificates that are in use in a production environment.
  • a certificate in the development certificate inventory may be evaluated (e.g., compared, etc. ) to the production certificate inventory (e.g., at operation 525) to determine if the certificate is a production certificate (e.g., at decision 530) . If the certificate from the development certificate inventory is determined to be a production certificate (e.g., at decision 530) , it may be detected that the production certificate has been compromised (e.g., at operation 540) and key performance indicators (KPIs) may be transmitted to an owner of the certificate and owners of computing systems using the certificates (e.g., at operation 535) . The KPIs may be transmitted to owners of computing systems, services, functions, etc.
  • KPIs may be specific to the owner.
  • the certificate owner may receive KPIs that include an identity of the certificate, a deadline for updating the certificate, systems using the certificate, location of the certificate (e.g., a key store, repository, file location, etc. ) , and other relevant KPIs that provide the certificate owner with information needed to remediate detection of the certificate in the development environment.
  • an owner of a system using the certificate may receive KPIs that include the certificate identity, the system using the certificate for which they are the owner, a deadline for remediation, etc.
  • Detection of the compromised certificate may lead to automated remediation instruction generation (e.g., at operation 545) .
  • computer executable instructions 550 may be generated to remove the certificate from a local certificate store in which the certificate was detected, to regenerate the certificate (e.g., to generate a new hash, etc. ) , etc.
  • the instructions 550 may be transmitted to a mitigation agent (e.g., at operation 555) that may execute the mitigation instructions 550 to perform automatic remediations.
  • a certificate removal instruction may schedule removal of the certificate from a certificate store of a development machine via a security application client executing on the development machine.
  • the instructions may cause a key generator to regenerate the certificate with a new hash value.
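Putting operations 545 and 555 together, a hedged sketch of instruction generation and a mitigation agent that executes them might look as follows; the instruction format and the key manager and key generator interfaces are assumptions for illustration.

```python
def generate_remediation_instructions(thumbprint: str, device: str) -> list:
    """Operation 545: build instructions targeting the source of the detection."""
    return [
        {"action": "remove_certificate", "device": device, "thumbprint": thumbprint},
        {"action": "rotate_certificate", "thumbprint": thumbprint},
    ]

class MitigationAgent:
    """Operation 555: execute the instructions against client-side components."""

    def __init__(self, key_manager_clients: dict, key_generator):
        self.key_manager_clients = key_manager_clients   # device id -> client stub
        self.key_generator = key_generator               # regenerates certificates

    def execute(self, instructions: list) -> None:
        for step in instructions:
            if step["action"] == "remove_certificate":
                # Schedule removal from the local certificate store via the
                # security/key manager client on the development machine.
                self.key_manager_clients[step["device"]].remove(step["thumbprint"])
            elif step["action"] == "rotate_certificate":
                # Regenerate the certificate so the leaked value becomes useless.
                self.key_generator.regenerate(step["thumbprint"])
```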
  • FIG. 6 illustrates an example of a method 600 for mitigation of production secrets in development environments, according to an embodiment.
  • the method 600 may provide features as described in FIGS. 3 to 5.
  • Certificate data may be obtained for a cryptographic certificate in use by the development computing device (e.g., at operation 605) .
  • a certificate data collection request may be transmitted to a key manager application executing on the development computing device and the certificate data may be obtained as output from a scan of the development computing device completed in response to the certificate collection request.
  • the certificate data may be evaluated using data from a production certificate inventory to determine that the cryptographic certificate is a production cryptographic certificate (e.g., at operation 610) .
  • certificate metadata may be obtained from a certificate data source.
  • the certificate metadata may be evaluated using production certificate mapping data to identify production certificates in use in a production environment and the production certificate inventory may be generated using the identified production certificates.
  • Key performance indicators (KPIs) may be collected for the cryptographic certificate for an owner of the production cryptographic certificate (e.g., at operation 615) .
  • the KPIs may include at least one of an identifier of the production cryptographic certificate, an identifier of the development computing device, an identifier of a computing resource accessible by the computing device using the cryptographic certificate, or a remediation completion date for the production cryptographic certificate.
  • the KPIs may be transmitted to the owner of the production cryptographic certificate (e.g., at operation 620) .
  • KPIs may be collected for the cryptographic certificate for an owner of the development computing device and the KPIs may be transmitted to the owner of the development computing device.
  • a production computing device may be identified that is using the production cryptographic certificate using the production certificate inventory. KPIs may be collected for the production cryptographic certificate for an owner of the production computing device and the KPIs may be transmitted to the owner of the production computing device.
  • the cryptographic certificate may be automatically removed from a certificate store of the development computing device (e.g., at operation 625) .
  • certificate removal instructions may be generated for the cryptographic certificate and the certificate removal instructions may be transmitted to a key manager application executing on the development computing device.
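For orientation, here is a compact, assumption-laden sketch that wires the steps of the method (operations 605 through 625) together in order; every component interface shown is hypothetical and only illustrates the sequencing.

```python
from datetime import date

def mitigate_development_device(device: str, scan_client, prod_inventory: dict,
                                kpi_manager, remediation_agent) -> None:
    # 605: obtain certificate data for certificates in use on the device.
    for record in scan_client.scan(device):
        # 610: evaluate the certificate against the production certificate inventory.
        entry = prod_inventory.get(record["thumbprint"])
        if entry is None:
            continue                      # not a production certificate
        # 615: collect KPIs for the owner of the production cryptographic certificate.
        kpis = kpi_manager.build(record, entry, detected_on=date.today())
        # 620: transmit the KPIs to the certificate owner.
        kpi_manager.send(entry.certificate_owner, kpis)
        # 625: automatically remove the certificate from the device's store.
        remediation_agent.remove(device, record["thumbprint"])
```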
  • FIG. 7 illustrates a block diagram of an example machine 700 upon which any one or more of the techniques (e.g., methodologies) discussed herein may perform.
  • the machine 700 may operate as a standalone device or may be connected (e.g., networked) to other machines.
  • the machine 700 may operate in the capacity of a server machine, a client machine, or both in server-client network environments.
  • the machine 700 may act as a peer machine in peer-to-peer (P2P) (or other distributed) network environment.
  • the machine 700 may be a personal computer (PC) , a tablet PC, a set-top box (STB) , a personal digital assistant (PDA) , a mobile telephone, a web appliance, a network router, switch or bridge, or any machine capable of executing instructions (sequential or otherwise) that specify actions to be taken by that machine.
  • the term “machine” shall also be taken to include any collection of machines that individually or jointly execute a set (or multiple sets) of instructions to perform any one or more of the methodologies discussed herein, such as cloud computing, software as a service (SaaS), or other computer cluster configurations.
  • Circuit sets are a collection of circuits implemented in tangible entities that include hardware (e.g., simple circuits, gates, logic, etc.). Circuit set membership may be flexible over time and underlying hardware variability. Circuit sets include members that may, alone or in combination, perform specified operations when operating. In an example, hardware of the circuit set may be immutably designed to carry out a specific operation (e.g., hardwired). In an example, the hardware of the circuit set may include variably connected physical components (e.g., execution units, transistors, simple circuits, etc.) including a computer readable medium physically modified (e.g., magnetically, electrically, moveable placement of invariant massed particles, etc.) to encode instructions of the specific operation.
  • the instructions enable embedded hardware (e.g., the execution units or a loading mechanism) to create members of the circuit set in hardware via the variable connections to carry out portions of the specific operation when in operation.
  • the computer readable medium is communicatively coupled to the other components of the circuit set member when the device is operating.
  • any of the physical components may be used in more than one member of more than one circuit set.
  • execution units may be used in a first circuit of a first circuit set at one point in time and reused by a second circuit in the first circuit set, or by a third circuit in a second circuit set at a different time.
  • Machine 700 may include a hardware processor 702 (e.g., a central processing unit (CPU) , a graphics processing unit (GPU) , a hardware processor core, or any combination thereof) , a main memory 704 and a static memory 706, some or all of which may communicate with each other via an interlink (e.g., bus) 708.
  • the machine 700 may further include a display unit 710, an alphanumeric input device 712 (e.g., a keyboard) , and a user interface (UI) navigation device 714 (e.g., a mouse) .
  • the display unit 710, input device 712 and UI navigation device 714 may be a touch screen display.
  • the machine 700 may additionally include a storage device (e.g., drive unit) 716, a signal generation device 718 (e.g., a speaker) , a network interface device 720, and one or more sensors 721, such as a global positioning system (GPS) sensor, compass, accelerometer, or other sensors.
  • the machine 700 may include an output controller 728, such as a serial (e.g., universal serial bus (USB)), parallel, or other wired or wireless (e.g., infrared (IR), near field communication (NFC), etc.) connection to communicate or control one or more peripheral devices (e.g., a printer, card reader, etc.).
  • the storage device 716 may include a machine readable medium 722 on which is stored one or more sets of data structures or instructions 724 (e.g., software) embodying or utilized by any one or more of the techniques or functions described herein.
  • the instructions 724 may also reside, completely or at least partially, within the main memory 704, within static memory 706, or within the hardware processor 702 during execution thereof by the machine 700.
  • one or any combination of the hardware processor 702, the main memory 704, the static memory 706, or the storage device 716 may constitute machine readable media.
  • While the machine readable medium 722 is illustrated as a single medium, the term “machine readable medium” may include a single medium or multiple media (e.g., a centralized or distributed database, and/or associated caches and servers) configured to store the one or more instructions 724.
  • the term “machine readable medium” may include any medium that is capable of storing, encoding, or carrying instructions for execution by the machine 700 and that cause the machine 700 to perform any one or more of the techniques of the present disclosure, or that is capable of storing, encoding or carrying data structures used by or associated with such instructions.
  • Non-limiting machine readable medium examples may include solid-state memories, and optical and magnetic media.
  • machine readable media may exclude transitory propagating signals (e.g., non-transitory machine-readable storage media) .
  • non-transitory machine-readable storage media may include: non-volatile memory, such as semiconductor memory devices (e.g., Electrically Programmable Read-Only Memory (EPROM) , Electrically Erasable Programmable Read-Only Memory (EEPROM) ) and flash memory devices; magnetic disks, such as internal hard disks and removable disks; magneto-optical disks; and CD-ROM and DVD-ROM disks.
  • the instructions 724 may further be transmitted or received over a communications network 726 using a transmission medium via the network interface device 720 utilizing any one of a number of transfer protocols (e.g., frame relay, internet protocol (IP) , transmission control protocol (TCP) , user datagram protocol (UDP) , hypertext transfer protocol (HTTP) , etc. ) .
  • Example communication networks may include a local area network (LAN), a wide area network (WAN), a packet data network (e.g., the Internet), mobile telephone networks (e.g., cellular networks), Plain Old Telephone (POTS) networks, and wireless data networks (e.g., the Institute of Electrical and Electronics Engineers (IEEE) 802.11 family of standards, LPWAN standards, etc.).
  • the network interface device 720 may include one or more physical jacks (e.g., Ethernet, coaxial, or phone jacks) or one or more antennas to connect to the communications network 726.
  • the network interface device 720 may include a plurality of antennas to wirelessly communicate using at least one of single-input multiple-output (SIMO) , multiple-input multiple-output (MIMO) , or multiple-input single-output (MISO) techniques.
  • the term “transmission medium” shall be taken to include any intangible medium that is capable of storing, encoding or carrying instructions for execution by the machine 700, and includes digital or analog communications signals or other intangible medium to facilitate communication of such software.
  • Example 1 is a system for mitigation of production certificates detected in a development environment comprising: at least one processor; and memory including instructions that, when executed by the at least one processor, cause the at least one processor to perform operations to: obtain certificate data for a cryptographic certificate in use by the development computing device; evaluate the certificate data using data from a production certificate inventory to determine that the cryptographic certificate is a production cryptographic certificate; collect key performance indicators (KPIs) for the cryptographic certificate for an owner of the production cryptographic certificate; transmit the KPIs to the owner of the production cryptographic certificate; and automatically remove the cryptographic certificate from a certificate store of the development computing device.
  • Example 2 the subject matter of Example 1 includes, the memory further comprising instructions that, when executed by the at least one processor, cause the at least one processor to perform operations to: transmit a certificate data collection request to a key manager application executing on the development computing device; and obtain the certificate data as output from a scan of the development computing device completed in response to the certificate collection request.
  • Example 3 the subject matter of Examples 1–2 includes, the memory further comprising instructions that, when executed by the at least one processor, cause the at least one processor to perform operations to: obtain certificate metadata from a certificate data source; evaluate the certificate metadata using production certificate mapping data to identify production certificates in use in a production environment; and generate the production certificate inventory using the identified production certificates.
  • Example 4 the subject matter of Examples 1–3 wherein, the KPIs include at least one of an identifier of the production cryptographic certificate, an identifier of the development computing device, an identifier of a computing resource accessible by the computing device using the cryptographic certificate, or a remediation completion data for the production cryptographic certificate.
  • Example 5 the subject matter of Examples 1–4 includes, the memory further comprising instructions that, when executed by the at least one processor, cause the at least one processor to perform operations to: collect KPIs for the cryptographic certificate for an owner of the development computing device; and transmit the KPIs to the owner of the development computing device.
  • Example 6 the subject matter of Examples 1–5 includes, the memory further comprising instructions that, when executed by the at least one processor, cause the at least one processor to perform operations to: identify a production computing device using the production cryptographic certificate using the production certificate inventory; collect KPIs for the production cryptographic certificate for an owner of the production computing device; and transmit the KPIs to the owner of the production computing device.
  • Example 7 the subject matter of Examples 1–6 includes, the instructions to automatically remove the cryptographic certificate from the certificate store of the development computing device further comprising instructions that, when executed by the at least one processor, cause the at least one processor to perform operations to: generate certificate removal instructions for the cryptographic instructions; and transmit the certificate removal instructions to a key manager application executing on the development computing device.
  • Example 8 is a method for mitigation of production certificates detected in a development environment comprising: obtaining certificate data for a cryptographic certificate in use by the development computing device; evaluating the certificate data using data from a production certificate inventory to determine that the cryptographic certificate is a production cryptographic certificate; collecting key performance indicators (KPIs) for the cryptographic certificate for an owner of the production cryptographic certificate; transmitting the KPIs to the owner of the production cryptographic certificate; and automatically removing the cryptographic certificate from a certificate store of the development computing device.
  • Example 9 the subject matter of Example 8 includes, transmitting a certificate data collection request to a key manager application executing on the development computing device; and obtaining the certificate data as output from a scan of the development computing device completed in response to the certificate collection request.
  • Example 10 the subject matter of Examples 8–9 includes, obtaining certificate metadata from a certificate data source; evaluating the certificate metadata using production certificate mapping data to identify production certificates in use in a production environment; and generating the production certificate inventory using the identified production certificates.
  • Example 11 the subject matter of Examples 8–10 wherein, the KPIs include at least one of an identifier of the production cryptographic certificate, an identifier of the development computing device, an identifier of a computing resource accessible by the computing device using the cryptographic certificate, or a remediation completion data for the production cryptographic certificate.
  • Example 12 the subject matter of Examples 8–11 includes, collecting KPIs for the cryptographic certificate for an owner of the development computing device; and transmitting the KPIs to the owner of the development computing device.
  • Example 13 the subject matter of Examples 8–12 includes, identifying a production computing device using the production cryptographic certificate using the production certificate inventory; collecting KPIs for the production cryptographic certificate for an owner of the production computing device; and transmitting the KPIs to the owner of the production computing device.
  • Example 14 the subject matter of Examples 8–13 includes, automatically removing the cryptographic certificate from the certificate store of the development computing device further comprising: generating certificate removal instructions for the cryptographic instructions; and transmitting the certificate removal instructions to a key manager application executing on the development computing device.
  • Example 15 is at least one machine-readable medium including instructions that, when executed by a machine, cause the machine to perform any method of Examples 8–14.
  • Example 16 is a system comprising means to perform any method of Examples 8–14.
  • Example 17 is at least one non-transitory machine-readable medium including instructions for mitigation of production certificates detected in a development environment that, when executed by at least one processor, cause the at least one processor to perform operations to: obtain certificate data for a cryptographic certificate in use by the development computing device; evaluate the certificate data using data from a production certificate inventory to determine that the cryptographic certificate is a production cryptographic certificate; collect key performance indicators (KPIs) for the cryptographic certificate for an owner of the production cryptographic certificate; transmit the KPIs to the owner of the production cryptographic certificate; and automatically remove the cryptographic certificate from a certificate store of the development computing device.
  • Example 18 the subject matter of Example 17 includes, instructions that, when executed by the at least one processor, cause the at least one processor to perform operations to: transmit a certificate data collection request to a key manager application executing on the development computing device; and obtain the certificate data as output from a scan of the development computing device completed in response to the certificate collection request.
  • Example 19 the subject matter of Examples 17–18 includes, instructions that, when executed by the at least one processor, cause the at least one processor to perform operations to: obtain certificate metadata from a certificate data source; evaluate the certificate metadata using production certificate mapping data to identify production certificates in use in a production environment; and generate the production certificate inventory using the identified production certificates.
  • Example 20 the subject matter of Examples 17–19 wherein, the KPIs include at least one of an identifier of the production cryptographic certificate, an identifier of the development computing device, an identifier of a computing resource accessible by the computing device using the cryptographic certificate, or a remediation completion data for the production cryptographic certificate.
  • Example 21 the subject matter of Examples 17–20 includes, instructions that, when executed by the at least one processor, cause the at least one processor to perform operations to: collect KPIs for the cryptographic certificate for an owner of the development computing device; and transmit the KPIs to the owner of the development computing device.
  • Example 22 the subject matter of Examples 17–21 includes, instructions that, when executed by the at least one processor, cause the at least one processor to perform operations to: identify a production computing device using the production cryptographic certificate using the production certificate inventory; collect KPIs for the production cryptographic certificate for an owner of the production computing device; and transmit the KPIs to the owner of the production computing device.
  • Example 23 the subject matter of Examples 17–22 includes, the instructions to automatically remove the cryptographic certificate from the certificate store of the development computing device further comprising instructions that, when executed by the at least one processor, cause the at least one processor to perform operations to: generate certificate removal instructions for the cryptographic instructions; and transmit the certificate removal instructions to a key manager application executing on the development computing device.
  • Example 24 is at least one machine-readable medium including instructions that, when executed by processing circuitry, cause the processing circuitry to perform operations to implement of any of Examples 1–23.
  • Example 25 is an apparatus comprising means to implement of any of Examples 1–23.
  • Example 26 is a system to implement of any of Examples 1–23.
  • Example 27 is a method to implement of any of Examples 1–23.
  • the terms “a” or “an” are used, as is common in patent documents, to include one or more than one, independent of any other instances or usages of “at least one” or “one or more. ”
  • the term “or” is used to refer to a nonexclusive or, such that “A or B” includes “A but not B, ” “B but not A, ” and “A and B, ” unless otherwise indicated.
  • the terms “including” and “in which” are used as the plain-English equivalents of the respective terms “comprising” and “wherein.”

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Security & Cryptography (AREA)
  • Computer Hardware Design (AREA)
  • Computing Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)
  • Storage Device Security (AREA)

Abstract

Systems and techniques for mitigation of production secrets in development environments are described herein. Certificate data may be obtained for a cryptographic certificate in use by the development computing device. The certificate data may be evaluated using data from a production certificate inventory to determine that the cryptographic certificate is a production cryptographic certificate. Key performance indicators (KPIs) may be collected for the cryptographic certificate for an owner of the production cryptographic certificate. The KPIs may be transmitted to the owner of the production cryptographic certificate. The cryptographic certificate may be automatically removed from a certificate store of the development computing device.

Description

Mitigation of Production Secrets in Development Environments
TECHNICAL FIELD
Embodiments described herein generally relate to computer security and, in some embodiments, more specifically to detection and mitigation of production computing environment cryptographic secrets found in use in a development computing environment.
BACKGROUND
A development computing environment may be used to develop, deploy, or operate production services. The development computing environment may include a variety of computing systems that provide developers with resources to develop software applications and services that may eventually be placed in a production computing environment. The production environment and the development environment may have differing security protocols in place to allow developers more flexibility to design and test features. There is a possibility that an engineer in the development environment may interact with production services or production data and may accidentally leak production secrets or production data to a party with nefarious intents. Production infrastructures or services may also have dependencies on systems in the development environment. The potential for production secrets to be present in the development data may pose a security risk. For example, an attacker may initiate a brute force attack to obtain a production secret from the development environment. The attacker may use the production secret obtained from a compromised development system to compromise other development and production systems.
BRIEF DESCRIPTION OF THE DRAWINGS
In the drawings, which are not necessarily drawn to scale, like numerals may describe similar components in different views. Like numerals having different letter suffixes may represent different instances of similar  components. The drawings illustrate generally, by way of example, but not by way of limitation, various embodiments discussed in the present document.
FIG. 1 is a block diagram of an example of propagation of an attack using a compromised production secret.
FIG. 2 is a block diagram of an example of an attack compromising a production secret in a development environment that poses a risk to a production environment.
FIG. 3 is a block diagram of an example of a system for mitigation of production secrets in development environments, according to an embodiment.
FIG. 4 is a block diagram of an example of agent deployment for mitigation of production secrets in development environments, according to an embodiment.
FIG. 5 illustrates a flow diagram of an example of a process for mitigation of production secrets in development environments, according to an embodiment.
FIG. 6 illustrates an example of a method for mitigation of production secrets in development environments, according to an embodiment.
FIG. 7 is a block diagram illustrating an example of a machine upon which one or more embodiments may be implemented.
DETAILED DESCRIPTION
Development computing environments may be used by engineering and development teams to develop and test new computing systems and configurations. When configuring development systems, a developer may include a production secret in the configuration. As used herein, a production secret refers to security data that may be used in production systems such as, by way of example and not limitation, cryptographic keys. The presence of a production secret in the development environment may compromise security of the production network if a nefarious user intentionally or inadvertently gains access to the production secret from the development environment. The development environment may not employ security measures that are as stringent as the production environment because  confidential data may not be expected to be present in the development environment and more lenient security policies may be in place to provide flexibility to developers that may be testing new features for which security policies may not be defined or that are incompatible with production security policies.
A production secret may be inadvertently included in the development environment when a development instance of a production system is created in the development environment or may be used in configuration of a development system or by a user of a development system through a misunderstanding of the security risks of using a production secret in the development environment. The systems and techniques discussed herein address risks of the use of production secrets for development system services by identifying production secrets present in the development environment, treating the presence of the production secret in the development environment as a breach, triggering automatic protection, and automatically applying security management to development machine dependencies. For example, development computing systems and services are monitored for breaches to identify attacker actions related to the production services, the development environment is scanned for risk exposure to identify risks to production services, and datacenter access may be restricted for at-risk development computing systems. The technological solution discussed herein secures production services by reducing service compromise risks by proactively monitoring the development environment for production secrets, removing the production secrets from the development environment, and rotating production secrets leaked by development computing systems.
Conventional solutions for addressing risks of production secret compromise may address internal datacenter risks while reducing and restricting development computing system dependencies. While this strategy may reduce risks of production secret compromise, some development computing system dependencies may be unavoidable because of flexibility requirements of the development and testing environment. There may be human risks of engineers alternating between interacting with development systems and production systems during a work period. Conventional techniques that reduce dependencies on  development computing systems need to balance security of eliminated dependencies with practicality and engineering productivity that may require dependencies. Thus, conventional production secret protection techniques may not be able to address the flexibility requirements of a development environment while maintaining effective production secret protection. For example, because conventional techniques rely on elimination of dependencies, the persistence of a dependency required for development or testing renders the conventional approach ineffective.
The systems and techniques discussed herein provide capability for monitoring for and protecting against production secret compromise in development environments by combining production secret definitions, production secret collection for development and production environments, production secret compromise detection, and auto remediation that targets the source of a compromised production secret.
FIG. 1 is a block diagram of an example of propagation of an attack 100 using a compromised production secret. In the attack 100, an attacker 105 may target several computing devices in a development environment 110 to attempt to discover secrets (e.g., cryptographic keys, authentication credentials, etc. ) . Upon the attacker 105 discovering a secret, the attacker 105 may be able to use the discovered secret to breach a security boundary 115 between the development environment 110 and a production environment 120. Additionally or alternatively, the attacker 105 may use the discovered secret to perform a variety of cyberattacks including ransomware attacks, unauthorized data exfiltration, etc.
FIG. 2 is a block diagram of an example of an attack 200 compromising a production secret in a development environment 215 that poses a risk to a production environment 210. In the attack 200, an attacker 205 may connect to the development environment 215 via a network 220 (e.g., the internet, etc. ) to compromise a reverse proxy 220 that may be implemented between outside users and inside resources to process incoming resource requests. The attacker 205 may use a brute force attack (e.g., trying a variety of passwords and authentication schemes to gain access to resources, etc. ) to discover an authentication password for  an account with administrative level resource access. The success of the attack may be the result of use of a weak password associated with the administrator account. The discovered authentication credentials may be used to propagate the attack laterally to a computing device 225 inside the development environment. The attacker 205 may collect a set of credentials including secrets using the discovered administrator credentials. The set of credentials may include production secrets. The attacker 205 may be able to use a collected production secret to compromise a production reverse proxy 230 and may be able to gain unauthorized access to production computing systems behind the production reverse proxy 230.
FIG. 3 is a block diagram of an example of a system 300 for mitigation of production secrets in development environments, according to an embodiment. The system 300 addresses the attacks described in FIGS. 1 and 2 by removing production certificates from the development environment to prevent certificates obtained through a compromise of the development environment from being used to compromise the production environment. The system 300 addresses the risk of compromised production secrets present in a development environment. The system 300 may include a variety of components including a production certificate inventory 305, a client computing device 330 that may include a key manager client and an intrusion detection system client, a development certificate inventory 335, a matching engine 340, a key performance indicator (KPI) manager 345, a detection engine 350, an automatic remediation agent 355, and a scan controller 360.
The production certificate inventory 305 may include a certificate inventory database 320 that is populated with production certificate definitions generated by reconciling key vault metadata 310 and key manager store metadata 315 with certificate mapping data 325. The production certificate definitions in the certificate inventory database 320 define, by way of example and not limitation, certificates that are in use in a production environment, systems using the certificates, a certificate owner, an owner of production systems using a certificate, etc. The key vault metadata 310 and the key manager store metadata 315 may provide identification of certificates in use in the production environment. The certificate mapping data 325 may include data that describes systems that are using certificates. The content of the certificate inventory database 320 is generated and updated based on intersections between the certificate identities (e.g., from the key vault metadata 310 and the key manager store metadata 315) and the usage data in the certificate mapping data 325.
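By way of illustration and not limitation, the following Python sketch shows one way the reconciliation described above may be performed; the input data shapes and field names (thumbprint, owner, system, system_owner) are assumptions for the sketch rather than elements prescribed by the disclosure.

def build_production_inventory(key_vault_meta, key_manager_meta, cert_mapping):
    """Reconcile certificate identities (key vault / key manager store metadata)
    with usage records (certificate mapping data) into an inventory keyed by
    certificate thumbprint. Inputs are iterables of dicts."""
    identities = {}
    for record in list(key_vault_meta) + list(key_manager_meta):
        identities.setdefault(record["thumbprint"], record.get("owner", "unknown"))

    inventory = {}
    for usage in cert_mapping:
        thumbprint = usage["thumbprint"]
        if thumbprint not in identities:
            continue  # keep only the intersection: known identity AND observed production usage
        entry = inventory.setdefault(thumbprint, {
            "certificate_owner": identities[thumbprint],
            "systems": [],
            "system_owners": set(),
        })
        entry["systems"].append(usage["system"])
        entry["system_owners"].add(usage["system_owner"])
    return inventory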
The client computing device 330 may operate in a development environment and may include a key manager client that manages keys that are used by the client computing device 330 to access network resources. The client computing device 330 may include an intrusion detection system client that detects and reports attempted (or successful) intrusions (e.g., unauthorized access, attacks, etc.) into the client computing device 330. The key manager client may upload its certificate store into the development certificate inventory 335. The development certificate inventory 335 may include the uploaded certificate data from the client devices (e.g., including the client computing device 330) to maintain an inventory of certificates in use in the development environment.
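By way of illustration and not limitation, a minimal Python sketch of the client-side collection and upload follows; the file-based certificate store, the SHA-256 thumbprinting, and the upload endpoint URL are assumed stand-ins for whatever mechanism the key manager client actually uses.

import hashlib
import json
import socket
import urllib.request

def collect_local_certificates(cert_paths):
    """Read certificate files and summarize each by a SHA-256 thumbprint."""
    records = []
    for path in cert_paths:
        with open(path, "rb") as handle:
            blob = handle.read()
        records.append({
            "thumbprint": hashlib.sha256(blob).hexdigest(),
            "path": path,
            "host": socket.gethostname(),
        })
    return records

def upload_certificate_records(records, upload_endpoint):
    """POST the collected records to the development certificate inventory."""
    payload = json.dumps({"certificates": records}).encode("utf-8")
    request = urllib.request.Request(
        upload_endpoint, data=payload,
        headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(request) as response:
        return response.status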
The matching engine 340 may compare certificates in the development certificate inventory 335 to certificates in the certificate inventory database 320 (e.g., comparing attributes, a bit level comparison, comparing hash values, etc.) to identify production certificates in use in the development environment. The KPI manager 345 may receive output from the matching engine 340 that identifies the production certificates in use in the development environment. The KPI manager 345 may collect KPIs including, by way of example and not limitation, an identity of the certificate, an identity of a development computing system (e.g., the client computing device 330, etc.) using the production certificate, an owner of the development computing system using the production certificate, etc. (e.g., from the certificate inventory database 320, the development certificate inventory 335, network configuration data (not shown), etc.).
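By way of illustration and not limitation, the matching step may be sketched as a thumbprint lookup; the record fields and the inventory shape (as built in the previous sketch) are assumptions, and attribute-level or bit-level comparisons could be substituted where a hash comparison alone is not sufficient.

def find_production_certificates_in_dev(dev_records, production_inventory):
    """dev_records: iterable of dicts with 'thumbprint' and 'host' (as uploaded by
    the key manager clients); production_inventory: mapping of thumbprint -> dict
    with 'certificate_owner' and 'systems'."""
    matches = []
    for record in dev_records:
        entry = production_inventory.get(record["thumbprint"])
        if entry is not None:  # hash-level match against the production inventory
            matches.append({
                "thumbprint": record["thumbprint"],
                "development_host": record["host"],
                "certificate_owner": entry["certificate_owner"],
                "production_systems": list(entry["systems"]),
            })
    return matches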
The KPI manager 345 may generate and transmit notifications to owners of systems using the production certificates, including the KPIs, to provide the owners with information that is used to remediate the risk posed by use of production certificates in the development environment. For example, the KPI manager 345 may calculate a remediation date that provides a deadline for remediation (e.g., certificate removal, rotation, etc.). If the production certificate continues to be detected in the development environment after the remediation date, the KPI manager 345 may generate and transmit an escalation message to the owner or a manager of the owner. The KPI manager 345 may transmit messages to an owner of the production certificate and owners of production systems using the production certificate that indicate that the production certificate needs to be rotated (e.g., a new certificate value generated, the certificate replaced, etc.). The indication provides the certificate owner with identification of the certificate and provides the production system owners with notice that the certificate will be rotated. Additional notifications may be transmitted to the owners as remediation activities are completed. For example, the certificate owner may schedule rotation of the certificate via the KPI manager 345 and the owners may be notified of the scheduled rotation so that production systems may be updated.
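By way of illustration and not limitation, the remediation-date and escalation behavior may be sketched as follows; the fourteen-day window and the notify() transport are assumptions rather than values prescribed by the disclosure, and the match record is the one produced in the previous sketch.

from datetime import timedelta

REMEDIATION_WINDOW = timedelta(days=14)  # assumed default; criticality may shorten or extend it

def build_kpi_notification(match, detection_date):
    """match: dict from the matching step; detection_date: datetime.date."""
    return {
        "certificate": match["thumbprint"],
        "development_host": match["development_host"],
        "owner": match["certificate_owner"],
        "production_systems": match["production_systems"],
        "remediation_date": (detection_date + REMEDIATION_WINDOW).isoformat(),
    }

def escalate_if_overdue(kpi, still_detected, today, notify):
    """notify(recipient, message) is a hypothetical transport (email, ticketing, etc.)."""
    overdue = still_detected and today.isoformat() > kpi["remediation_date"]
    if overdue:
        notify(kpi["owner"],
               f"ESCALATION: production certificate {kpi['certificate']} is still "
               f"present in the development environment past {kpi['remediation_date']}")
    return overdue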
The detection engine 350 may receive output from the matching engine 340 that identifies the production certificates in use in the development environment. The detection engine 350 may initiate automatic remediation of the production certificate in the development environment by transmitting certificate data to the automatic remediation agent 355. The automatic remediation agent 355 may work in conjunction with the scan controller 360 to scan for and remove the production certificate from a key store of the client computing device 330 via the key manager client. In an example, the detection engine 350 may receive an indication from the KPI manager 345 that remediation of a production certificate has not been completed by the remediation date and the detection engine 350 may cause the automatic remediation agent 355 to initiate removal of the production certificate from certificate stores of the computing devices of the development environment to limit risk of compromise of the production certificate.
In an example, the detection engine 350 may evaluate inputs received from the matching engine 340 along with telemetry data received from systems operating within the development environment (e.g., the intrusion detection system, etc.) to output a probability of a compromise of the production certificate. For example, the intrusion detection system client executing on the client computing device 330 may provide inputs to the detection engine 350 indicating that a request from an external internet protocol (IP) address successfully accessed a local certificate store that includes the production certificate. The inputs may be evaluated along with the KPI data for the production certificate to calculate a probability that the production certificate has been compromised. For example, the probability may increase if the external IP address is on a blocked list, unknown, etc. and may decrease if the external IP address is associated with a remote location of a developer, etc. The detection engine 350 may use criticality metrics of systems or certificates in determining remediation actions and instructions to trigger for the detection. For example, the detection engine 350 may provide criticality information to the KPI manager 345 and the remediation date may be adjusted to extend or reduce the period of time between the detection date and the remediation date based on the criticality information. The criticality information may be based on an evaluation of telemetry data for the certificate or systems using the certificate that indicates how much a system is used, security controls in force for data accessible using the certificate, etc.
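By way of illustration and not limitation, the probability estimate may be sketched as a simple weighted scoring of telemetry signals; the signal names, weights, and logistic squashing are illustrative assumptions and not the disclosed detection model.

import math

def compromise_probability(signals):
    """signals: telemetry features for one detection, e.g.
    {'cert_store_read_by_external_ip': True, 'ip_on_block_list': False,
     'ip_matches_known_developer_location': True, 'intrusion_alerts': 2}"""
    score = 0.0
    if signals.get("cert_store_read_by_external_ip"):
        score += 2.0   # the certificate store was requested from outside the environment
    if signals.get("ip_on_block_list"):
        score += 3.0   # blocked or unknown sources raise the probability
    if signals.get("ip_matches_known_developer_location"):
        score -= 2.5   # a known developer's remote location lowers the probability
    score += 0.5 * signals.get("intrusion_alerts", 0)
    return 1.0 / (1.0 + math.exp(-score))  # squash the score into a 0..1 probability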
The scan controller 360 (e.g., a scanning and assessment agent), the automatic remediation agent 355, and the KPI manager 345 (e.g., mitigation agents) may be implemented in the development environment. The scan controller 360 may cause (e.g., periodically, on demand, etc.) the client computing device 330 to collect and upload the certificate information to the development certificate inventory 335. The automatic remediation agent 355 and the KPI manager 345 remediate detections of production certificates in the development environment as previously described to prevent the production certificate from being compromised. Mitigation agents may be implemented in the production network to remediate the detections by notifying owners that the production certificate was at risk and providing remediation instructions (or automatically facilitating remediation) to limit risk of attack using a potentially compromised production certificate.
FIG. 4 is a block diagram of an example of agent deployment 400 for mitigation of production secrets in development environments, according to an embodiment. A development environment 405 may include a scanning and assessment agent 415 (e.g., the scan controller 360 as described in FIG. 3, etc.) that may cause a client computing device, or a software application executing on a client computing device, to collect and report contents of a local certificate store to monitor certificate use in the development environment 405 for early detection of security threats. A mitigation agent (e.g., the KPI manager 345 as described in FIG. 3, the automatic remediation agent 355 as described in FIG. 3, etc.) may mitigate the production certificate detected in the development environment to provide early remediation before a threat may extend to a production environment 410. As previously described, the certificate may be removed from the development environment and owners of the certificate and systems utilizing the certificate may be notified with KPIs that include descriptive information such as detection metrics and remediation activities. This may be accomplished, in part, by implementing mitigation agents 425 in the production environment 410 to notify production personnel that the systems have been exposed to risk of a compromised production certificate to identify attackers and prevent lateral spread of an attack. In an example, detections that indicate a high probability of compromise (e.g., a probability greater than 0.90, etc.) may cause automated remediation that may prevent systems from authenticating using the production certificate.
FIG. 5 illustrates a flow diagram of an example of a process 500 for mitigation of production secrets in development environments, according to an embodiment. The process 500 may provide features as described in FIGS. 3 and 4.
Development computing systems may be scanned to obtain certificate information (e.g., from a local certificate store, etc.) for certificates that are used by the computing system (e.g., at operation 505). For example, a scan controller may periodically (e.g., nightly, etc.) schedule a scan of the local certificate stores of development machines operating in a development environment via a security client executing on the development machines. The security client may report, upload, or otherwise transmit the certificate information to a certificate datastore. A development certificate inventory may be generated using the certificate information (e.g., at operation 510). For example, certificate information from the development machines may be aggregated, deduplicated, etc. to generate (and update) the certificate inventory.
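By way of illustration and not limitation, the nightly scheduling behavior may be sketched in Python as follows; the 02:00 scan window and the trigger_scan callable are assumptions that stand in for however the security client is actually invoked on each development machine.

import time
from datetime import datetime, timedelta

def next_run(now, hour=2):
    """Next occurrence of the assumed nightly 02:00 scan window."""
    candidate = now.replace(hour=hour, minute=0, second=0, microsecond=0)
    return candidate if candidate > now else candidate + timedelta(days=1)

def run_scan_schedule(development_hosts, trigger_scan, iterations=1):
    """trigger_scan(host) asks the security client on 'host' to upload its local
    certificate store to the development certificate inventory."""
    for _ in range(iterations):
        wait = (next_run(datetime.now()) - datetime.now()).total_seconds()
        time.sleep(max(0.0, wait))
        for host in development_hosts:
            trigger_scan(host)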
Production certificate metadata and certificate mapping data may be collected (e.g., at operation 515) to assemble a production certificate inventory (e.g., at operation 520) . For example, certificate metadata may be collected from a key vault datastore and from a key manager datastore. The metadata may be evaluated with certificate mapping data from a system configuration datastore to identify certificates that are in use in a production environment.
A certificate in the development certificate inventory may be evaluated (e.g., compared, etc.) against the production certificate inventory (e.g., at operation 525) to determine if the certificate is a production certificate (e.g., at decision 530). If the certificate from the development certificate inventory is determined to be a production certificate (e.g., at decision 530), it may be detected that the production certificate has been compromised (e.g., at operation 540) and key performance indicators (KPIs) may be transmitted to an owner of the certificate and owners of computing systems using the certificate (e.g., at operation 535). The KPIs may be transmitted to owners of computing systems, services, functions, etc. in the development environment and the production environment and may include a variety of KPIs that may be specific to the owner. For example, the certificate owner may receive KPIs that include an identity of the certificate, a deadline for updating the certificate, systems using the certificate, a location of the certificate (e.g., a key store, repository, file location, etc.), and other relevant KPIs that provide the certificate owner with information needed to remediate detection of the certificate in the development environment. In another example, an owner of a system using the certificate may receive KPIs that include the certificate identity, the system using the certificate for which they are the owner, a deadline for remediation, etc.
Detection of the compromised certificate (e.g., at operation 540) may lead to automated remediation instruction generation (e.g., at operation 545). For example, computer executable instructions 550 may be generated to remove the certificate from a local certificate store in which the certificate was detected, to regenerate the certificate (e.g., to generate a new hash, etc.), etc. The instructions 550 may be transmitted to a mitigation agent (e.g., at operation 555) that may execute the mitigation instructions 550 to perform automatic remediations. For example, a certificate removal instruction may schedule removal of the certificate from a certificate store of a development machine via a security application client executing on the development machine. In another example, the instructions may cause a key generator to regenerate the certificate with a new hash value.
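By way of illustration and not limitation, generation of the machine-executable remediation instructions (e.g., operation 545) may be sketched as follows; the action names and instruction fields are assumptions, and the match record is the one produced by the matching step sketched earlier.

def generate_remediation_instructions(match):
    """Return instruction records for the mitigation agent."""
    return [
        {   # remove the certificate from the development machine's local store
            "action": "remove_certificate",
            "thumbprint": match["thumbprint"],
            "target_host": match["development_host"],
        },
        {   # rotate (regenerate) the certificate so any exfiltrated copy stops working
            "action": "rotate_certificate",
            "thumbprint": match["thumbprint"],
            "production_systems": match["production_systems"],
        },
    ]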
Upon completing remediation and KPI notification, or if the certificate is not a production certificate, it may be determined whether there are additional certificates to be evaluated (e.g., at decision 560). If so, additional certificates are evaluated (e.g., at operation 525). If there are no additional certificates to be evaluated, the development certificates continue to be monitored (e.g., at operation 505) and newly discovered certificates may be evaluated (e.g., at operation 525) when they are detected.
FIG. 6 illustrates an example of a method 600 for mitigation of production secrets in development environments, according to an embodiment. The method 600 may provide features as described in FIGS. 3 to 5.
Certificate data may be obtained for a cryptographic certificate in use by a development computing device (e.g., at operation 605). In an example, a certificate data collection request may be transmitted to a key manager application executing on the development computing device and the certificate data may be obtained as output from a scan of the development computing device completed in response to the certificate collection request.
The certificate data may be evaluated using data from a production certificate inventory to determine that the cryptographic certificate is a production cryptographic certificate (e.g., at operation 610) . In an example, certificate metadata may be obtained from a certificate data source. The certificate metadata may be evaluated using production certificate mapping data to identify production certificates in use in a production environment and the production certificate inventory may be generated using the identified production certificates.
Key performance indicators (KPIs) may be collected for the cryptographic certificate for an owner of the production cryptographic certificate (e.g., at operation 615). In an example, the KPIs may include at least one of an identifier of the production cryptographic certificate, an identifier of the development computing device, an identifier of a computing resource accessible by the computing device using the cryptographic certificate, or a remediation completion date for the production cryptographic certificate.
The KPIs may be transmitted to the owner of the production cryptographic certificate (e.g., at operation 620) . In an example, KPIs may be collected for the cryptographic certificate for an owner of the development computing device and the KPIs may be transmitted to the owner of the development computing device. In another example, a production computing device may be identified that is using the production cryptographic certificate using the production certificate inventory. KPIs may be collected for the production cryptographic certificate for an owner of the production computing device and the KPIs may be transmitted to the owner of the production computing device.
The cryptographic certificate may be automatically removed from a certificate store of the development computing device (e.g., at operation 625). In an example, certificate removal instructions may be generated for the cryptographic certificate and the certificate removal instructions may be transmitted to a key manager application executing on the development computing device.
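By way of illustration and not limitation, the operations of the method 600 may be composed into a single routine as sketched below; the injected callables (notify, remove_certificate) and the record fields are assumptions rather than disclosed interfaces.

def mitigate_production_secrets(device_records, production_thumbprints,
                                notify, remove_certificate, remediation_date):
    """device_records: certificate records scanned from one development device
    (operation 605); production_thumbprints: dict of thumbprint -> certificate
    owner drawn from the production certificate inventory."""
    findings = []
    for record in device_records:
        owner = production_thumbprints.get(record["thumbprint"])  # operation 610
        if owner is None:
            continue  # not a production certificate
        kpi = {                                                   # operation 615
            "certificate": record["thumbprint"],
            "development_host": record["host"],
            "remediation_date": remediation_date,
        }
        notify(owner, kpi)                                        # operation 620
        remove_certificate(record["host"], record["thumbprint"])  # operation 625
        findings.append(kpi)
    return findings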
FIG. 7 illustrates a block diagram of an example machine 700 upon which any one or more of the techniques (e.g., methodologies) discussed herein may perform. In alternative embodiments, the machine 700 may operate as a standalone device or may be connected (e.g., networked) to other machines. In a networked deployment, the machine 700 may operate in the capacity of a server machine, a client machine, or both in server-client network environments. In an example, the machine 700 may act as a peer machine in peer-to-peer (P2P) (or other distributed) network environment. The machine 700 may be a personal computer (PC) , a tablet PC, a set-top box (STB) , a personal digital assistant (PDA) , a mobile telephone, a web appliance, a network router, switch or bridge, or any machine capable of executing instructions (sequential or otherwise) that specify actions to be taken by that machine. Further, while only a single machine is illustrated, the term “machine”  shall also be taken to include any collection of machines that individually or jointly execute a set (or multiple sets) of instructions to perform any one or more of the methodologies discussed herein, such as cloud computing, software as a service (SaaS) , other computer cluster configurations.
Examples, as described herein, may include, or may operate by, logic or a number of components, or mechanisms. Circuit sets are a collection of circuits implemented in tangible entities that include hardware (e.g., simple circuits, gates, logic, etc. ) . Circuit set membership may be flexible over time and underlying hardware variability. Circuit sets include members that may, alone or in combination, perform specified operations when operating. In an example, hardware of the circuit set may be immutably designed to carry out a specific operation (e.g., hardwired) . In an example, the hardware of the circuit set may include variably connected physical components (e.g., execution units, transistors, simple circuits, etc. ) including a computer readable medium physically modified (e.g., magnetically, electrically, moveable placement of invariant massed particles, etc. ) to encode instructions of the specific operation. In connecting the physical components, the underlying electrical properties of a hardware constituent are changed, for example, from an insulator to a conductor or vice versa. The instructions enable embedded hardware (e.g., the execution units or a loading mechanism) to create members of the circuit set in hardware via the variable connections to carry out portions of the specific operation when in operation. Accordingly, the computer readable medium is communicatively coupled to the other components of the circuit set member when the device is operating. In an example, any of the physical components may be used in more than one member of more than one circuit set. For example, under operation, execution units may be used in a first circuit of a first circuit set at one point in time and reused by a second circuit in the first circuit set, or by a third circuit in a second circuit set at a different time.
Machine (e.g., computer system) 700 may include a hardware processor 702 (e.g., a central processing unit (CPU) , a graphics processing unit (GPU) , a hardware processor core, or any combination thereof) , a main memory 704 and a static memory 706, some or all of which may communicate with each other  via an interlink (e.g., bus) 708. The machine 700 may further include a display unit 710, an alphanumeric input device 712 (e.g., a keyboard) , and a user interface (UI) navigation device 714 (e.g., a mouse) . In an example, the display unit 710, input device 712 and UI navigation device 714 may be a touch screen display. The machine 700 may additionally include a storage device (e.g., drive unit) 716, a signal generation device 718 (e.g., a speaker) , a network interface device 720, and one or more sensors 721, such as a global positioning system (GPS) sensor, compass, accelerometer, or other sensors. The machine 700 may include an output controller 728, such as a serial (e.g., universal serial bus (USB) , parallel, or other wired or wireless (e.g., infrared (IR) , near field communication (NFC) , etc. ) connection to communicate or control one or more peripheral devices (e.g., a printer, card reader, etc. ) .
The storage device 716 may include a machine readable medium 722 on which is stored one or more sets of data structures or instructions 724 (e.g., software) embodying or utilized by any one or more of the techniques or functions described herein. The instructions 724 may also reside, completely or at least partially, within the main memory 704, within static memory 706, or within the hardware processor 702 during execution thereof by the machine 700. In an example, one or any combination of the hardware processor 702, the main memory 704, the static memory 706, or the storage device 716 may constitute machine readable media.
While the machine readable medium 722 is illustrated as a single medium, the term "machine readable medium" may include a single medium or multiple media (e.g., a centralized or distributed database, and/or associated caches and servers) configured to store the one or more instructions 724.
The term “machine readable medium” may include any medium that is capable of storing, encoding, or carrying instructions for execution by the machine 700 and that cause the machine 700 to perform any one or more of the techniques of the present disclosure, or that is capable of storing, encoding or carrying data structures used by or associated with such instructions. Non-limiting machine readable medium examples may include solid-state memories, and optical  and magnetic media. In an example, machine readable media may exclude transitory propagating signals (e.g., non-transitory machine-readable storage media) . Specific examples of non-transitory machine-readable storage media may include: non-volatile memory, such as semiconductor memory devices (e.g., Electrically Programmable Read-Only Memory (EPROM) , Electrically Erasable Programmable Read-Only Memory (EEPROM) ) and flash memory devices; magnetic disks, such as internal hard disks and removable disks; magneto-optical disks; and CD-ROM and DVD-ROM disks.
The instructions 724 may further be transmitted or received over a communications network 726 using a transmission medium via the network interface device 720 utilizing any one of a number of transfer protocols (e.g., frame relay, internet protocol (IP), transmission control protocol (TCP), user datagram protocol (UDP), hypertext transfer protocol (HTTP), etc.). Example communication networks may include a local area network (LAN), a wide area network (WAN), a packet data network (e.g., the Internet), mobile telephone networks (e.g., cellular networks), Plain Old Telephone (POTS) networks, and wireless data networks (e.g., Institute of Electrical and Electronics Engineers (IEEE) 802.11 family of standards known as Wi-Fi®, LPWAN standards, etc.), IEEE 802.15.4 family of standards, peer-to-peer (P2P) networks, 3rd Generation Partnership Project (3GPP) standards for 4G and 5G wireless communication including: 3GPP Long-Term evolution (LTE) family of standards, 3GPP LTE Advanced family of standards, 3GPP LTE Advanced Pro family of standards, 3GPP New Radio (NR) family of standards, among others. In an example, the network interface device 720 may include one or more physical jacks (e.g., Ethernet, coaxial, or phone jacks) or one or more antennas to connect to the communications network 726. In an example, the network interface device 720 may include a plurality of antennas to wirelessly communicate using at least one of single-input multiple-output (SIMO), multiple-input multiple-output (MIMO), or multiple-input single-output (MISO) techniques. The term “transmission medium” shall be taken to include any intangible medium that is capable of storing, encoding or carrying instructions for execution by the machine 700, and includes digital or analog communications signals or other intangible medium to facilitate communication of such software.
Additional Notes & Examples
Example 1 is a system for mitigation of production certificates detected in a development environment comprising: at least one processor; and memory including instructions that, when executed by the at least one processor, cause the at least one processor to perform operations to: obtain certificate data for a cryptographic certificate in use by the development computing device; evaluate the certificate data using data from a production certificate inventory to determine that the cryptographic certificate is a production cryptographic certificate; collect key performance indicators (KPIs) for the cryptographic certificate for an owner of the production cryptographic certificate; transmit the KPIs to the owner of the production cryptographic certificate; and automatically remove the cryptographic certificate from a certificate store of the development computing device.
In Example 2, the subject matter of Example 1 includes, the memory further comprising instructions that, when executed by the at least one processor, cause the at least one processor to perform operations to: transmit a certificate data collection request to a key manager application executing on the development computing device; and obtain the certificate data as output from a scan of the development computing device completed in response to the certificate collection request.
In Example 3, the subject matter of Examples 1–2 includes, the memory further comprising instructions that, when executed by the at least one processor, cause the at least one processor to perform operations to: obtain certificate metadata from a certificate data source; evaluate the certificate metadata using production certificate mapping data to identify production certificates in use in a production environment; and generate the production certificate inventory using the identified production certificates.
In Example 4, the subject matter of Examples 1–3 includes, wherein the KPIs include at least one of an identifier of the production cryptographic certificate, an identifier of the development computing device, an identifier of a computing resource accessible by the computing device using the cryptographic certificate, or a remediation completion date for the production cryptographic certificate.
In Example 5, the subject matter of Examples 1–4 includes, the memory further comprising instructions that, when executed by the at least one processor, cause the at least one processor to perform operations to: collect KPIs for the cryptographic certificate for an owner of the development computing device; and transmit the KPIs to the owner of the development computing device.
In Example 6, the subject matter of Examples 1–5 includes, the memory further comprising instructions that, when executed by the at least one processor, cause the at least one processor to perform operations to: identify a production computing device using the production cryptographic certificate using the production certificate inventory; collect KPIs for the production cryptographic certificate for an owner of the production computing device; and transmit the KPIs to the owner of the production computing device.
In Example 7, the subject matter of Examples 1–6 includes, the instructions to automatically remove the cryptographic certificate from the certificate store of the development computing device further comprising instructions that, when executed by the at least one processor, cause the at least one processor to perform operations to: generate certificate removal instructions for the cryptographic certificate; and transmit the certificate removal instructions to a key manager application executing on the development computing device.
Example 8 is a method for mitigation of production certificates detected in a development environment comprising: obtaining certificate data for a cryptographic certificate in use by the development computing device; evaluating the certificate data using data from a production certificate inventory to determine that the cryptographic certificate is a production cryptographic certificate; collecting key performance indicators (KPIs) for the cryptographic certificate for an owner of the production cryptographic certificate; transmitting the KPIs to the owner of the production cryptographic certificate; and automatically removing the cryptographic certificate from a certificate store of the development computing device.
In Example 9, the subject matter of Example 8 includes, transmitting a certificate data collection request to a key manager application executing on the development computing device; and obtaining the certificate data as output from a scan of the development computing device completed in response to the certificate collection request.
In Example 10, the subject matter of Examples 8–9 includes, obtaining certificate metadata from a certificate data source; evaluating the certificate metadata using production certificate mapping data to identify production certificates in use in a production environment; and generating the production certificate inventory using the identified production certificates.
In Example 11, the subject matter of Examples 8–10 includes, wherein the KPIs include at least one of an identifier of the production cryptographic certificate, an identifier of the development computing device, an identifier of a computing resource accessible by the computing device using the cryptographic certificate, or a remediation completion date for the production cryptographic certificate.
In Example 12, the subject matter of Examples 8–11 includes, collecting KPIs for the cryptographic certificate for an owner of the development computing device; and transmitting the KPIs to the owner of the development computing device.
In Example 13, the subject matter of Examples 8–12 includes, identifying a production computing device using the production cryptographic certificate using the production certificate inventory; collecting KPIs for the production cryptographic certificate for an owner of the production computing device; and transmitting the KPIs to the owner of the production computing device.
In Example 14, the subject matter of Examples 8–13 includes, automatically removing the cryptographic certificate from the certificate store of the development computing device further comprising: generating certificate removal instructions for the cryptographic certificate; and transmitting the certificate removal instructions to a key manager application executing on the development computing device.
Example 15 is at least one machine-readable medium including instructions that, when executed by a machine, cause the machine to perform any method of Examples 8–14.
Example 16 is a system comprising means to perform any method of Examples 8–14.
Example 17 is at least one non-transitory machine-readable medium including instructions for mitigation of production certificates detected in a development environment that, when executed by at least one processor, cause the at least one processor to perform operations to: obtain certificate data for a cryptographic certificate in use by the development computing device; evaluate the certificate data using data from a production certificate inventory to determine that the cryptographic certificate is a production cryptographic certificate; collect key performance indicators (KPIs) for the cryptographic certificate for an owner of the production cryptographic certificate; transmit the KPIs to the owner of the production cryptographic certificate; and automatically remove the cryptographic certificate from a certificate store of the development computing device.
In Example 18, the subject matter of Example 17 includes, instructions that, when executed by the at least one processor, cause the at least one processor to perform operations to: transmit a certificate data collection request to a key manager application executing on the development computing device; and obtain the certificate data as output from a scan of the development computing device completed in response to the certificate collection request.
In Example 19, the subject matter of Examples 17–18 includes, instructions that, when executed by the at least one processor, cause the at least one processor to perform operations to: obtain certificate metadata from a certificate data source; evaluate the certificate metadata using production certificate mapping data to identify production certificates in use in a production environment; and generate the production certificate inventory using the identified production certificates.
In Example 20, the subject matter of Examples 17–19 includes, wherein the KPIs include at least one of an identifier of the production cryptographic certificate, an identifier of the development computing device, an identifier of a computing resource accessible by the computing device using the cryptographic certificate, or a remediation completion date for the production cryptographic certificate.
In Example 21, the subject matter of Examples 17–20 includes, instructions that, when executed by the at least one processor, cause the at least one processor to perform operations to: collect KPIs for the cryptographic certificate for an owner of the development computing device; and transmit the KPIs to the owner of the development computing device.
In Example 22, the subject matter of Examples 17–21 includes, instructions that, when executed by the at least one processor, cause the at least one processor to perform operations to: identify a production computing device using the production cryptographic certificate using the production certificate inventory; collect KPIs for the production cryptographic certificate for an owner of the production computing device; and transmit the KPIs to the owner of the production computing device.
In Example 23, the subject matter of Examples 17–22 includes, the instructions to automatically remove the cryptographic certificate from the certificate store of the development computing device further comprising instructions that, when executed by the at least one processor, cause the at least one processor to perform operations to: generate certificate removal instructions for the cryptographic certificate; and transmit the certificate removal instructions to a key manager application executing on the development computing device.
Example 24 is at least one machine-readable medium including instructions that, when executed by processing circuitry, cause the processing circuitry to perform operations to implement of any of Examples 1–23.
Example 25 is an apparatus comprising means to implement of any of Examples 1–23.
Example 26 is a system to implement of any of Examples 1–23.
Example 27 is a method to implement of any of Examples 1–23.
The above detailed description includes references to the accompanying drawings, which form a part of the detailed description. The  drawings show, by way of illustration, specific embodiments that may be practiced. These embodiments are also referred to herein as “examples. ” Such examples may include elements in addition to those shown or described. However, the present inventors also contemplate examples in which only those elements shown or described are provided. Moreover, the present inventors also contemplate examples using any combination or permutation of those elements shown or described (or one or more aspects thereof) , either with respect to a particular example (or one or more aspects thereof) , or with respect to other examples (or one or more aspects thereof) shown or described herein.
All publications, patents, and patent documents referred to in this document are incorporated by reference herein in their entirety, as though individually incorporated by reference. In the event of inconsistent usages between this document and those documents so incorporated by reference, the usage in the incorporated reference (s) should be considered supplementary to that of this document; for irreconcilable inconsistencies, the usage in this document controls.
In this document, the terms “a” or “an” are used, as is common in patent documents, to include one or more than one, independent of any other instances or usages of “at least one” or “one or more. ” In this document, the term “or” is used to refer to a nonexclusive or, such that “A or B” includes “A but not B, ” “B but not A, ” and “A and B, ” unless otherwise indicated. In the appended claims, the terms “including” and “in which” are used as the plain-English equivalents of the respective terms “comprising” and “wherein. ” Also, in the following claims, the terms “including” and “comprising” are open-ended, that is, a system, device, article, or process that includes elements in addition to those listed after such a term in a claim are still deemed to fall within the scope of that claim. Moreover, in the following claims, the terms “first, ” “second, ” and “third, ” etc. are used merely as labels, and are not intended to impose numerical requirements on their objects.
The above description is intended to be illustrative, and not restrictive. For example, the above-described examples (or one or more aspects thereof) may be used in combination with each other. Other embodiments may be used, such as by one of ordinary skill in the art upon reviewing the above  description. The Abstract is to allow the reader to quickly ascertain the nature of the technical disclosure and is submitted with the understanding that it will not be used to interpret or limit the scope or meaning of the claims. Also, in the above Detailed Description, various features may be grouped together to streamline the disclosure. This should not be interpreted as intending that an unclaimed disclosed feature is essential to any claim. Rather, inventive subject matter may lie in less than all features of a particular disclosed embodiment. Thus, the following claims are hereby incorporated into the Detailed Description, with each claim standing on its own as a separate embodiment. The scope of the embodiments should be determined with reference to the appended claims, along with the full scope of equivalents to which such claims are entitled.

Claims (15)

  1. A system for mitigation of production certificates detected in a development environment comprising:
    at least one processor; and
    memory including instructions that, when executed by the at least one processor, cause the at least one processor to perform operations to:
    obtain certificate data for a cryptographic certificate in use by the development computing device;
    evaluate the certificate data using data from a production certificate inventory to determine that the cryptographic certificate is a production cryptographic certificate;
    collect key performance indicators (KPIs) for the cryptographic certificate for an owner of the production cryptographic certificate;
    transmit the KPIs to the owner of the production cryptographic certificate, the KPIs used by the owner to identify usage instances of the production certificate; and
    automatically remove the cryptographic certificate from a certificate store of the development computing device.
  2. The system of claim 1, the instructions to obtain certificate data further comprising instructions that, when executed by the at least one processor, cause the at least one processor to perform operations to:
    transmit a certificate data collection request to a key manager application executing on the development computing device; and
    obtain the certificate data as output from a scan of the development computing device completed in response to the certificate collection request.
  3. The system of claim 1, the memory further comprising instructions that, when executed by the at least one processor, cause the at least one processor to perform operations to:
    obtain certificate metadata from a certificate data source;
    evaluate the certificate metadata using production certificate mapping data to identify production certificates in use in a production environment; and
    generate the production certificate inventory using the identified production certificates.
  4. The system of claim 1, wherein the KPIs include at least one of an identifier of the production cryptographic certificate, an identifier of the development computing device, an identifier of a computing resource accessible by the computing device using the cryptographic certificate, or a remediation completion date for the production cryptographic certificate.
  5. The system of claim 1, the memory further comprising instructions that, when executed by the at least one processor, cause the at least one processor to perform operations to:
    collect KPIs for the cryptographic certificate for an owner of the development computing device; and
    transmit the KPIs to the owner of the development computing device.
  6. The system of claim 1, the memory further comprising instructions that, when executed by the at least one processor, cause the at least one processor to perform operations to:
    identify a production computing device using the production cryptographic certificate using the production certificate inventory;
    collect KPIs for the production cryptographic certificate for an owner of the production computing device; and
    transmit the KPIs to the owner of the production computing device.
  7. The system of claim 1, the instructions to automatically remove the cryptographic certificate from the certificate store of the development computing device further comprising instructions that, when executed by the at least one processor, cause the at least one processor to perform operations to:
    generate certificate removal instructions for the cryptographic certificate; and
    transmit the certificate removal instructions to a key manager application executing on the development computing device.
  8. A method for mitigation of production certificates detected in a development environment comprising:
    obtaining certificate data for a cryptographic certificate in use by the development computing device;
    evaluating the certificate data using data from a production certificate inventory to determine that the cryptographic certificate is a production cryptographic certificate;
    collecting key performance indicators (KPIs) for the cryptographic certificate for an owner of the production cryptographic certificate;
    transmitting the KPIs to the owner of the production cryptographic certificate; and
    automatically removing the cryptographic certificate from a certificate store of the development computing device.
  9. The method of claim 8, wherein obtaining the certificate data further comprises:
    transmitting a certificate data collection request to a key manager application executing on the development computing device; and
    obtaining the certificate data as output from a scan of the development computing device completed in response to the certificate collection request.
  10. The method of claim 8, further comprising:
    obtaining certificate metadata from a certificate data source;
    evaluating the certificate metadata using production certificate mapping data to identify production certificates in use in a production environment; and
    generating the production certificate inventory using the identified production certificates.
  11. The method of claim 8, wherein the KPIs include at least one of an identifier of the production cryptographic certificate, an identifier of the development computing device, an identifier of a computing resource accessible by the computing device using the cryptographic certificate, or a remediation completion date for the production cryptographic certificate.
  12. The method of claim 8, further comprising:
    collecting KPIs for the cryptographic certificate for an owner of the development computing device; and
    transmitting the KPIs to the owner of the development computing device.
  13. The method of claim 8, further comprising:
    identifying a production computing device using the production cryptographic certificate using the production certificate inventory;
    collecting KPIs for the production cryptographic certificate for an owner of the production computing device; and
    transmitting the KPIs to the owner of the production computing device.
  14. The method of claim 8, wherein automatically removing the cryptographic certificate from the certificate store of the development computing device further comprises:
    generating certificate removal instructions for the cryptographic certificate; and
    transmitting the certificate removal instructions to a key manager application executing on the development computing device.
  15. At least one machine-readable medium including instructions that, when executed by a machine, cause the machine to perform any method of claims 8–14.
PCT/CN2022/098561 2022-06-14 2022-06-14 Mitigation of production secrets in development environments WO2023240428A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202280066598.7A CN118056379A (en) 2022-06-14 2022-06-14 Mitigation of production secrets in a development environment
PCT/CN2022/098561 WO2023240428A1 (en) 2022-06-14 2022-06-14 Mitigation of production secrets in development environments

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2022/098561 WO2023240428A1 (en) 2022-06-14 2022-06-14 Mitigation of production secrets in development environments

Publications (1)

Publication Number Publication Date
WO2023240428A1 true WO2023240428A1 (en) 2023-12-21

Family

ID=82547217

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/098561 WO2023240428A1 (en) 2022-06-14 2022-06-14 Mitigation of production secrets in development environments

Country Status (2)

Country Link
CN (1) CN118056379A (en)
WO (1) WO2023240428A1 (en)

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2942900A1 (en) * 2013-01-21 2015-11-11 Huawei Technologies Co., Ltd. Method, device, and system for improving network security
US20180097803A1 (en) * 2016-09-30 2018-04-05 Microsoft Technology Licensing, Llc. Detecting malicious usage of certificates
US20200244455A1 (en) * 2019-01-29 2020-07-30 CELLAR DOOR MEDIA, LLC dba LOCKR Api and encryption key secrets management system and method

Also Published As

Publication number Publication date
CN118056379A (en) 2024-05-17


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22741430

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 202280066598.7

Country of ref document: CN