WO2023121521A1 - Methods and devices for supporting anomaly detection

Methods and devices for supporting anomaly detection

Info

Publication number
WO2023121521A1
Authority
WO
WIPO (PCT)
Prior art keywords
secret
noise
data stream
packets
obfuscating
Prior art date
Application number
PCT/SE2021/051295
Other languages
French (fr)
Inventor
János KÖVÉR
Jonathan Olsson
Ikram Ullah
Original Assignee
Telefonaktiebolaget Lm Ericsson (Publ)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Telefonaktiebolaget Lm Ericsson (Publ) filed Critical Telefonaktiebolaget Lm Ericsson (Publ)
Priority to PCT/SE2021/051295 priority Critical patent/WO2023121521A1/en
Publication of WO2023121521A1 publication Critical patent/WO2023121521A1/en

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L63/00 Network architectures or network communication protocols for network security
    • H04L63/14 Network architectures or network communication protocols for network security for detecting or protecting against malicious traffic
    • H04L63/1408 Network architectures or network communication protocols for network security for detecting or protecting against malicious traffic by monitoring network traffic
    • H04L63/1425 Traffic logging, e.g. anomaly detection
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L9/00 Cryptographic mechanisms or cryptographic arrangements for secret or secure communications; Network security protocols
    • H04L9/08 Key distribution or management, e.g. generation, sharing or updating, of cryptographic keys or passwords
    • H04L9/0894 Escrow, recovery or storing of secret information, e.g. secret key escrow or cryptographic key storage
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L9/00 Cryptographic mechanisms or cryptographic arrangements for secret or secure communications; Network security protocols
    • H04L9/40 Network security protocols
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L2209/00 Additional information or applications relating to cryptographic mechanisms or cryptographic arrangements for secret or secure communication H04L9/00
    • H04L2209/16 Obfuscation or hiding, e.g. involving white box
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L63/00 Network architectures or network communication protocols for network security
    • H04L63/04 Network architectures or network communication protocols for network security for providing a confidential data exchange among entities communicating through data packet networks
    • H04L63/0428 Network architectures or network communication protocols for network security for providing a confidential data exchange among entities communicating through data packet networks wherein the data content is protected, e.g. by encrypting or encapsulating the payload

Definitions

  • the invention relates to a first device and a method for supporting anomaly detection, a second device and a method for obfuscating a data stream, and corresponding computer program and computer program product.
  • Sensitive network communication may be protected by encryption schemes. Encryption schemes make it infeasible for an eavesdropper to reconstruct sensitive information from encrypted data. Some encryption schemes protect application payload on top of a transport mechanism, such as Hypertext Transfer Protocol (HTTP) protected by Transport Layer Security (TLS) on Transmission Control Protocol (TCP). Other examples of communication protocols providing data privacy on different network layers are Datagram Transport Layer Security (DTLS), IPsec, and MACsec.
  • HTTP Hypertext Transfer Protocol
  • TLS Transport Layer Security
  • TCP Transmission Control Protocol
  • DTLS Datagram Transport Layer Security
  • IPsec Internet Protocol Security
  • MACsec Media Access Control Security
  • TFC Traffic Flow Confidentiality
  • ESP IP Encapsulating Security Payload
  • NIDS Network intrusion detection systems
  • NTA Network Traffic Analysis
  • UEBA User and Entity Behavior Analytics
  • the solution disclosed in this document seeks to preferably mitigate, alleviate, or eliminate one or more of the disadvantages mentioned above singly or in any combination.
  • a method performed by a first device for supporting anomaly detection in an obfuscated data stream comprises obtaining a noise secret used for obfuscating at least some statistical characteristics of a received data stream.
  • the method comprises obtaining a representation of at least part of the received data stream.
  • the method comprises reconstructing, by using the noise secret, at least some of the obfuscated statistical characteristics.
  • the method comprises providing information indicative of the reconstructed statistical characteristics for performing anomaly detection. This allows traffic obfuscation solutions as well as anomaly detection tools and/or traffic analysis tools to work efficiently.
  • end-to-end privacy of the communication is not broken, i.e., the data stream is not decrypted to reconstruct some of the statistical characteristics.
  • a method performed by a second device for obfuscating a data stream comprises obtaining a noise secret; and obfuscating at least some statistical characteristics of the data stream using the noise secret.
  • a first device for supporting anomaly detection in an obfuscated data stream.
  • the first device comprises a processor and a memory, the memory having stored thereon instructions executable by the processor.
  • the instructions when executed by the processor, cause the first device to: obtain a noise secret used for obfuscating at least some statistical characteristics of a received data stream.
  • the first device is also operative to obtain a representation of at least part of the received data stream.
  • the first device is operative to reconstruct, by using the noise secret, at least some of the obfuscated statistical characteristics.
  • the first device is further operative to provide information indicative of the reconstructed statistical characteristics for performing anomaly detection.
  • a second device for obfuscating a data stream.
  • the second device comprises a processor and a memory, the memory having stored thereon instructions executable by the processor.
  • the instructions when executed by the processor, cause the second device to: obtain a noise secret; and obfuscate at least some statistical characteristics of the data stream using the noise secret.
  • a computer program comprising instructions which, when run in a processing unit on a first device, cause the first device to: obtain a noise secret used for obfuscating at least some statistical characteristics of a received data stream; obtain a representation of at least part of the received data stream; reconstruct by using the noise secret, at least some of the obfuscated statistical characteristics; provide information indicative of the reconstructed statistical characteristics for performing anomaly detection.
  • a computer program product comprising a computer readable storage medium on which a computer program, as mentioned above, is stored.
  • a computer program comprising instructions which, when run in a processing unit on a second device, cause the second device to obtain a noise secret; obfuscate at least some statistical characteristics of the data stream using the noise secret.
  • a computer program product comprising a computer readable storage medium on which a computer program, as mentioned above, is stored.
  • the representation is a copy of at least part of the received data stream.
  • the representation is processed data comprising information about the data stream.
  • packets in the received data stream are obfuscated by generating dummy packets using the noise secret.
  • the generated dummy packets comprise information to identify the dummy packets using the noise secret.
  • the method comprises identifying the dummy packets by decoding the information to identify the dummy packet using the noise secret. The method comprises discarding the dummy packet.
  • the instructions when executed by the processor, cause the first device to identify the dummy packets by decoding the information to identify the dummy packet using the noise secret; discard the dummy packets.
  • the method comprises re-ordering packets of the at least part of the received data stream.
  • the instructions when executed by the processor, cause the first device to re-order packets of the at least part of the received data stream.
  • the method comprises identifying the dummy packets by using the noise secret and order of the packets; and discarding the dummy packets.
  • the instructions when executed by the processor, cause the first device to identify the dummy packets by using the noise secret and order of the packets; and discard the dummy packets.
  • payload in the obfuscated data stream is encrypted.
  • the noise secret is obtained from a central node.
  • the noise secret is hard coded in the first device.
  • the noise secret is obtained from a second device for obfuscating a data stream.
  • the method comprises generating the noise secret; transmitting the noise secret to a second device for obfuscating a data stream.
  • the instructions when executed by the processor, cause the first device to generate the noise secret; transmit the noise secret to a second device for obfuscating a data stream.
  • the obtained noise secret is encrypted.
  • the method comprises decrypting the encrypted noise secret.
  • the instructions when executed by the processor, cause the first device to decrypt the encrypted noise secret.
  • the instructions when executed by the processor, cause the second device to decrypt the encrypted noise secret.
  • the operation of obfuscating comprises altering a size of a packet of the data stream using the noise secret.
  • the operation of obfuscating comprises generating dummy packets using the noise secret.
  • the operation of obfuscating comprises encoding information in a dummy packet for identifying the dummy packet using the noise secret.
  • the second device is a source node transmitting the data stream.
  • the noise secret is hard coded in the second device.
  • the noise secret is obtained from a first device for supporting anomaly detection.
  • the method comprises generating the noise secret.
  • the instructions when executed by the processor, cause the second device to generate the noise secret.
  • Figure 1 shows an example system according to an embodiment
  • Figure 2 shows a flowchart illustrating a method performed by a first device according to embodiments
  • Figure 3 shows a flowchart illustrating a method performed by a second device according to embodiments
  • Figure 4a shows a first example of obfuscation technique according to embodiments
  • Figure 4b shows a first example of de-obfuscation technique according to embodiments if the first example of obfuscation technique has been used;
  • Figure 5a shows a second example of obfuscation technique according to embodiments
  • Figure 5b shows a second example of de-obfuscation technique according to embodiments if the second example of obfuscation technique has been used;
  • Figure 6a shows a third example of obfuscation technique according to embodiments
  • Figure 6b shows a third example of de-obfuscation technique according to embodiments if the third example of obfuscation technique has been used;
  • Figure 7a shows a first example scenario in which an invention according to embodiments may be practiced
  • Figure 7b shows a second example scenario in which an invention according to embodiments may be practiced
  • Figure 8 is a block diagram depicting a first device according to an embodiment
  • Figure 9 is a block diagram depicting units of a first device according to an embodiment
  • Figure 10 is a block diagram depicting a second device according to an embodiment.
  • Figure 11 is a block diagram depicting units of a second device according to an embodiment.
  • VoIP voice over IP
  • UDP user datagram protocol
  • Obfuscation techniques, protocols, and/or extensions are adopted for protecting network communication against statistical traffic analysis, especially in case of use of open and public networks, by masking or altering all or some statistical characteristics of a data stream.
  • NIDS Network Intrusion Detection System
  • the solution to be disclosed allows reconstructing statistical characteristics of a data stream that had been previously obfuscated using a noise secret.
  • An intermediary trusted third party with the knowledge of the noise secret that has been used to obfuscate some of the statistical characteristics, is able to reconstruct at least some of the obfuscated statistical characteristics.
  • the reconstructed statistical characteristics may then be used for statistical traffic analysis and anomaly detection.
  • This solution does not break end-to-end privacy of the communication, i.e., the data stream is not decrypted to reconstruct some of the statistical characteristics.
  • the intermediary trusted third party such as a legitimate monitoring and analysis node, is able to reconstruct the statistical characteristics by using the knowledge of the noise secret.
  • Other non-trusted entities monitoring the traffic such as eavesdropping adversaries, may capture the data stream, but they are not able to reconstruct the statistical characteristics because these are obfuscated.
  • a legitimate receiver node of the data stream is able to correctly process the data stream without knowledge of the noise secret. For example, if the obfuscation altered a size of a packet, since the legitimate receiver is able to decrypt the packet, the legitimate receiver may trim excess data from the packet using packet size information obtained from an upper layer protocol, after decryption.
  • the solution to be disclosed allows traffic obfuscation solutions as well as anomaly detection tools and/or traffic analysis tools to work efficiently. Moreover, the solution may allow a service provider to control who can perform traffic analysis, protecting the business value of usage data.
  • Figure 1 shows an example scenario in which the invention may be practiced.
  • Figure 1 shows a system comprising a first device 104, a legitimate analysis node 105, a transmitting node 101, a second device 102, a legitimate receiving node 103, and an adversary node 107.
  • the transmitting node 101 is a node that wants to transmit a data stream to the receiving node 103 in a secure way via a data network, e.g., Internet Protocol (IP).
  • IP Internet Protocol
  • the transmitting node may be a Network Function, Virtual Network Function, microservice, sensor, a Machine Type Communication (MTC) device, Machine-to-Machine (M2M), Internet of Things (IoT) device, user device, vehicle, router, gateway, and any device with computing, storage, and network connectivity.
  • MTC Machine Type Communication
  • M2M Machine-to-Machine
  • IoT Internet of Things
  • the transmitting node 101 transmits its data stream to the second device 102 to obfuscate at least some statistical characteristics of the data stream using a noise secret; the data stream obtained after obfuscation is called the obfuscated data stream.
  • modifying (or obfuscating) one or several statistical characteristics of a data stream is a reversible process based on a noise secret.
  • the noise secret may comprise multiple values, parameters, obfuscation method, and any other information necessary to produce and then remove the obfuscation.
  • Examples of noise secrets are any values that serve as key for schemes based on keyed hash message authentication code, symmetric key, or asymmetric key; and parameters of a pseudorandom number generator (PRNG) such as seed, number of elements to be discarded starting from the beginning of a random number sequence generated from a certain seed, or offset or modulo to be applied on each element of the random number sequence.
  • PRNG pseudorandom number generator
  • the obfuscation process may be reversed by using the noise secret, as sketched in the illustrative example below.
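
As a purely illustrative aside (not part of the patent text), the following Python sketch shows one way the PRNG-related parameters listed above (seed, number of initial values to discard, modulo) could be turned into a reproducible noise sequence shared by the obfuscating and de-obfuscating sides; all concrete values and names are hypothetical.

```python
import random

# Hypothetical noise secret: a PRNG seed, a count of initial values to
# discard, and a modulo applied to each generated value.
noise_secret = {"seed": 0x5EC12E7, "discard": 16, "modulo": 64}

def noise_sequence(secret, count):
    """Reproduce the same pseudorandom noise values on both sides."""
    prng = random.Random(secret["seed"])
    for _ in range(secret["discard"]):  # skip the agreed number of values
        prng.random()
    return [prng.randrange(2**32) % secret["modulo"] for _ in range(count)]

# The obfuscator and the de-obfuscator derive identical sequences from the secret.
assert noise_sequence(noise_secret, 5) == noise_sequence(noise_secret, 5)
```
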
  • the second device 102 may be a router, gateway, middlebox, and any device with computing, storage, and network connectivity.
  • the transmitting node 101 and the second device 102 may be implemented on a same node 111. In an alternative embodiment, the transmitting node 101 and the second device 102 may be implemented on separate nodes.
  • the first device 104 is a node authorized to obtain a representation of at least part of the obfuscated data stream.
  • the first device 104 is in possession of the noise secret and is able to reconstruct, using the noise secret, at least some of the obfuscated statistical characteristics.
  • the statistical characteristics are reconstructed as before being obfuscated.
  • the first device 104 may then use information indicative of the reconstructed statistical characteristics for performing anomaly detection.
  • the first device 104 may alternatively transmit the information indicative of the reconstructed statistical characteristics to a further node, such as the legitimate analysis node 105 for performing anomaly detection.
  • the first device 104 may be a router, gateway, and any device with computing, storage, and network connectivity.
  • the first device 104 may directly collect a representation of the data stream or obtain a representation from a traffic collection point such as a router, switch, probe, bump-in-the-wire device, or host software agent able to collect data streams via port mirroring, network tap or any information exporting capability.
  • the legitimate analysis node 105, intermediary trusted third party, or trusted node is a node authorized to perform anomaly detection on the data stream.
  • the legitimate analysis node 105 may be a router, gateway, and any device with computing, storage, and network connectivity running an anomaly detection tool and/or traffic analysis tool. Examples of anomaly detection tools and traffic analysis tools are NIDS based on statistical analysis, Network Traffic Analysis (NTA), User and Entity Behavior Analytics (UEBA) tools.
  • NTA Network Traffic Analysis
  • UEBA User and Entity Behavior Analytics
  • the legitimate analysis node 105 and the first device 104 may be implemented on a same node 121. In an alternative embodiment, the legitimate analysis node 105 and the first device 104 may be implemented on separate nodes.
  • the adversary node 107 is a node that attempts to access the data stream without authorization.
  • the adversary node 107 may be a router, gateway, a software code operating on an otherwise legitimate router/switch, and any device with computing, storage, and network connectivity.
  • the adversary node 107 obtaining a representation of at least part of the transmitted data stream with at least some obfuscated statistical characteristics, may not be able to reconstruct at least some of the obfuscated statistical characteristics as before being obfuscated.
  • the receiving node 103 is a legitimate receiver of the data traffic sent by the transmitting node 101.
  • the receiving node 103 is able to correctly process the obfuscated data stream without knowledge of the noise secret. In other words, the receiving node 103 is not in possession of the noise secret. For example, if the obfuscation altered the size of a packet, the receiver node 103 may trim excess data from the packet using packet size information obtained after decryption from an upper layer protocol, if available. If the obfuscation altered traffic pattern statistics by generating dummy packets, the receiver node 103 may identify the dummy packets when decrypting a received packet.
  • the receiving node 103 may be a Network Function, Virtual Network Function, microservice, sensor, a Machine Type Communication (MTC) device, Machine-to-Machine (M2M), Internet of Things (IoT) device, user device, user equipment, compute node, cloud service, vehicle, router, gateway, and any device with computing, storage, and network connectivity.
  • MTC Machine Type Communication
  • M2M Machine-to-Machine
  • IoT Internet of Things
  • First device 104, legitimate analysis node 105, transmitting node 101, second device 102, and receiving node 103 may communicate with each other through a subscription protocol, such as the message queuing telemetry transport (MQTT) protocol, or by utilizing any one of a number of transfer protocols (e.g., Ethernet, frame relay, IP, transmission control protocol (TCP), UDP, hypertext transfer protocol (HTTP), HTTP/2) and Remote Procedure Call (RPC) protocols, such as Google Remote Procedure Call (gRPC), ensuring security requirements by using Transport Layer Security (TLS).
  • MQTT message queuing telemetry transport
  • TCP transmission control protocol
  • UDP user datagram protocol
  • HTTP hypertext transfer protocol
  • RPC Remote Procedure Call
  • gRPC Google Remote Procedure Call
  • Figure 2 shows a method for supporting anomaly detection in an obfuscated data stream. The method is carried out by a first device 104.
  • the method comprises obtaining a noise secret used for obfuscating at least some statistical characteristics of a data stream.
  • the obtained noise secret may be encrypted and the method may comprise decrypting 213 the encrypted noise secret.
  • the method comprises obtaining a representation of at least part of the obfuscated data stream.
  • the representation may be a copy of at least part of the received data stream.
  • the representation may be processed data comprising information about the data stream, e.g., truncated packets, IP flows, summary export features.
  • payload of the obfuscated data stream is encrypted.
  • the method comprises in step 205 reconstructing, by using the noise secret, at least some of the obfuscated statistical characteristics.
  • At least some of the statistical characteristics that may have been obfuscated with the noise secret comprise, for example, packet timing, packet rate, packet sizes, payload size, data transfer rate of a data stream.
  • in step 207, the method comprises providing information indicative of the reconstructed statistical characteristics for performing anomaly detection.
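
As an illustrative sketch (not part of the patent text) of the kind of information that could be provided for anomaly detection, the following Python fragment derives a few of the statistical characteristics named below from a de-obfuscated stream; the (timestamp, size) representation and the feature names are assumptions.

```python
from statistics import mean

# Illustrative reconstructed stream: (timestamp_seconds, packet_size_bytes) tuples.
reconstructed = [(0.00, 120), (0.02, 118), (0.05, 121), (0.07, 119)]

def traffic_features(stream):
    """Derive simple statistical characteristics for an anomaly detector."""
    times = [t for t, _ in stream]
    sizes = [s for _, s in stream]
    inter_arrival = [b - a for a, b in zip(times, times[1:])]
    duration = (times[-1] - times[0]) or 1e-9  # avoid division by zero
    return {
        "mean_packet_size": mean(sizes),
        "mean_inter_arrival": mean(inter_arrival),
        "packet_rate": len(stream) / duration,
        "data_rate": sum(sizes) / duration,
    }

print(traffic_features(reconstructed))
```
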
  • packets, or packet stream, of the received data stream are obfuscated by generating dummy packets using the noise secret.
  • the generated dummy packets comprise information to identify the dummy packet using the noise secret. Therefore, the information in a packet allows a node that knows the noise secret to identify 219 a packet as dummy. Once the dummy packets have been identified, they may be discarded 221.
  • the information may be encoded in an Initialization Vector (IV) field of a packet and, by reading the value in the IV field and knowing the noise secret used to generate the value, a first device 104 may identify the dummy packet and discard it, as shown in Figures 6a and 6b.
  • IV Initialization Vector
  • the noise secret may indicate the dummy packets based on the order of transmission of the packets.
  • the received data stream may suffer from packet drops and retransmission, therefore the method may comprise reordering 215 packets of the at least part of the received data stream.
  • the method further comprises identifying 217 the dummy packets by using the noise secret and order of the packets; and discarding 221 the identified dummy packets.
  • the noise secret may be obtained from a central node that may generate the noise secret and distribute the noise secret for example during an on-boarding process.
  • a second device for obfuscating a data stream may generate the noise secret, and share the noise secret for example in a protocol set-up phase, e.g., encrypted with the public key of the first node.
  • the noise secret may be hard coded in the first device 104.
  • the first device may generate 209 the noise secret. The first device may then transmit 211 the noise secret to the second device 102.
  • the noise secret may be obtained through (manual) configuration
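
A minimal sketch, assuming the third-party Python cryptography package, of the variant in which the noise secret is shared encrypted under the peer's public key during set-up; the RSA key size, OAEP parameters, and 32-byte secret length are illustrative choices rather than requirements of the disclosure.

```python
import os
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa, padding

# The device that will de-obfuscate owns an RSA key pair (generation shown inline).
private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key = private_key.public_key()

# A hypothetical 32-byte noise secret (e.g., a PRNG seed or a MAC key).
noise_secret = os.urandom(32)

oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)

# The generating side encrypts the noise secret with the peer's public key.
encrypted_secret = public_key.encrypt(noise_secret, oaep)

# The receiving side decrypts it, matching the decryption step described above.
assert private_key.decrypt(encrypted_secret, oaep) == noise_secret
```
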
  • Figure 3 shows a method for obfuscating a data stream.
  • the method may be carried out by a second device, 102.
  • the second device may be the source node transmitting the data stream, i.e., the transmitting node 101.
  • the method comprises obtaining 301 a noise secret, and obfuscating 303 at least some statistical characteristics of the data stream using the noise secret.
  • payload of the data stream is encrypted.
  • the operation of obfuscating comprises altering a size of a packet of the data stream using the noise secret.
  • padding and fragmentation may be used for masking packet length statistics, wherein the size of a packet is generated using the noise secret.
  • a first device obtaining a packet with an altered size would reconstruct the size before obfuscation using the noise secret.
  • the noise secret may comprise a key and a reversible function that uses the key to generate the obfuscation.
  • a legitimate receiver of the packet would not need the noise secret to reconstruct the size, but it would for example decrypt the payload and use packet size from an upper layer protocol.
  • the operation of obfuscating comprises generating dummy packets using the noise secret.
  • information identifying a dummy packet may be encoded in the dummy packet using the noise secret.
  • a first device obtaining a packet with an encoded information would decode the information using the noise secret and be able to identify a dummy packet.
  • a legitimate receiver of the packet would not need the noise secret to identify the dummy packet, but it would for example decrypt the payload and use information in the payload to distinguish between a dummy packet and a legitimate packet.
  • the method comprises obtaining the noise secret from a central node that may generate the noise secret and distribute the noise secret for example in an onboarding process.
  • the noise secret may be hard coded in the second device; alternatively, the noise secret may be obtained through (manual) configuration.
  • a first device for supporting anomaly detection may generate the noise secret, and share the noise secret for example in a protocol set-up phase.
  • the second device may generate 305 the noise secret. The second device may then transmit the noise secret to the first device.
  • the obtained noise secret may be encrypted, and the method may comprise decrypting 207 the encrypted noise secret.
  • Figure 4a and Figure 4b show a first example of obfuscation technique and corresponding deobfuscation technique according to embodiments.
  • the first example of obfuscation technique alters an original packet length or size based on a noise secret.
  • the obfuscation technique comprises the step of receiving 401 a datagram from an upper layer of the protocol stack.
  • a packet comprising the datagram has an original size or length s.
  • the obfuscation technique comprises the step of obtaining 403 a random number x1 generated for example by a PRNG using the noise secret.
  • a further size s' is determined 405, wherein s' is a function of the original size or length s and x1.
  • the size or length of the packet is modified 407 with padding or fragmentation, so as to be equal to s'.
  • the obtained obfuscated packet may then be transmitted 409.
  • a first device receives 411 the obfuscated packet with size s'. Supposing the first device is an authorized node in possession of the noise secret obtained according to one of the embodiments, the first device obtains 413 the random number x1 generated by the PRNG using the noise secret. Then, the first device calculates 415 the original packet size s using a reverse function of the function used to calculate s' and reconstructs the original size for example by removing bits. The reconstructed packet may now be transmitted 417 for example to an analysis engine to perform intrusion detection, anomaly detection, or behavior analytics.
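
The following Python sketch illustrates the packet-size scheme of Figures 4a and 4b under the simplifying assumptions that s' = s + x1 and that the excess is zero-byte padding; the seed value and padding range are hypothetical.

```python
import random

NOISE_SECRET_SEED = 1234  # illustrative noise secret

def pad_amounts(seed, count, max_pad=64):
    """The same x1 values on the obfuscating and de-obfuscating sides."""
    prng = random.Random(seed)
    return [prng.randrange(1, max_pad) for _ in range(count)]

def obfuscate_sizes(packets, seed):
    """Second-device side: pad packet i by x1_i bytes (s' = s + x1)."""
    pads = pad_amounts(seed, len(packets))
    return [pkt + b"\x00" * x1 for pkt, x1 in zip(packets, pads)]

def reconstruct_sizes(obfuscated, seed):
    """First-device side: recover the original size s = s' - x1."""
    pads = pad_amounts(seed, len(obfuscated))
    return [pkt[: len(pkt) - x1] for pkt, x1 in zip(obfuscated, pads)]

packets = [b"A" * 40, b"B" * 1200, b"C" * 300]
masked = obfuscate_sizes(packets, NOISE_SECRET_SEED)
assert reconstruct_sizes(masked, NOISE_SECRET_SEED) == packets
```
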
  • Figure 5a and Figure 5b show a second simplified example of obfuscation technique and corresponding de-obfuscation technique according to embodiments.
  • the second example of obfuscation technique alters traffic pattern statistics by generating dummy packets according to a sequence based on the noise secret to fill gaps between subsequently transmitted non-dummy packets carrying application data.
  • the obfuscation technique comprises the step of receiving 501 a datagram from an upper layer of the protocol stack. Then, the obfuscation technique comprises the step of obtaining 503 a random number x2 generated for example by a PRNG using the noise secret. x2 is the number of dummy packets that will be transmitted after the packet comprising the datagram. Finally, the obfuscation technique comprises the steps of transmitting 505 the packet comprising the datagram and then transmitting 507 x2 dummy packets.
  • a first device receives 511 a packet. Supposing the first device is an authorized node in possession of the noise secret, obtained according to one of the embodiments, the first device obtains 513 the random number x2 generated by the PRNG using the noise secret. Then the first device uses the knowledge of x2 and a correct order of packets to identify 515 if the received packet is a dummy packet or not and drop a packet if it is identified as dummy. If the packets of the data stream are not received in the correct order, the first device orders the packets.
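
A simplified Python sketch of the Figure 5a/5b scheme: both sides derive the same sequence of x2 values from the noise secret, the obfuscator inserts x2 dummy packets after each real packet, and the de-obfuscator drops them purely by position in the (correctly ordered) stream; the packet contents and the range of x2 are illustrative assumptions.

```python
import random

SEED = 99  # illustrative noise secret

def dummy_counts(seed):
    """Shared sequence of x2 values derived from the noise secret."""
    prng = random.Random(seed)
    while True:
        yield prng.randrange(0, 4)

def obfuscate(stream, seed):
    """Second device: after each real packet, transmit x2 dummy packets."""
    counts, out = dummy_counts(seed), []
    for pkt in stream:
        out.append(pkt)
        out.extend(b"\x00" * len(pkt) for _ in range(next(counts)))
    return out

def deobfuscate(received, seed):
    """First device: skip x2 packets after every real packet, by position only."""
    counts, real, skip = dummy_counts(seed), [], 0
    for pkt in received:
        if skip:
            skip -= 1  # this position carries a dummy packet; drop it
            continue
        real.append(pkt)
        skip = next(counts)
    return real

stream = [b"msg-%d" % i for i in range(5)]
assert deobfuscate(obfuscate(stream, SEED), SEED) == stream
```
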
  • Figure 6a and Figure 6b show a third example of obfuscation technique and corresponding deobfuscation technique according to embodiments.
  • the third example of obfuscation technique alters traffic pattern statistics by generating dummy packets to fill gaps between subsequently transmitted legitimate packets and encoding in the dummy packets an indication based on a noise secret to identify the dummy packet.
  • the obfuscation technique comprises the step of encoding 603 an indication to identify the dummy packet in a field of a payload of an Encapsulating Security Payload (ESP) packet, i.e., Initialization Vector (IV) in the example of Figure 6a.
  • ESP Encapsulating Security Payload
  • IV Initialization Vector
  • the IV field is usually not encrypted.
  • the obfuscation technique comprises determining a value of the IV for the dummy packets, generated by using the noise secret. Instead, the IV field of a legitimate packet is determined 601 according to a cryptographic scheme used. Then, both legitimate packets and dummy packets are transmitted 605. Then the technique may comprise the step of going to a sleep state 607 until a new datagram is received.
  • a first device receives 611 an ESP packet. Supposing the first device is an authorized node in possession of the noise secret, obtained according to one of the embodiments, the first device identifies dummy packets based on the noise secret and the IV field in the ESP packet, and discards the dummy packet 615. Packets that have not been discarded may be transmitted 613 for example to an analysis engine to perform intrusion detection. Instead, a legitimate receiver of the traffic will not use the noise secret to discern between legitimate packets and dummy packets. For example, the legitimate receiver identifies a dummy packet after decrypting the received packet. If the packet has the number “59” in Next Header field, the legitimate receiver will discard the packet, since “59” is an indication of “no next header”, i.e., the payload should be discarded.
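
An illustrative Python sketch of the Figure 6a/6b idea, assuming (beyond what the disclosure specifies) that the IV of a dummy packet is a keyed MAC over the ESP sequence number, truncated to the IV length; only a holder of the noise secret can recompute the expected value and thereby identify and discard the dummy packets.

```python
import hashlib
import hmac
import os

NOISE_SECRET = b"an illustrative 32-byte secret.."  # hypothetical shared secret

def dummy_iv(noise_secret, seq):
    """IV value marking a dummy packet, derived from the noise secret."""
    mac = hmac.new(noise_secret, seq.to_bytes(4, "big"), hashlib.sha256)
    return mac.digest()[:8]  # truncate to an 8-byte IV, as an example

def make_packet(seq, payload, dummy=False):
    iv = dummy_iv(NOISE_SECRET, seq) if dummy else os.urandom(8)
    return {"seq": seq, "iv": iv, "payload": payload}

def is_dummy(packet, noise_secret):
    """First-device check: recompute the expected dummy IV and compare."""
    return hmac.compare_digest(packet["iv"], dummy_iv(noise_secret, packet["seq"]))

stream = [make_packet(1, b"data"), make_packet(2, b"", dummy=True), make_packet(3, b"more")]
kept = [p for p in stream if not is_dummy(p, NOISE_SECRET)]
assert [p["seq"] for p in kept] == [1, 3]
```
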
  • a first example scenario in which the invention may be practiced is a distributed system in a not fully trusted cloud environment, e.g., addressing network side-channel information leak while maintaining observability.
  • Figure 7a shows a distributed system comprising two applications 703, 713, Application1 and Application2, and corresponding masquerading modules 705, 715 running on two virtual machines (VMs) 701, 711, VM1 and VM2, of a cloud infrastructure.
  • VMs virtual machines
  • a first masquerading module 705 obfuscates the data stream generated by Application1 703, by using a noise secret NS1.
  • a second masquerading module 715 obfuscates the data stream generated by Application2 713, by using a noise secret NS2 717.
  • the noise secrets NS1 and NS2 are transmitted to a trusted analysis node 721 in a secure way.
  • Observability methods provided by a cloud provider enable data stream capture 731.
  • the captured data stream is de-obfuscated 723 by the trusted analysis node 721 according to embodiments and is directly analyzed, or stored for later analysis.
  • a second example scenario in which the invention may be practiced is a communication system wherein the trusted analysis node is a Cloud Access Security Broker (CASB) 751, as shown in Figure 7b.
  • the CASB observes an encrypted communication between an end-user or IoT device 741, and a cloud application 763, Application1, running on a VM 761.
  • Statistical characteristics of a data stream transmitted by Application1 763 have been obfuscated according to embodiments by a first device 765 running in the VM 761, called masquerading module in Figure 7b, using noise secret NS2 767.
  • Statistical characteristics of a data stream transmitted by the end user application or the IoT device 741 have been obfuscated according to embodiments by a first device 745, called masquerading module in Figure 7b, using noise secret NS1 743.
  • the CASB is monitoring the communication to identify patterns in the traffic indicative of abnormal behavior, by de-obfuscating 753 the traffic using the noise secrets NS1 and NS2.
  • the CASB however may not inspect the actual plaintext exchanged in the communication since it is end-to-end encrypted.
  • Figure 8 is a block diagram illustrating one embodiment of a first device 104, comprising a processor 801, a computer program product 805 in the form of a computer readable storage medium 806 in the form of a memory 802 and communication circuitry 803.
  • the memory, 802 contains instructions executable by the processor, 801, such that the first device 104, in one embodiment is operative to obtain a noise secret used for obfuscating at least some statistical characteristics of a received data stream.
  • the first device 104 is operative to obtain a representation of at least part of the received data stream.
  • the first device 104 is further operative to reconstruct by using the noise secret, at least some of the obfuscated statistical characteristics, and provide information indicative of the reconstructed statistical characteristics for performing anomaly detection.
  • the memory, 802 contains instructions executable by the processor, 801, such that the first device 104 is operative to identify the dummy packets by decoding the information using the noise secret; and discard the dummy packets
  • the first device 104 may be operative to re-order packets of the at least part of the received data stream; identify the dummy packets by using the noise secret and order of the packets; and discard the dummy packets.
  • the first device 104 may be operative to generate the noise secret; and transmit the noise secret to a second device 102 for obfuscating a data stream.
  • the first device 104 is further operative to decrypt the encrypted noise secret, if the obtained noise secret is encrypted.
  • the first device 104 may include a processing circuitry (one or more than one processor), 801, coupled to communication circuitry, 803, and to the memory 802.
  • the first device 104 may comprise more than one communication circuitry.
  • the communication circuitry, 803, the processor(s) 801, and the memory 802 may be connected in series as illustrated in Figure 8.
  • these components 803, 801 and 802 may be coupled to an internal bus system of the first device, 104.
  • the memory 802 may include a Read-Only-Memory, ROM, e.g., a flash ROM, a Random Access Memory, RAM, e.g., a Dynamic RAM, DRAM, or Static RAM, SRAM, a mass storage, e.g., a hard disk or solid state disk, or the like.
  • ROM Read-Only-Memory
  • RAM Random Access Memory
  • SRAM Static RAM
  • the computer program product 805 comprises a computer program 804, which comprises computer program code loadable into the processor 801, wherein the computer program 804 comprises code adapted to cause the first device 104 to perform the steps of the method described herein, when the computer program code is executed by the processor 801.
  • the computer program 804 may be software hosted by the first device 104.
  • the first device 104 may actually include further components which, for the sake of clarity, have not been illustrated, e.g., further interfaces or processors.
  • memory, 802 may include further program code for implementing other and/or known functionalities.
  • the first device 104 may be provided as a virtual apparatus. In one embodiment, the first device 104 may be provided in distributed resources, such as in cloud resources. When provided as virtual apparatus, it will be appreciated that the memory, 802, processing circuitry, 801, and communication circuitry, 803, may be provided as functional elements. The functional elements may be distributed in a logical network and not necessarily be directly physically connected. It is also to be understood that the first device 104 may be provided as a single-node device, or as a multi-node system.
  • Figure 9 schematically illustrates, in terms of a number of functional units, the components of the first device 104 according to an embodiment.
  • the first device 104 comprises a first obtaining unit 901 configured to obtain a noise secret used for obfuscating at least some statistical characteristics of a received data stream.
  • a representation of at least part of the received data stream is obtained by a second obtaining unit 903.
  • a reconstruct unit 905 is configured to reconstruct, by using the noise secret obtained by the first obtaining unit 901, at least some of the obfuscated statistical characteristics of the received data stream.
  • Information indicative of the reconstructed statistical characteristics is provided for performing anomaly detection by a providing unit 907.
  • an identifying unit 909 is configured to identify the dummy packets by decoding the information using the noise secret. The dummy packets are then discarded by a discarding unit 911.
  • an ordering unit 913 is configured to re-order packets of the at least part of the received data stream.
  • the identifying unit 909 may be further configured to identify the dummy packets by using the noise secret and order of the packets and the discarding unit 911 may discard the identified dummy packets.
  • the noise secret may be generated by a generating unit 915.
  • the generated noise secret may be transmitted by a transmitting unit 917 to a second device 102 for obfuscating a data stream.
  • a decrypting unit 919 is configured to decrypt the encrypted noise secret.
  • each functional unit 901-919 may be implemented in hardware or in software.
  • one or more or all functional modules 901-919 may be implemented by the processor 801, possibly in cooperation with the communications circuitry 803 and the computer readable storage medium 806 in the form of a memory 802.
  • the processor 801 may thus be arranged to fetch instructions, as provided by a functional unit 901-919, from the computer readable storage medium 806 in the form of a memory 802 and to execute these instructions, thereby performing any steps of the first device 104 as disclosed herein.
  • Figure 10 is a block diagram illustrating one embodiment of a second device 102, comprising a processor 1001, a computer program product 1005 in the form of a computer readable storage medium 1006 in the form of a memory 1002 and communication circuitry 1003.
  • the memory, 1002 contains instructions executable by the processor, 1001, such that the second device 102, in one embodiment is operative to obtain a noise secret.
  • the second device 102 is further configured to obfuscate at least some statistical characteristics of the data stream using the noise secret. Further, the second device 102 may be operative to generate the noise secret.
  • the second device 102 may be configured to transmit the noise secret to a first device 104.
  • the second device 102 is further operative to decrypt the encrypted noise secret, if the obtained noise secret is encrypted.
  • the second device 102 may include a processing circuitry (one or more than one processor), 1001, coupled to communication circuitry, 1003, and to the memory 1002.
  • the second device 102 may comprise more than one communication circuitry. For simplicity and brevity only one communication circuitry, 1003, has been illustrated in Figure 10.
  • the communication circuitry, 1003, the processor(s) 1001, and the memory 1002 may be connected in series as illustrated in Figure 10.
  • these components 1003, 1001 and 1002 may be coupled to an internal bus system of the second device, 102.
  • the memory 1002 may include a Read-Only-Memory, ROM, e.g., a flash ROM, a Random Access Memory, RAM, e.g., a Dynamic RAM, DRAM, or Static RAM, SRAM, a mass storage, e.g., a hard disk or solid state disk, or the like.
  • ROM Read-Only-Memory
  • RAM Random Access Memory
  • SRAM Static RAM
  • the computer program product 1005 comprises a computer program 1004, which comprises computer program code loadable into the processor 1001, wherein the computer program 1004 comprises code adapted to cause the second device 102 to perform the steps of the method described herein, when the computer program code is executed by the processor 1001.
  • the computer program 1004 may be software hosted by the second device 102.
  • the structures as illustrated in Figure 10 are merely schematic and that the second device 102 may actually include further components which, for the sake of clarity, have not been illustrated, e.g., further interfaces or processors. Also, it is to be understood that the memory, 1002, may include further program code for implementing other and/or known functionalities.
  • the second device 102 may be provided as a virtual apparatus.
  • the second device 102 may be provided in distributed resources, such as in cloud resources.
  • distributed resources such as in cloud resources.
  • processing circuitry, 1001, and communication circuitry, 1003, may be provided as functional elements.
  • the functional elements may be distributed in a logical network and not necessarily be directly physically connected.
  • the second device 102 may be provided as a single-node device, or as a multi-node system.
  • Figure 11 schematically illustrates, in terms of a number of functional units, the components of the second device 102 according to an embodiment.
  • the second device 102 comprises a first obtaining unit 1101 configured to obtain a noise secret.
  • the noise secret is used by an obfuscating unit 1103 configured to obfuscate at least some statistical characteristics of the data stream using the noise secret.
  • the noise secret may be generated by a generating unit 1105.
  • a decrypting unit 1107 is configured to decrypt the encrypted noise secret.
  • each functional unit 1101-1107 may be implemented in hardware or in software.
  • one or more or all functional modules 1101-1107 may be implemented by the processor 1001, possibly in cooperation with the communications circuitry 1003 and the computer readable storage medium 1006 in the form of a memory 1002.
  • the processor 1001 may thus be arranged to fetch instructions, as provided by a functional unit 1101-1107, from the computer readable storage medium 1006 in the form of a memory 1002 and to execute these instructions, thereby performing any steps of the second device 102 as disclosed herein.

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Security & Cryptography (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Computer Hardware Design (AREA)
  • Computing Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Data Exchanges In Wide-Area Networks (AREA)

Abstract

Methods and devices for reconstructing some statistical characteristics of a data stream that has been previously obfuscated using a noise secret. An intermediary trusted third party, with the knowledge of the noise secret that has been used to obfuscate some of the statistical characteristics, is able to reconstruct some of the statistical characteristics as before being obfuscated. The reconstructed statistical characteristics may then be used for statistical traffic analysis and anomaly detection.

Description

METHODS AND DEVICES FOR SUPPORTING ANOMALY DETECTION
TECHNICAL FIELD
The invention relates to a first device and a method for supporting anomaly detection, a second device and a method for obfuscating a data stream, and corresponding computer program and computer program product.
BACKGROUND
Sensitive network communication may be protected by encryption schemes. Encryption schemes make it infeasible for an eavesdropper to reconstruct sensitive information from encrypted data. Some encryption schemes protect application payload on top of a transport mechanism, such as Hypertext Transfer Protocol (HTTP) protected by Transport Layer Security (TLS) on Transmission Control Protocol (TCP). Other examples of communication protocols providing data privacy on different network layers are Datagram Transport Layer Security (DTLS), IPsec, and MACsec.
Despite using an encryption scheme, there might still be sensitive information an attacker can obtain from the network communication via statistical traffic analysis. Confidential information (such as used applications, application layer protocols, used physical devices, or accessed web pages) may be inferred, for example, through statistical analysis of packet sizes and packet inter-arrival times. Solutions have been developed to protect against statistical traffic analysis attacks. One example is Traffic Flow Confidentiality (TFC) mechanisms, which alter or mask all or some statistical characteristics of a data stream prior to transmission. For example, the characteristics that may be altered are packet size and packet inter-departure time distribution. Further information on TFC may be found in S. Kent, RFC4303 IP Encapsulating Security Payload (ESP), 2005.
However, solutions for masking or altering all or some statistical characteristics are in direct conflict with the requirements of some anomaly detection and trusted traffic analysis tools. Several solutions for anomaly detection and traffic analysis, such as Network intrusion detection systems (NIDS) based on statistical analysis, Network Traffic Analysis (NTA), and User and Entity Behavior Analytics (UEBA) tools, require collecting information on statistical characteristics to differentiate between normal and abnormal traffic.
SUMMARY
Accordingly, the solution disclosed in this document seeks to preferably mitigate, alleviate, or eliminate one or more of the disadvantages mentioned above singly or in any combination.
To achieve said object, according to a first aspect of the present invention there is provided a method performed by a first device for supporting anomaly detection in an obfuscated data stream. The method comprises obtaining a noise secret used for obfuscating at least some statistical characteristics of a received data stream. The method comprises obtaining a representation of at least part of the received data stream. The method comprises reconstructing, by using the noise secret, at least some of the obfuscated statistical characteristics. The method comprises providing information indicative of the reconstructed statistical characteristics for performing anomaly detection. This allows traffic obfuscation solutions as well as anomaly detection tools and/or traffic analysis tools to work efficiently. Moreover, end-to-end privacy of the communication is not broken, i.e., the data stream is not decrypted to reconstruct some of the statistical characteristics.
According to a second aspect of the present invention there is provided a method performed by a second device for obfuscating a data stream. The method comprises obtaining a noise secret; and obfuscating at least some statistical characteristics of the data stream using the noise secret.
According to a third aspect of the present invention there is provided a first device for supporting anomaly detection in an obfuscated data stream. The first device comprises a processor and a memory, the memory having stored thereon instructions executable by the processor. The instructions, when executed by the processor, cause the first device to: obtain a noise secret used for obfuscating at least some statistical characteristics of a received data stream. The first device is also operative to obtain a representation of at least part of the received data stream. The first device is operative to reconstruct, by using the noise secret, at least some of the obfuscated statistical characteristics. The first device is further operative to provide information indicative of the reconstructed statistical characteristics for performing anomaly detection. According to a fourth aspect of the present invention there is provided a second device for obfuscating a data stream. The second device comprises a processor and a memory, the memory having stored thereon instructions executable by the processor. The instructions, when executed by the processor, cause the second device to: obtain a noise secret; and obfuscate at least some statistical characteristics of the data stream using the noise secret.
According to a fifth aspect of the present invention there is provided a computer program comprising instructions which, when run in a processing unit on a first device, cause the first device to: obtain a noise secret used for obfuscating at least some statistical characteristics of a received data stream; obtain a representation of at least part of the received data stream; reconstruct by using the noise secret, at least some of the obfuscated statistical characteristics; provide information indicative of the reconstructed statistical characteristics for performing anomaly detection.
According to a sixth aspect of the present invention there is provided a computer program product comprising a computer readable storage medium on which a computer program, as mentioned above, is stored.
According to a seventh aspect of the present invention there is provided a computer program comprising instructions which, when run in a processing unit on a second device, cause the second device to obtain a noise secret; obfuscate at least some statistical characteristics of the data stream using the noise secret.
According to an eighth aspect of the present invention there is provided a computer program product comprising a computer readable storage medium on which a computer program, as mentioned above, is stored.
In an embodiment, the representation is a copy of at least part of the received data stream.
In an embodiment, the representation is processed data comprising information about the data stream.
In an embodiment, packets in the received data stream are obfuscated by generating dummy packets using the noise secret.
In an embodiment, the generated dummy packets comprise information to identify the dummy packets using the noise secret. In an embodiment of the first aspect, the method comprises identifying the dummy packets by decoding the information to identify the dummy packet using the noise secret. The method comprises discarding the dummy packet.
In an embodiment of the third aspect, the instructions, when executed by the processor, cause the first device to identify the dummy packets by decoding the information to identify the dummy packet using the noise secret; discard the dummy packets.
In an embodiment of the first aspect, the method comprises re-ordering packets of the at least part of the received data stream.
In an embodiment of the third aspect, the instructions, when executed by the processor, cause the first device to re-order packets of the at least part of the received data stream.
In an embodiment of the first aspect, the method comprises identifying the dummy packets by using the noise secret and order of the packets; and discarding the dummy packets.
In an embodiment of the third aspect, the instructions, when executed by the processor, cause the first device to identify the dummy packets by using the noise secret and order of the packets; and discard the dummy packets.
In an embodiment, payload in the obfuscated data stream is encrypted.
In an embodiment, the noise secret is obtained from a central node.
In an embodiment of the first aspect and third aspect, the noise secret is hard coded in the first device.
In an embodiment of the first aspect and third aspect, the noise secret is obtained from a second device for obfuscating a data stream.
In an embodiment of the first aspect, the method comprises generating the noise secret; transmitting the noise secret to a second device for obfuscating a data stream.
In an embodiment of the third aspect, the instructions, when executed by the processor, cause the first device to generate the noise secret; transmit the noise secret to a second device for obfuscating a data stream.
In an embodiment, the obtained noise secret is encrypted.
In an embodiment of the first aspect and second aspect, the method comprises decrypting the encrypted noise secret. In an embodiment of the third aspect, the instructions, when executed by the processor, cause the first device to decrypt the encrypted noise secret.
In an embodiment of the fourth aspect, the instructions, when executed by the processor, cause the second device to decrypt the encrypted noise secret.
In an embodiment, the operation of obfuscating comprises altering a size of a packet of the data stream using the noise secret.
In an embodiment, the operation of obfuscating comprises generating dummy packets using the noise secret.
In an embodiment, the operation of obfuscating comprises encoding information in a dummy packet for identifying the dummy packet using the noise secret.
In an embodiment, the second device is a source node transmitting the data stream.
In an embodiment of the second aspect and fourth aspect, the noise secret is hard coded in the second device.
In an embodiment of the second aspect and fourth aspect, the noise secret is obtained from a first device for supporting anomaly detection.
In an embodiment of the second aspect, the method comprises generating the noise secret.
In an embodiment, the instructions, when executed by the processor, cause the second device to generate the noise secret.
BRIEF DESCRIPTION OF THE DRAWINGS
For better understanding of the present disclosure, and to show more readily how the invention may be carried into effect, reference will now be made, by way of example, to the following drawings, in which:
Figure 1 shows an example system according to an embodiment;
Figure 2 shows a flowchart illustrating a method performed by a first device according to embodiments;
Figure 3 shows a flowchart illustrating a method performed by a second device according to embodiments;
Figure 4a shows a first example of obfuscation technique according to embodiments;
Figure 4b shows a first example of de-obfuscation technique according to embodiments if the first example of obfuscation technique has been used;
Figure 5a shows a second example of obfuscation technique according to embodiments;
Figure 5b shows a second example of de-obfuscation technique according to embodiments if the second example of obfuscation technique has been used;
Figure 6a shows a third example of obfuscation technique according to embodiments;
Figure 6b shows a third example of de-obfuscation technique according to embodiments if the third example of obfuscation technique has been used;
Figure 7a shows a first example scenario in which an invention according to embodiments may be practiced;
Figure 7b shows a second example scenario in which an invention according to embodiments may be practiced;
Figure 8 is a block diagram depicting a first device according to an embodiment;
Figure 9 is a block diagram depicting units of a first device according to an embodiment;
Figure 10 is a block diagram depicting a second device according to an embodiment; and
Figure 11 is a block diagram depicting units of a second device according to an embodiment.
DETAILED DESCRIPTION
Embodiments will be illustrated herein with reference to the accompanying drawings. These embodiments are provided by way of example so that this disclosure will be thorough and complete, and will fully convey the scope of the inventive concept to those skilled in the art.
Even though encryption schemes may protect sensitive network communication, statistical characteristics of a data stream may still be obtained without decrypting the data stream. For instance, voice over IP (VoIP) traffic is usually transferred by user datagram protocol (UDP) packets of a recognizable size with specific requirements for network jitter and delay. Therefore, non-trusted third parties may obtain information of interest, creating security concerns for the communicating parties. Obfuscation techniques, protocols, and/or extensions are adopted for protecting network communication against statistical traffic analysis, especially in case of use of open and public networks, by masking or altering all or some statistical characteristics of a data stream.
However, the use of obfuscation solutions may degrade the performance of some anomaly detection tools and traffic analysis tools, since these tools require some statistical characteristics of the data stream to differentiate between normal traffic and abnormal traffic. For example, a Network Intrusion Detection System (NIDS) requires statistical characteristics such as packet sizes and packet inter-departure times.
The solution to be disclosed, in its embodiments, allows reconstructing statistical characteristics of a data stream that had been previously obfuscated using a noise secret. An intermediary trusted third party, with the knowledge of the noise secret that has been used to obfuscate some of the statistical characteristics, is able to reconstruct at least some of the obfuscated statistical characteristics. The reconstructed statistical characteristics may then be used for statistical traffic analysis and anomaly detection.
This solution does not break the end-to-end privacy of the communication, i.e., the data stream is not decrypted to reconstruct some of the statistical characteristics. The intermediary trusted third party, such as a legitimate monitoring and analysis node, is able to reconstruct the statistical characteristics by using the knowledge of the noise secret. Other non-trusted entities monitoring the traffic, such as eavesdropping adversaries, may capture the data stream, but they are not able to reconstruct the statistical characteristics because these are obfuscated. In contrast, from the receiver's point of view, a legitimate receiver node of the data stream is able to correctly process the data stream without knowledge of the noise secret. For example, if the obfuscation altered the size of a packet, since the legitimate receiver is able to decrypt the packet, the legitimate receiver may trim excess data from the packet using packet size information obtained from an upper layer protocol, after decryption.
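Purely as an illustration (not part of the disclosed embodiments), the sketch below shows how a legitimate receiver that has already decrypted a packet might trim obfuscation padding using an upper-layer length field; the 2-byte big-endian length prefix standing in for the upper layer protocol, and the use of Python, are assumptions made only for this example.

```python
import struct

def trim_padding(decrypted_payload: bytes) -> bytes:
    # Read the application-layer length (first two bytes, big-endian) and keep
    # only that many bytes of real data; the remainder is obfuscation padding.
    (true_len,) = struct.unpack_from("!H", decrypted_payload, 0)
    return decrypted_payload[2:2 + true_len]

# Hypothetical example: 5 bytes of application data followed by 27 padding bytes.
padded = struct.pack("!H", 5) + b"hello" + b"\x00" * 27
assert trim_padding(padded) == b"hello"
```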
The solution to be disclosed, in its embodiments, allows both traffic obfuscation solutions and anomaly detection and/or traffic analysis tools to work efficiently. Moreover, the solution may allow a service provider to control who can perform traffic analysis, protecting the business value of usage data.
In the present document, the terms “obfuscation”, “masquerading”, or “perturbation” refer to an alteration of all or some statistical characteristics of a data stream to hide all or some of the statistical characteristics. Figure 1 shows an example scenario in which the invention may be practiced. Figure 1 shows a system comprising a first device 104, a legitimate analysis node 105, a transmitting node 101, a second device 102, a legitimate receiving node 103, and an adversary node 107.
The transmitting node 101, or source node, is a node that wants to transmit a data stream to the receiving node 103 in a secure way via a data network, e.g., Internet Protocol (IP). The transmitting node may be a Network Function, Virtual Network Function, microservice, sensor, a Machine Type Communication (MTC) device, Machine-to-Machine (M2M) device, Internet of Things (IoT) device, user device, vehicle, router, gateway, or any device with computing, storage, and network connectivity. According to an embodiment, the transmitting node 101 transmits its data stream to the second device 102 to obfuscate at least some statistical characteristics of the data stream using a noise secret; the data stream obtained after obfuscation is called the obfuscated data stream. According to embodiments, modifying (or obfuscating) one or several statistical characteristics of a data stream is a reversible process based on a noise secret. The noise secret may comprise multiple values, parameters, an obfuscation method, and any other information necessary to produce and then remove the obfuscation. Examples of noise secrets are any values that serve as a key for schemes based on a keyed hash message authentication code, a symmetric key, or an asymmetric key; and parameters of a pseudorandom number generator (PRNG), such as a seed, a number of elements to be discarded starting from the beginning of a random number sequence generated from a certain seed, or an offset or modulo to be applied to each element of the random number sequence. The obfuscation process may be reversed by using the noise secret.
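Purely as an illustration, one possible shape of such a noise secret built from PRNG parameters is sketched below; the field names, the helper function, and the use of Python's random.Random as the PRNG are assumptions for the example and are not mandated by the embodiments.

```python
import random
from dataclasses import dataclass

@dataclass
class NoiseSecret:
    seed: int      # PRNG seed
    discard: int   # number of leading PRNG outputs to skip
    offset: int    # offset added to each PRNG output
    modulo: int    # modulo applied to each PRNG output

def obfuscation_values(secret, count):
    """Derive a reproducible sequence of obfuscation values from the noise secret."""
    prng = random.Random(secret.seed)
    for _ in range(secret.discard):   # discard the first elements of the sequence
        prng.getrandbits(32)
    return [(prng.getrandbits(32) + secret.offset) % secret.modulo for _ in range(count)]

# Two parties holding the same noise secret derive identical obfuscation values.
ns = NoiseSecret(seed=0xC0FFEE, discard=5, offset=7, modulo=128)
assert obfuscation_values(ns, 4) == obfuscation_values(ns, 4)
```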
The second device 102 may be a router, gateway, middlebox, and any device with computing, storage, and network connectivity. In an embodiment, the transmitting node 101 and the second device 102 may be implemented on a same node 111. In an alternative embodiment, the transmitting node 101 and the second device 102 may be implemented on separate nodes.
According to an embodiment, the first device 104 is a node authorized to obtain a representation of at least part of the obfuscated data stream. The first device 104 is in possession of the noise secret and is able to reconstruct, using the noise secret, at least some of the obfuscated statistical characteristics. In one embodiment, the statistical characteristics are reconstructed as before being obfuscated. The first device 104 may then use information indicative of the reconstructed statistical characteristics for performing anomaly detection. The first device 104 may alternatively transmit the information indicative of the reconstructed statistical characteristics to a further node, such as the legitimate analysis node 105 for performing anomaly detection. The first device 104 may be a router, gateway, and any device with computing, storage, and network connectivity. The first device 104 may directly collect a representation of the data stream or obtain a representation from a traffic collection point such as a router, switch, probe, bump-in-the-wire device, or host software agent able to collect data streams via port mirroring, network tap or any information exporting capability.
The legitimate analysis node 105, intermediary trusted third party, or trusted node is a node authorized to perform anomaly detection on the data stream. The legitimate analysis node 105 may be a router, gateway, and any device with computing, storage, and network connectivity running an anomaly detection tool and/or traffic analysis tool. Examples of anomaly detection tools and traffic analysis tools are NIDS based on statistical analysis, Network Traffic Analysis (NTA), User and Entity Behavior Analytics (UEBA) tools. In an embodiment, the legitimate analysis node 105 and the first device 104 may be implemented on a same node 121. In an alternative embodiment, the legitimate analysis node 105 and the first device 104 may be implemented on separate nodes.
The adversary node 107 is a node that attempts to access the data stream without authorization. The adversary node 107 may be a router, gateway, a software code operating on an otherwise legitimate router/switch, and any device with computing, storage, and network connectivity. According to an embodiment, the adversary node 107 obtaining a representation of at least part of the transmitted data stream with at least some obfuscated statistical characteristics, may not be able to reconstruct at least some of the obfuscated statistical characteristics as before being obfuscated.
Finally, the receiving node 103 is a legitimate receiver of the data traffic sent by the transmitting node 101. The receiving node 103 is able to correctly process the obfuscated data stream without knowledge of the noise secret. In other words, the receiving node 103 is not in possession of the noise secret. For example, if the obfuscation altered the size of a packet, the receiving node 103 may trim excess data from the packet using packet size information obtained after decryption from an upper layer protocol, if available. If the obfuscation altered traffic pattern statistics by generating dummy packets, the receiving node 103 may identify the dummy packets when decrypting a received packet.
The receiving node 103 may be a Network Function, Virtual Network Function, microservice, sensor, a Machine Type Communication (MTC) device, Machine-to-Machine (M2M) device, Internet of Things (IoT) device, user device, user equipment, compute node, cloud service, vehicle, router, gateway, or any device with computing, storage, and network connectivity.
First device 104, legitimate analysis node 105, transmitting node 101, second device 102, and receiving node 103 may communicate with each other through a subscription protocol, such as the message queuing telemetry transport (MQTT) protocol, or utilizing any one of a number of transfer protocols (e.g., Ethernet, frame relay, IP, transmission control protocol (TCP), UDP, hypertext transfer protocol (HTTP), HTTP/2) and Remote Procedure Call (RPC) protocols, such as Google Remote Procedure Call (gRPC), ensuring security requirements by using transport layer security (TLS).
Figure 2 shows a method for supporting anomaly detection in an obfuscated data stream. The method is carried out by a first device 104.
Referring to the method of Figure 2, in step 201, the method comprises obtaining a noise secret used for obfuscating at least some statistical characteristics of a data stream. According to an embodiment, the obtained noise secret may be encrypted and the method may comprise decrypting 213 the encrypted noise secret.
In step 203, the method comprises obtaining a representation of at least part of the obfuscated data stream. According to an embodiment, the representation may be a copy of at least part of the received data stream. According to an alternative embodiment, the representation may be processed data comprising information about the data stream, e.g., truncated packets, IP flows, summary export features. According to an embodiment, payload of the obfuscated data stream is encrypted.
Then the method comprises in step 205 reconstructing, by using the noise secret, at least some of the obfuscated statistical characteristics. At least some of the statistical characteristics that may have been obfuscated with the noise secret comprise, for example, packet timing, packet rate, packet sizes, payload size, data transfer rate of a data stream.
Finally, in step 207, the method comprises providing information indicative of the reconstructed statistical characteristics for performing anomaly detection.
According to embodiments, packets, or the packet stream, of the received data stream are obfuscated by generating dummy packets using the noise secret. According to an embodiment, the generated dummy packets comprise information to identify the dummy packet using the noise secret. Therefore, the information in a packet allows a node knowing the noise secret to identify 219 a packet as dummy. Once the dummy packets have been identified, they may be discarded 221. For example, the information may be encoded in an Initialization Vector (IV) field of a packet and, by reading the value in the IV field and knowing the noise secret used to generate the value, a first device 104 may identify the dummy packet and discard it, as shown in Figures 6a and 6b.
According to an embodiment, the noise secret may indicate the dummy packets based on the order of transmission of the packets. In particular in large networks, the received data stream may suffer from packet drops and retransmissions; therefore, the method may comprise re-ordering 215 packets of the at least part of the received data stream. The method further comprises identifying 217 the dummy packets by using the noise secret and the order of the packets; and discarding 221 the identified dummy packets.
According to an embodiment, the noise secret may be obtained from a central node that may generate the noise secret and distribute the noise secret, for example, during an on-boarding process. In an alternative embodiment, a second device for obfuscating a data stream may generate the noise secret and share the noise secret, for example, in a protocol set-up phase, e.g., encrypted with the public key of the first device. Alternatively, the noise secret may be hard coded in the first device 104. Alternatively, the first device may generate 209 the noise secret. The first device may then transmit 211 the noise secret to the second device 102. Alternatively, the noise secret may be obtained through (manual) configuration.
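For the variant in which the noise secret is shared encrypted with the public key of the first device, a minimal sketch is given below; the choice of RSA-OAEP, the Python cryptography package, and the 32-byte secret are assumptions made only for illustration.

```python
import os
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import padding, rsa

# The device generating the noise secret (here: 32 random bytes, e.g. a PRNG seed).
noise_secret = os.urandom(32)

# Key pair of the first device; in practice the public key is provisioned out of band.
first_device_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)

# Sender encrypts the noise secret under the first device's public key ...
encrypted_secret = first_device_key.public_key().encrypt(noise_secret, oaep)

# ... and the first device decrypts it (cf. step 213 in Figure 2).
assert first_device_key.decrypt(encrypted_secret, oaep) == noise_secret
```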
Figure 3 shows a method for obfuscating a data stream. The method may be carried out by a second device, 102. According to an embodiment, the second device may be the source node transmitting the data stream, i.e., the transmitting node 101.
Referring to the method of Figure 3, the method comprises obtaining 301 a noise secret, and obfuscating 303 at least some statistical characteristics of the data stream using the noise secret. According to an embodiment, payload of the data stream is encrypted.
According to an embodiment, the operation of obfuscating comprises altering a size of a packet of the data stream using the noise secret. For example, padding and fragmentation may be used for masking packet length statistics, wherein the size of a packet is generated using the noise secret. A first device obtaining a packet with an altered size would reconstruct the size as before obfuscation using the noise secret. According to an embodiment, the noise secret may comprise a key and a reversible function that uses the key to generate the obfuscation. A legitimate receiver of the packet would not need the noise secret to reconstruct the size, but it would for example decrypt the payload and use the packet size from an upper layer protocol.
According to an embodiment, the operation of obfuscating comprises generating dummy packets using the noise secret. According to an embodiment, information identifying a dummy packet may be encoded in the dummy packet using the noise secret. A first device obtaining a packet with encoded information would decode the information using the noise secret and be able to identify a dummy packet. A legitimate receiver of the packet would not need the noise secret to identify the dummy packet, but it would for example decrypt the payload and use information in the payload to distinguish between a dummy packet and a legitimate packet.
According to an embodiment, the method comprises obtaining the noise secret from a central node that may generate the noise secret and distribute the noise secret, for example, in an on-boarding process. Alternatively, the noise secret may be hard coded in the second device. Alternatively, the noise secret may be obtained through (manual) configuration. According to an alternative embodiment, a first device for supporting anomaly detection may generate the noise secret and share the noise secret, for example, in a protocol set-up phase. Alternatively, the second device may generate 305 the noise secret. The second device may then transmit the noise secret to the first device. According to an embodiment, the obtained noise secret may be encrypted, and the method may comprise decrypting 307 the encrypted noise secret.
Figure 4a and Figure 4b show a first example of obfuscation technique and corresponding deobfuscation technique according to embodiments. The first example of obfuscation technique alters an original packet length or size based on a noise secret.
As shown in Figure 4a, the obfuscation technique comprises the step of receiving 401 a datagram from an upper layer of the protocol stack. A packet comprising the datagram has an original size or length s. Then, the obfuscation technique comprises the step of obtaining 403 a random number x1 generated, for example, by a PRNG using the noise secret. Then, a further size s' is determined 405, wherein s' is a function of the original size or length s and x1. Finally, the size or length of the packet is modified 407 with padding or fragmentation, so as to be equal to s'. The obtained obfuscated packet may then be transmitted 409.
As shown in Figure 4b, a first device receives 411 the obfuscated packet with size s'. Supposing the first device is an authorized node in possession of the noise secret obtained according to one of the embodiments, the first device obtains 413 the random number x1 generated by the PRNG using the noise secret. Then, the first device calculates 415 the original packet size s using a reverse function of the function used to calculate s' and reconstructs the original size, for example, by removing bits. The reconstructed packet may now be transmitted 417, for example, to an analysis engine to perform intrusion detection, anomaly detection, or behavior analytics.
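A minimal sketch of this first example is given below, under the assumptions that the function is simple padding (s' = s + x1), that x1 is drawn from Python's random.Random seeded with the noise secret, and that the reverse function removes the same number of bytes; none of these concrete choices are mandated by the embodiments.

```python
import random

def obfuscate_packet(payload, prng):
    x1 = prng.randint(1, 32)      # step 403: obtain x1 from the PRNG
    return payload + bytes(x1)    # steps 405/407: s' = s + x1 (pad with x1 zero bytes)

def reconstruct_packet(obfuscated, prng):
    x1 = prng.randint(1, 32)      # step 413: the same PRNG state yields the same x1
    return obfuscated[:-x1]       # step 415: the reverse function removes the padding

noise_secret_seed = 1234
tx_prng = random.Random(noise_secret_seed)   # at the obfuscating (second) device
rx_prng = random.Random(noise_secret_seed)   # at the first device
original = b"application data"
assert reconstruct_packet(obfuscate_packet(original, tx_prng), rx_prng) == original
```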
Figure 5a and Figure 5b show a second simplified example of obfuscation technique and corresponding de-obfuscation technique according to embodiments. The second example of obfuscation technique alters traffic pattern statistics by generating dummy packets according to a sequence based on the noise secret to fill gaps between subsequently transmitted non-dummy packets carrying application data.
As shown in Figure 5a, the obfuscation technique comprises the step of receiving 501 a datagram from an upper layer of the protocol stack. Then, the obfuscation technique comprises the step of obtaining 503 a random number x2 generated for example by a PRNG using the noise secret. x2 is the number of dummy packets that will be transmitted after the packet comprising the datagram. Finally, the obfuscation technique comprises the steps of transmitting 505 the packet comprising the datagram and then transmitting 507 x2 dummy packets.
As shown in Figure 5b, a first device receives 511 a packet. Supposing the first device is an authorized node in possession of the noise secret, obtained according to one of the embodiments, the first device obtains 513 the random number x2 generated by the PRNG using the noise secret. Then the first device uses the knowledge of x2 and a correct order of packets to identify 515 if the received packet is a dummy packet or not and drop a packet if it is identified as dummy. If the packets of the data stream are not received in the correct order, the first device orders the packets.
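The following simplified sketch illustrates the second example under the assumptions that both endpoints draw x2 from the same seeded PRNG and that packets are processed in the correct order; the packet contents and the dummy marker are placeholders chosen for the example.

```python
import random

DUMMY = b"DUMMY"   # placeholder payload for dummy packets

def obfuscate(packets, seed):
    prng, out = random.Random(seed), []
    for pkt in packets:
        out.append(pkt)
        x2 = prng.randint(0, 3)                  # step 503: number of dummies after this packet
        out.extend(DUMMY for _ in range(x2))     # step 507: transmit x2 dummy packets
    return out

def deobfuscate(stream, seed):
    prng, out, to_skip = random.Random(seed), [], 0
    for pkt in stream:
        if to_skip:                              # step 515: drop packets identified as dummies
            to_skip -= 1
            continue
        out.append(pkt)
        to_skip = prng.randint(0, 3)             # same PRNG, same number of expected dummies
    return out

data = [b"pkt1", b"pkt2", b"pkt3"]
assert deobfuscate(obfuscate(data, seed=99), seed=99) == data
```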
Figure 6a and Figure 6b show a third example of obfuscation technique and corresponding deobfuscation technique according to embodiments. The third example of obfuscation technique alters traffic pattern statistics by generating dummy packets to fill gaps between subsequently transmitted legitimate packets and encoding in the dummy packets an indication based on a noise secret to identify the dummy packet.
As shown in Figure 6a, the obfuscation technique comprises the step of encoding 603 an indication to identify the dummy packet in a field of a payload of an Encapsulating Security Payload (ESP) packet, i.e., the Initialization Vector (IV) in the example of Figure 6a. The IV field is usually not encrypted. The obfuscation technique comprises determining a value of the IV for the dummy packets, generated by using the noise secret. In contrast, the IV field of a legitimate packet is determined 601 according to the cryptographic scheme used. Then, both legitimate packets and dummy packets are transmitted 605. Then the technique may comprise the step of going to a sleep state 607 until a new datagram is received.
As shown in Figure 6b, a first device receives 611 an ESP packet. Supposing the first device is an authorized node in possession of the noise secret, obtained according to one of the embodiments, the first device identifies dummy packets based on the noise secret and the IV field in the ESP packet, and discards 615 the dummy packets. Packets that have not been discarded may be transmitted 613, for example, to an analysis engine to perform intrusion detection. In contrast, a legitimate receiver of the traffic will not use the noise secret to discern between legitimate packets and dummy packets. For example, the legitimate receiver identifies a dummy packet after decrypting the received packet. If the packet has the number "59" in the Next Header field, the legitimate receiver will discard the packet, since "59" is an indication of "no next header", i.e., the payload should be discarded.
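As a hedged sketch of the third example, the indication encoded in a dummy packet could, for instance, be an HMAC of a per-packet sequence number computed with the noise secret and truncated to the IV length; the use of HMAC-SHA256, the 8-byte IV length, and the omission of all other ESP processing are assumptions made only for illustration.

```python
import hashlib
import hmac

IV_LEN = 8   # assumed IV length in bytes

def dummy_iv(noise_secret: bytes, seq: int) -> bytes:
    # IV value encoded into a dummy packet, derived from the noise secret (step 603).
    return hmac.new(noise_secret, seq.to_bytes(4, "big"), hashlib.sha256).digest()[:IV_LEN]

def is_dummy(noise_secret: bytes, seq: int, iv: bytes) -> bool:
    # The first device recomputes the expected dummy IV and compares (Figure 6b).
    return hmac.compare_digest(iv, dummy_iv(noise_secret, seq))

secret = b"shared-noise-secret"
legit_iv = b"\x11" * IV_LEN            # IV chosen by the normal cryptographic scheme
fake_iv = dummy_iv(secret, 7)          # IV encoded into a dummy packet with sequence 7
assert is_dummy(secret, 7, fake_iv) and not is_dummy(secret, 7, legit_iv)
```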
A first example scenario in which the invention may be practiced is a distributed system in a not fully trusted cloud environment, e.g., addressing network side-channel information leaks while maintaining observability. Figure 7a shows a distributed system comprising two applications 703, 713, Application1 and Application2, and corresponding masquerading modules 705, 715 running on two virtual machines (VMs) 701, 711, VM1 and VM2, of a cloud infrastructure. To conceal the traffic patterns between the applications, techniques according to the proposed invention are used. A first masquerading module 705 obfuscates the data stream generated by Application1 703 by using a noise secret NS1 707; a second masquerading module 715 obfuscates the data stream generated by Application2 713 by using a noise secret NS2 717. The noise secrets NS1 and NS2 are transmitted to a trusted analysis node 721 in a secure way. Observability methods provided by a cloud provider enable data stream capture 731. The captured data stream is de-obfuscated 723 by the trusted analysis node 721 according to embodiments and is directly analyzed, or stored for later analysis.
A second example scenario in which the invention may be practiced is a communication system wherein the trusted analysis node is a Cloud Access Security Broker (CASB) 751, as shown in Figure 7b. The CASB observes an encrypted communication between an end-user or IoT device 741 and a cloud application 763, Application1, running on a VM 761. Statistical characteristics of a data stream transmitted by Application1 763 have been obfuscated according to embodiments by a second device 765 running in the VM 761, called a masquerading module in Figure 7b, using noise secret NS2 767. Statistical characteristics of a data stream transmitted by the end-user application or the IoT device 741 have been obfuscated according to embodiments by a second device 745, called a masquerading module in Figure 7b, using noise secret NS1 743. The CASB is monitoring the communication to identify patterns in the traffic indicative of abnormal behavior, by de-obfuscating 753 the traffic using the noise secrets NS1 and NS2. The CASB, however, may not inspect the actual plaintext exchanged in the communication since it is end-to-end encrypted.
Figure 8 is a block diagram illustrating one embodiment of a first device 104, comprising a processor 801, a computer program product 805 in the form of a computer readable storage medium 806 in the form of a memory 802 and communication circuitry 803.
The memory, 802, contains instructions executable by the processor, 801, such that the first device 104, in one embodiment is operative to obtain a noise secret used for obfuscating at least some statistical characteristics of a received data stream. The first device 104 is operative to obtain a representation of at least part of the received data stream. The first device 104 is further operative to reconstruct by using the noise secret, at least some of the obfuscated statistical characteristics, and provide information indicative of the reconstructed statistical characteristics for performing anomaly detection.
If packets in the received data stream are obfuscated by generating dummy packets using the noise secret, in an embodiment, the memory, 802, contains instructions executable by the processor, 801, such that the first device 104 is operative to identify the dummy packets by decoding the information using the noise secret; and discard the dummy packets.
In an embodiment, the first device 104 may be operative to re-order packets of the at least part of the received data stream; identify the dummy packets by using the noise secret and order of the packets; and discarding the dummy packets.
Further, the first device 104 may be operative to generate the noise secret; and transmit the noise secret to a second device 102 for obfuscating a data stream.
Preferably, the first device 104 is further operative to decrypt the encrypted noise secret, if the obtained noise secret is encrypted.
The first device 104 may include a processing circuitry (one or more than one processor), 801, coupled to communication circuitry, 803, and to the memory 802. The first device 104 may comprise more than one communication circuitry. For simplicity and brevity only one communication circuitry, 803, has been illustrated in Figure 8. By way of example, the communication circuitry, 803, the processor(s) 801, and the memory 802 may be connected in series as illustrated in Figure 8. Alternatively, these components 803, 801 and 802 may be coupled to an internal bus system of the first device, 104.
The memory 802 may include a Read-Only-Memory, ROM, e.g., a flash ROM, a Random Access Memory, RAM, e.g., a Dynamic RAM, DRAM, or Static RAM, SRAM, a mass storage, e.g., a hard disk or solid state disk, or the like.
The computer program product 805 comprises a computer program 804, which comprises computer program code loadable into the processor 801, wherein the computer program 804 comprises code adapted to cause the first device 104 to perform the steps of the method described herein, when the computer program code is executed by the processor 801. In other words, the computer program 804 may be a software hosted by the first device 104.
It is to be understood that the structures as illustrated in Figure 8 are merely schematic and that the first device 104 may actually include further components which, for the sake of clarity, have not been illustrated, e.g., further interfaces or processors. Also, it is to be understood that the memory, 802, may include further program code for implementing other and/or known functionalities.
It is also to be understood that the first device 104 may be provided as a virtual apparatus. In one embodiment, the first device 104 may be provided in distributed resources, such as in cloud resources. When provided as virtual apparatus, it will be appreciated that the memory, 802, processing circuitry, 801, and communication circuitry, 803, may be provided as functional elements. The functional elements may be distributed in a logical network and not necessarily be directly physically connected. It is also to be understood that the first device 104 may be provided as a single-node device, or as a multi-node system.
Figure 9 schematically illustrates, in terms of a number of functional units, the components of the first device 104 according to an embodiment. The first device 104 comprises a first obtaining unit 901 configured to obtain a noise secret used for obfuscating at least some statistical characteristics of a received data stream. A representation of at least part of the received data stream is obtained by a second obtaining unit 903. A reconstruct unit 905 is configured to reconstruct, by using the noise secret obtained by the first obtaining unit 901, at least some of the obfuscated statistical characteristics of the received data stream. Information indicative of the reconstructed statistical characteristics is provided for performing anomaly detection by a providing unit 907.
Further, if packets in the received data stream are obfuscated by generating dummy packets using the noise secret, an identifying unit 909 is configured to identify the dummy packets by decoding the information using the noise secret. The dummy packets are then discarded by a discarding unit 911.
Further, an ordering unit 913 is configured to re-order packets of the at least part of the received data stream. The identifying unit 909 may be further configured to identify the dummy packets by using the noise secret and order of the packets and the discarding unit 911 may discard the identified dummy packets.
Further, the noise secret may be generated by a generating unit 915. The generated noise secret may be transmitted by a transmitting unit 917 to a second device 102 for obfuscating a data stream.
If the noise secret has been received encrypted, a decrypting unit 919 is configured to decrypt the encrypted noise secret.
In general terms, each functional unit 901-919 may be implemented in hardware or in software. Preferably, one or more or all functional units 901-919 may be implemented by the processor 801, possibly in cooperation with the communications circuitry 803 and the computer readable storage medium 806 in the form of a memory 802. The processor 801 may thus be arranged to fetch instructions, as provided by a functional unit 901-919, from the computer readable storage medium 806 in the form of a memory 802 and to execute these instructions, thereby performing any steps of the first device 104 as disclosed herein.
Figure 10 is a block diagram illustrating one embodiment of a second device 102, comprising a processor 1001, a computer program product 1005 in the form of a computer readable storage medium 1006 in the form of a memory 1002 and communication circuitry 1003.
The memory, 1002, contains instructions executable by the processor, 1001, such that the second device 102, in one embodiment is operative to obtain a noise secret. The second device 102 is further configured to obfuscate at least some statistical characteristics of the data stream using the noise secret. Further, the second device 102 may be operative to generate the noise secret. The second device 102 may be configured to transmit the noise secret to a first device 104.
Preferably, the second device 102 is further operative to decrypt the encrypted noise secret, if the obtained noise secret is encrypted.
The second device 102 may include a processing circuitry (one or more than one processor), 1001, coupled to communication circuitry, 1003, and to the memory 1002. The second device 102 may comprise more than one communication circuitry. For simplicity and brevity only one communication circuitry, 1003, has been illustrated in Figure 10. By way of example, the communication circuitry, 1003, the processor(s) 1001, and the memory 1002 may be connected in series as illustrated in Figure 10. Alternatively, these components 1003, 1001 and 1002 may be coupled to an internal bus system of the second device, 102.
The memory 1002 may include a Read-Only-Memory, ROM, e.g., a flash ROM, a Random Access Memory, RAM, e.g., a Dynamic RAM, DRAM, or Static RAM, SRAM, a mass storage, e.g., a hard disk or solid state disk, or the like.
The computer program product 1005 comprises a computer program 1004, which comprises computer program code loadable into the processor 1001, wherein the computer program 1004 comprises code adapted to cause the second device 102 to perform the steps of the method described herein, when the computer program code is executed by the processor 1001. In other words, the computer program 1004 may be a software hosted by the second device 102.
It is to be understood that the structures as illustrated in Figure 10 are merely schematic and that the second device 102 may actually include further components which, for the sake of clarity, have not been illustrated, e.g., further interfaces or processors. Also, it is to be understood that the memory, 1002, may include further program code for implementing other and/or known functionalities.
It is also to be understood that the second device 102 may be provided as a virtual apparatus. In one embodiment, the second device 102 may be provided in distributed resources, such as in cloud resources. When provided as virtual apparatus, it will be appreciated that the memory, 1002, processing circuitry, 1001, and communication circuitry, 1003, may be provided as functional elements. The functional elements may be distributed in a logical network and not necessarily be directly physically connected. It is also to be understood that the second device 102 may be provided as a single-node device, or as a multi-node system.
Figure 11 schematically illustrates, in terms of a number of functional units, the components of the second device 102 according to an embodiment. The second device 102 comprises a first obtaining unit 1101 configured to obtain a noise secret. The noise secret is used by an obfuscating unit 1103 configured to obfuscate at least some statistical characteristics of the data stream using the noise secret.
Further, the noise secret may be generated by a generating unit 1105.
If the noise secret has been received encrypted, a decrypting unit 1107 is configured to decrypt the encrypted noise secret.
In general terms, each functional unit 1101-1107 may be implemented in hardware or in software. Preferably, one or more or all functional units 1101-1107 may be implemented by the processor 1001, possibly in cooperation with the communications circuitry 1003 and the computer readable storage medium 1006 in the form of a memory 1002. The processor 1001 may thus be arranged to fetch instructions, as provided by a functional unit 1101-1107, from the computer readable storage medium 1006 in the form of a memory 1002 and to execute these instructions, thereby performing any steps of the second device 102 as disclosed herein.

Claims

1. A method performed by a first device for supporting anomaly detection in an obfuscated data stream, the method comprising obtaining (201) a noise secret used for obfuscating at least some statistical characteristics of a received data stream; obtaining (203) a representation of at least part of the received data stream; reconstructing (205), by using the noise secret, at least some of the obfuscated statistical characteristics; providing (207) information indicative of the reconstructed statistical characteristics for performing anomaly detection.
2. The method according to claim 1, wherein the representation is a copy of at least part of the received data stream.
3. The method according to claim 1, wherein the representation is processed data comprising information about the data stream.
4. The method according to any of the preceding claims, wherein packets in the received data stream are obfuscated by generating dummy packets using the noise secret.
5. The method according to claim 4, wherein the generated dummy packets comprise information to identify the dummy packets using the noise secret.
6. The method according to claim 5, further comprising identifying (219) the dummy packets by decoding the information to identify the dummy packet using the noise secret; discarding (221) the dummy packet.
7. The method according to claim 4, further comprising re-ordering (215) packets of the at least part of the received data stream.
8. The method according to any of claims 2-4 or 7, further comprising identifying (217) the dummy packets by using the noise secret and order of the packets; discarding (221) the dummy packets.
9. The method according to any of the preceding claims, wherein payload in the obfuscated data stream is encrypted.
10. The method according to any one of the preceding claims, wherein the noise secret is obtained from a central node.
11. The method according to any one of claims 1-8, wherein the noise secret is hard coded in the first device.
12. The method according to any one of claims 1-9, wherein the noise secret is obtained from a second device (102) for obfuscating a data stream.
13. The method according to any one of claims 1-9, further comprising generating (209) the noise secret;
- transmitting (211) the noise secret to a second device (102) for obfuscating a data stream.
14. The method according to any one of the preceding claims, wherein the obtained noise secret is encrypted.
15. The method according to claim 14, further comprising decrypting (213) the encrypted noise secret.
16. A method performed by a second device for obfuscating a data stream, the method comprising: obtaining (301) a noise secret; obfuscating (303) at least some statistical characteristics of the data stream using the noise secret.
17. The method according to claim 16, wherein the operation of obfuscating comprises altering a size of a packet of the data stream using the noise secret.
18. The method according to claim 16, wherein the operation of obfuscating comprises generating dummy packets using the noise secret.
19. The method according to claim 18, wherein the operation of obfuscating comprises encoding information in a dummy packet for identifying the dummy packet using the noise secret.
20. The method according to any of claims 16-19, wherein payload in the obfuscated data stream is encrypted.
21. The method according to any of claims 16-20, wherein the second device is a source node transmitting the data stream.
22. The method according to any of claims 16-21, wherein the noise secret is obtained from a central node.
23. The method according to any one of claims 16-20, wherein the noise secret is hard coded in the second device.
24. The method according to any one of claims 16-20, wherein the noise secret is obtained from a first device for supporting anomaly detection.
25. The method according to any one of claims 16-21, further comprising generating (305) the noise secret.
26. The method according to any one of claims 16-25, wherein the obtained noise secret is encrypted.
27. The method according to claim 26, further comprising decrypting (307) the encrypted noise secret.
28. A first device (104) for supporting anomaly detection in an obfuscated data stream, the first device comprising a processor (801) and a memory (802), the memory (802) having stored thereon instructions executable by the processor (801), wherein the instructions, when executed by the processor (801), cause the first device (104) to: obtain a noise secret used for obfuscating at least some statistical characteristics of a received data stream; obtain a representation of at least part of the received data stream; reconstruct, by using the noise secret, at least some of the obfuscated statistical characteristics; provide information indicative of the reconstructed statistical characteristics for performing anomaly detection.
29. The first device (104) according to claim 28, wherein the representation is a copy of at least part of the received data stream.
30. The first device (104) according to claim 28, wherein the representation is processed data comprising information about the data stream.
31. The first device (104) according to any of claims 28-30, wherein packets in the received data stream are obfuscated by generating dummy packets using the noise secret.
32. The first device (104) according to claim 31, wherein the generated dummy packets comprise information to identify the dummy packet using the noise secret.
33. The first device (104) according to claim 32, wherein the instructions, when executed by the processor (801), cause the first device to identify the dummy packets by decoding the information to identify the dummy packet using the noise secret; discard the dummy packets.
34. The first device (104) according to claim 31, wherein the instructions, when executed by the processor (801), cause the first device to re-order packets of the at least part of the received data stream.
35. The first device (104) according to any of claims 31 or 34, wherein the instructions, when executed by the processor (801), cause the first device to identify the dummy packets by using the noise secret and order of the packets; discard the dummy packets.
36. The first device (104) according to any of claims 28-35, wherein payload in the obfuscated data stream is encrypted.
37. The first device (104) according to any of claims 28-36, wherein the noise secret is obtained from a central node.
38. The first device (104) according to any of claims 28-36, wherein the noise secret is hard coded in the first device.
39. The first device (104) according to any one of claims 28-36, wherein the noise secret is obtained from a second device (102) for obfuscating a data stream.
40. The first device (104) according to any one of claims 28-36, wherein the instructions, when executed by the processor (801), cause the first device to generate the noise secret; transmit the noise secret to a second device (102) for obfuscating a data stream.
41. The first device (104) according to any one of claims 28-36, wherein the obtained noise secret is encrypted.
42. The first device (104) according to claim 41, wherein the instructions, when executed by the processor (801), cause the first device to decrypt the encrypted noise secret.
43. A second device (102) for obfuscating a data stream, the second device (102) comprising a processor (1001) and a memory, the memory (1002) having stored thereon instructions executable by the processor (1001), wherein the instructions, when executed by the processor (1001), cause the second device (102) to: obtain a noise secret; obfuscate at least some statistical characteristics of the data stream using the noise secret.
44. The second device (102) according to claim 43, wherein the operation of obfuscating comprises altering a size of a packet of the data stream using the noise secret.
45. The second device (102) according to claim 43, wherein the operation of obfuscating comprises generating dummy packets using the noise secret.
46. The second device (102) according to claim 45, wherein the operation of obfuscating comprises encoding information in a dummy packet for identifying the dummy packet using the noise secret.
47. The second device (102) according to any of claims 43-46, wherein payload of the obfuscated data stream is encrypted.
48. The second device (102) according to any of claims 43-47, wherein the second device is a source node transmitting the data stream.
49. The second device (102) according to any of claims 43-48, wherein the noise secret is obtained from a central node.
50. The second device (102) according to any one of claims 43-47, wherein the noise secret is hard coded in the second device.
51. The second device (102) according to any of claims 43-47, wherein the noise secret is obtained from a first device for supporting anomaly detection.
52. The second device (102) according to any of claims 43-48, wherein the instructions, when executed by the processor (1001), cause the second device to generate the noise secret.
53. The second device (102) according to any of claims 43-52, wherein the obtained noise secret is encrypted.
54. The second device (102) according to claim 53, wherein the instructions, when executed by the processor (1001), cause the second device to decrypt the encrypted noise secret.
55. A computer program (804) comprising instructions which, when run in a processing unit on a first device, cause the first device to: obtain a noise secret used for obfuscating at least some statistical characteristics of a received data stream; obtain a representation of at least part of the received data stream; reconstruct, by using the noise secret, at least some of the obfuscated statistical characteristics; provide information indicative of the reconstructed statistical characteristics for performing anomaly detection.
56. A computer program product (805) comprising a computer readable storage medium on which a computer program according to claim 55 is stored.
57. A computer program (1004) comprising instructions which, when run in a processing unit on a second device, cause the second device to: obtain a noise secret; obfuscate at least some statistical characteristics of the data stream using the noise secret.
58. A computer program product (1005) comprising a computer readable storage medium on which a computer program according to claim 57 is stored.
PCT/SE2021/051295 2021-12-20 2021-12-20 Methods and devices for supporting anomaly detection WO2023121521A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/SE2021/051295 WO2023121521A1 (en) 2021-12-20 2021-12-20 Methods and devices for supporting anomaly detection

Publications (1)

Publication Number Publication Date
WO2023121521A1 true WO2023121521A1 (en) 2023-06-29

Family

ID=86903476

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/SE2021/051295 WO2023121521A1 (en) 2021-12-20 2021-12-20 Methods and devices for supporting anomaly detection

Country Status (1)

Country Link
WO (1) WO2023121521A1 (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080089518A1 (en) * 2006-10-12 2008-04-17 Interdigital Technology Corporation Method and system for enhancing cryptographic capabilities of a wireless device using broadcasted random noise
US7376235B2 (en) * 2002-04-30 2008-05-20 Microsoft Corporation Methods and systems for frustrating statistical attacks by injecting pseudo data into a data system
WO2020058619A1 (en) * 2018-09-17 2020-03-26 Commissariat A L'energie Atomique Et Aux Energies Alternatives Confidential method for processing logs of a computer system
US20200195617A1 (en) * 2018-12-18 2020-06-18 Bae Systems Information And Electronic Systems Integration Inc. Securing data in motion

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21969174

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE