AU2021103828A4 - A novel system and auditing technique for cloud based digital forensic readiness with integrity and privacy preservation of health care data - Google Patents


Info

Publication number
AU2021103828A4
Authority
AU
Australia
Prior art keywords
data
proof
cloud
file
challenge
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
AU2021103828A
Inventor
S. P. Amala
J. Indumathi
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual filed Critical Individual
Priority to AU2021103828A priority Critical patent/AU2021103828A4/en
Application granted granted Critical
Publication of AU2021103828A4 publication Critical patent/AU2021103828A4/en
Ceased legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L63/00Network architectures or network communication protocols for network security
    • H04L63/12Applying verification of the received information
    • H04L63/123Applying verification of the received information received data contents, e.g. message integrity
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/60Protecting data
    • G06F21/62Protecting access to data via a platform, e.g. using keys or access control rules
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/60Protecting data
    • G06F21/64Protecting data integrity, e.g. using checksums, certificates or signatures
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L63/00Network architectures or network communication protocols for network security
    • H04L63/04Network architectures or network communication protocols for network security for providing a confidential data exchange among entities communicating through data packet networks
    • H04L63/0428Network architectures or network communication protocols for network security for providing a confidential data exchange among entities communicating through data packet networks wherein the data content is protected, e.g. by encrypting or encapsulating the payload
    • H04L63/0442Network architectures or network communication protocols for network security for providing a confidential data exchange among entities communicating through data packet networks wherein the data content is protected, e.g. by encrypting or encapsulating the payload wherein the sending and receiving network entities apply asymmetric encryption, i.e. different keys for encryption and decryption
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L9/00Cryptographic mechanisms or cryptographic arrangements for secret or secure communications; Network security protocols
    • H04L9/08Key distribution or management, e.g. generation, sharing or updating, of cryptographic keys or passwords
    • H04L9/0816Key establishment, i.e. cryptographic processes or cryptographic protocols whereby a shared secret becomes available to two or more parties, for subsequent use
    • H04L9/0819Key transport or distribution, i.e. key establishment techniques where one party creates or otherwise obtains a secret value, and securely transfers it to the other(s)
    • H04L9/0825Key transport or distribution, i.e. key establishment techniques where one party creates or otherwise obtains a secret value, and securely transfers it to the other(s) using asymmetric-key encryption or public key infrastructure [PKI], e.g. key signature or public key certificates
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/60Protecting data
    • G06F21/62Protecting access to data via a platform, e.g. using keys or access control rules
    • G06F21/6209Protecting access to data via a platform, e.g. using keys or access control rules to a single file or object, e.g. in a secure envelope, encrypted and accessed using a key, or with access control rules appended to the object itself
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L67/00Network arrangements or protocols for supporting network services or applications
    • H04L67/01Protocols
    • H04L67/10Protocols in which an application is distributed across nodes in the network

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Security & Cryptography (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • General Engineering & Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Computing Systems (AREA)
  • Bioethics (AREA)
  • General Health & Medical Sciences (AREA)
  • Software Systems (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Health & Medical Sciences (AREA)
  • Storage Device Security (AREA)

Abstract

The present disclosure relates to a pressing need of the hour: a novel public auditing system for data storage security and a privacy-preserving auditing protocol that can tackle the evolving digital storage technologies, especially the cloud, with Digital Forensic Readiness and effective risk management. The solution is a system and method which offers integrity and security, assists an external auditor to audit a user's outsourced data in the cloud without knowledge of the data content, and offers Cloud based Digital Forensic Readiness so that it can cater to legal challenges. The proposed protocol is used to achieve data hosting with sensitive information hiding in cloud storage, and fully yields the benefits of ECDSA. The approach produces high-level security with smaller parameters than the original ECDSA but with equivalent levels of security. Variant 3 of ECDSA involves two values and matches the same level of computation time. In the basic ECDSA, when the secret k is used to sign two or more messages, the values must be created independently of each other. To be specific, a different secret k should be used for each signature; if not, the private key x can be recovered. If a secure random or pseudo-random number generator is used, the chance of creating a repeated k value is insignificant. Digital Forensic Readiness is seamlessly integrated into the technique so as to proactively manage risk and be digitally forensic ready, and to monitor, plan and fortify before the occurrence of any potential security incidents. (Figure 1 shows the system overview; Figure 2 shows the method as a flowchart.)

Description

Figure 1 (system 100): a cloud server connected to the setup module 102, the proactive cloud data forensic module 104, the data uploading module 106, the challenge module 108, the proof generation module 110 and the signature and proof verification module 112.

Figure 2 (method 200):
202 - Generating a public private key pair, a signature and a tag from an input file, wherein the public key and private key are generated upon successful registration of a user for accessing data.
204 - Performing a plurality of tasks present in the input file: the file is initialised, scrutinized, scanned, monitored, inspected and digitally post-mortemed.
206 - Uploading at least one inspected file upon completion of the plurality of tasks by subdividing the inspected files, wherein the inspected files are encoded and saved in the cloud server.
208 - Choosing a specific block arbitrarily using a pseudo-random permutation for checking the correctness of the at least one inspected file using a challenge module connected to the cloud server.
210 - Running the challenge generation in order to validate a proof of possession, and creating a proof of possession.
212 - Returning the proof of possession to the client, who validates the proof.
A NOVEL SYSTEM AND AUDITING TECHNIQUE FOR CLOUD BASED DIGITAL FORENSIC READINESS WITH INTEGRITY AND PRIVACY PRESERVATION OF HEALTH CARE DATA

FIELD OF THE INVENTION
The present disclosure relates to the confluence of allied fields such as Cloud Computing, Digital Forensic Readiness, Risk Management, Big Data, Data Auditing, Data Integrity and Privacy Preservation of Data. In more detail, the present disclosure relates to the design, development, and implementation of a novel system and auditing technique for Cloud based Digital Forensic Readiness with integrity and privacy preservation of health care data.

BACKGROUND OF THE INVENTION
Digital technology, especially wearable technology, is aggressively sprawling its tentacles into all sorts of health care things (such as smart wearable gadgets, home monitoring devices and tele-health), encompassing all walks of human life, but it also brings many challenges to the colossal health care data being stored. Health care data are of different types, voluminous, and generated at high speed, making them difficult to store on local machines. Subsequently, the demand for rich medical media applications, such as multimedia mail, presentations, high-quality audio and video sharing, and shared documents, has grown by leaps and bounds. In the health care domain, all patient records need to be outsourced to the Cloud for future reference. Cloud computing is speedily metamorphosing the digital world.
It has been documented that cloud computing finds immense use in the medical field and will considerably augment the experience of healthcare users. This most fascinating model of the current era has scaled the walls of managing and delivering services over the Internet. With such swift headway, it becomes critical to understand all aspects of "Cloud Computing" and the fact of remote outsourcing of data. Though remote outsourcing of data to the cloud is attractive, it also raises security issues, some of which are linked to cloud architectures (e.g., untrusted service providers, curious cloud employees) and others to apprehensions such as data privacy, integrity and availability. With increasingly complex cloud attacks, conventional security mechanisms are no longer sufficient to protect cloud databases.
Threats to health data privacy in the cloud include Spoofing Identity (unlawful attempts by other users or machines to pose as valid users or machines), Data Tampering (malicious attempts to modify data contents), Repudiation (denying the obligations of a contract), Denial of Service (DoS) (services are denied to privileged users), and Unlawful Privilege Escalation (unlawful users may obtain access to data and subsequently infiltrate the system, such that data contents are compromised at large scale).
Having comprehensively discussed the numerous issues concerning data security in the Cloud, and spurred by the human instinct of viewing problems as wonderful opportunities to find appropriate technical solutions, this patent aims at a modest step forward. Security and Privacy are like the two eyes: indispensable, yet with conflicts of interest. Several mechanisms and the relevant concepts of preserving privacy exist in the literature.
The diverse methods that have been used from time to time to ensure the integrity of health care data in the Cloud Computing environment are presented here. One such approach enhances the integrity and accountability of Electronic Health Records (EHRs) by enforcing either explicit or implicit patient control over the EHRs, using PKE to encrypt the health records. However, this assumption results in information disclosure in the event of any malicious activity by the issuers.
Another method to safeguard health data deploys an independent third party to maintain health data integrity, with SKE-encrypted data uploaded. The data is encrypted by the patients using SKE under the patients' private keys. Homomorphic verifiable tags ensure the calculations made on the encrypted data. A Diffie-Hellman based key exchange strategy is used to securely exchange the keys. A performance issue arises from the large cipher-texts that result from homomorphic encryption. Homomorphic encryption was also used with IBE to preserve patients' privacy in the context of a Mobile Health Monitoring System (MHMS).
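The Diffie-Hellman exchange mentioned above can be sketched as follows. This is the textbook exchange over a toy modulus (a Mersenne prime, chosen only for brevity); the cited scheme's actual group parameters are not given in the source, and real deployments would use a vetted group of far larger size.

```python
# Toy Diffie-Hellman key exchange: each party combines its own secret
# exponent with the other party's public value; both arrive at the
# same shared secret without it ever crossing the wire.
import secrets

p = 2**127 - 1          # toy prime modulus (Mersenne prime M127) - NOT a vetted DH group
g = 3                   # illustrative generator

a = secrets.randbelow(p - 2) + 2   # patient's private exponent
b = secrets.randbelow(p - 2) + 2   # third party's private exponent

A = pow(g, a, p)        # public values, exchanged in the clear
B = pow(g, b, p)

shared_patient = pow(B, a, p)      # (g^b)^a
shared_third_party = pow(A, b, p)  # (g^a)^b
assert shared_patient == shared_third_party
```

An eavesdropper sees only p, g, A and B; recovering the shared secret requires solving a discrete logarithm, which is what makes the exchanged keys usable for the SKE encryption described above.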
Another method uses cryptography and statistical analysis to offer multi-level privacy. The vertical partition of medical datasets is offered different levels of security like symmetric encryption (for the identifying attributes of the EMR) and as plain text (for clinical data and treatments' history). The link to information by the adversary is problematic as the data is partitioned and only the recipients with appropriate authentication can unify the partitions through the decryption keys and quasi-identifiers. A limitation is that the data recipient, can act malevolently and reveal the information that can help linking the portions of patients' medical records.
A secure and scalable Cloud-based Architecture has been proposed for medical wireless networks and it offers integrity of the outsourced medical data and a fine-grained access control is implemented through the CP-ABE based construction. This approach has to come out of the management issues arising due to the common policy changes, predominantly in the case of access revocation.
The other strategies used to maintain the integrity of health care data in a cloud environment used PKE and digital signatures; whereas, the Hierarchical Predicate Encryption (HPE), ABE have also been used and policy-based authorization methodology has been proposed to help maintain data integrity.
In one prior solution (CN102656589B), a digitally entrusted data-management model is disclosed, in which services at a remote site or cloud service expose data only through selective access to obfuscated data, thereby distributing trust across multiple entities and avoiding single-point data compromise. In another prior art solution (CN102611749B), a cloud-storage data safety auditing method is disclosed, which includes four steps: (1) generating secret keys (KeyGen); (2) tagging information blocks (TagBlock); (3) generating an authentication value (GenProof); and (4) checking the authentication value (CheckProof). In another prior art solution (US20210020041A1), an apparatus comprises a processor to: identify a workload comprising a plurality of tasks; generate a workload graph based on the workload, wherein the workload graph comprises information associated with the plurality of tasks; and identify a device connectivity graph. The invention (CN103281301B) relates to a system for judging a cloud safety malicious program.
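The four-step auditing flow of the second prior art solution (KeyGen, TagBlock, GenProof, CheckProof) can be illustrated with a minimal sketch. HMAC tags stand in here for the scheme's actual homomorphic verifiable tags, and all names are illustrative, not taken from the cited patent.

```python
# Minimal provable-data-possession-style sketch of the four steps:
# the client tags each block under a secret key before upload, then
# later challenges the server to return blocks whose tags it can
# recompute and verify.
import hashlib
import hmac
import secrets

def key_gen() -> bytes:
    """(1) KeyGen: the client's secret tagging key."""
    return secrets.token_bytes(32)

def tag_block(key: bytes, index: int, block: bytes) -> bytes:
    """(2) TagBlock: bind each block to its position under the key."""
    return hmac.new(key, index.to_bytes(8, "big") + block, hashlib.sha256).digest()

def gen_proof(blocks, tags, challenge):
    """(3) GenProof: the server returns the challenged blocks with their tags."""
    return [(i, blocks[i], tags[i]) for i in challenge]

def check_proof(key: bytes, proof) -> bool:
    """(4) CheckProof: the client recomputes and compares each tag."""
    return all(hmac.compare_digest(tag_block(key, i, blk), tag)
               for i, blk, tag in proof)

# usage
key = key_gen()
blocks = [b"record-%d" % i for i in range(10)]
tags = [tag_block(key, i, blk) for i, blk in enumerate(blocks)]
challenge = [2, 5, 7]                       # blocks picked by the verifier
assert check_proof(key, gen_proof(blocks, tags, challenge))
```

Unlike homomorphic tags, this HMAC variant requires the server to return the blocks themselves; the cited scheme's tags would let the server aggregate a short proof instead.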
Some of the limitations from the past utility of cloud computing in health care literature include risks arising from many factors, such as absence of good specialists, absence of standardization, availability and control, availability and reliability (which lead to business risks: downtime, improper handling of data, or information leaks), HIPAA compliance, limited control, limited functionality, outages, and security dangers. The applications of the cloud in healthcare depend on proper safeguards. Poor medical provision and security breaches are on the increase, leading to numerous crimes; hence, in order to be prepared, there is an escalating need for cloud forensics readiness. Cloud computing in health care is still in its preliminary stages. The issues to be taken care of in cloud computing adoption for healthcare (data format, database management, efficiency, energy consumption, feasibility evaluation, illiteracy of patients, interoperability, lack of resources, legal and ethical issues, privacy, user-friendly and less complex QoS and QoE applications, reliability, security from various attacks, transparency, accountability, and nonexistence of trust along with absence of regulations) exhibit the rationale for cloud forensics readiness.
Some of the limitations arise from the legal challenges faced by cloud computing, such as insufficient support, lack of jurisdiction, lack of legal process, legitimation, preserving users' and victims' privacy, responsibility, and security issues.
Some of the limitations from past digital forensics literature include: there is no assurance that the prevailing forensic tools can cope with a cloud environment that is distributed and elastic; tools and procedures to investigate virtualized environments, particularly at the hypervisor level, are the need of the hour; and there are no pertinent forensic alert tools for the CSP and the clients to gather forensic data.
Cloud computing forensic science challenges include Architecture (multi-tenancy, data segregation, provenance), Analysis (of metadata, metadata logs), Anti-forensics, Data Collection (data integrity, data recovery), Incident First Responders, Legal (ethics, privacy, contract/SLA, jurisdiction), Role Management, and Standards (no single process, interoperability).
Rationale for Cloud Forensics Readiness: a proactive approach is needed that unswervingly and incessantly monitors the movement and storage of data or information within the cloud. This tactic also prepares organizations to be forensically ready before latent security incidents occur. Where an event has already occurred, there arises the necessity to explore and conduct an analysis of evidence to reveal what happened, or the root cause of the problem. This kind of fact-finding assignment is realized only through digital forensic readiness (DFR). Digital Forensic Readiness is the capability of an organization to proactively take full advantage of its probable use of electronically stored information (ESI).
The absence of suitable Cloud Forensic Readiness (CFR) increases the cost of investigation and invites sanctions from courts or regulatory establishments for not being able to accumulate digital evidence in a forensically complete way. The rationality and consistency of cloud forensic science is vital in this new context and requires new procedures for identifying, collecting, preserving, and analyzing evidence in multi-tenant cloud milieus that offer quick provisioning, universal elasticity, and broad network accessibility. This unification is indispensable to all stakeholders, such as law enforcement, judicial systems, end users and doctors, and provides competences for security incident response and internal investigative operations. Cloud Forensic Readiness (CFR) ensures that uploaded or stored health care data can be kept in a state where it would be admissible in a court of law as digital evidence, and that the organization is proactively ready in the event of any untoward incident.
In order to overcome the above-mentioned drawbacks, there is a need to develop a system and a method for preserving data integrity and privacy in cloud storage with digital forensic readiness.

SUMMARY OF THE INVENTION
The present disclosure relates to a pressing need of the hour: a novel public auditing system for data storage security and a privacy-preserving auditing protocol that can tackle the evolving digital storage technologies, especially the cloud, with Digital Forensic Readiness and effective risk management. The solution is a system and method which offers integrity and security, assists an external auditor to audit a user's outsourced data in the cloud without knowledge of the data content, and offers Cloud based Digital Forensic Readiness so that it can cater to legal challenges (such as forensic inquiry in the Cloud environment, by providing sufficient support, legitimation, preservation of users' and victims' privacy, and responsibility), as the evidences picked up in the procedure are digitally signed; this signature can be checked later for validity.
The proposed protocol is used to achieve data hosting with sensitive information hiding in cloud storage, and fully yields the benefits of ECDSA, viz., increased security, no need to update the host with secrets in the field, a double secret key, and less computation. These approaches have been implemented, verified and validated on the SPC-PDP framework. The proposed secure, privacy-preserved data sharing infrastructure also offers several features: it is scalable, decentralized (to evade a single point of failure), robust and highly accessible, with a control mechanism for data ownership and sharing, a good structure for apportioning data sharing costs, directories and token vaults, privilege-based access, and the facility to run refined data analytics and machine learning. The approach produces high-level security with smaller parameters than the original ECDSA but with equivalent levels of security. These benefits are especially important in environments where processing power, storage space, bandwidth, or power consumption is constrained. Variant 3 of ECDSA involves two values and matches the same level of computation time. In the basic ECDSA, when the secret k is used to sign two or more messages, the values must be created independently of each other. To be specific, a different secret k should be used for each signature; if not, the private key x can be recovered. If a secure random or pseudo-random number generator is used, the chance of creating a repeated k value is insignificant.
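The nonce-reuse weakness described above can be shown purely algebraically. The sketch below works with the ECDSA signing equation s = k^(-1)(z + x*r) mod n over the (prime) group order of NIST P-256, using toy values for x, k and r; no actual curve arithmetic is performed, since the leak is already visible in the modular algebra.

```python
# Two signatures (r, s1) and (r, s2) made with the SAME nonce k share
# the same r; an attacker who sees both, plus the message hashes z1 and
# z2, can solve for k and then for the private key x.
n = 0xFFFFFFFF00000000FFFFFFFFFFFFFFFFBCE6FAADA7179E84F3B9CAC2FC632551  # order of P-256 (prime)

x = 0xC0FFEE          # toy "private key"
k = 0xBADC0DE         # nonce, wrongly reused for both messages
r = pow(7, k, n)      # stand-in for the x-coordinate of k*G (any nonzero value works for the algebra)
z1, z2 = 0x1111, 0x2222   # toy message hashes

inv = lambda a: pow(a, -1, n)         # modular inverse (n is prime)
s1 = (inv(k) * (z1 + x * r)) % n      # signing equation, message 1
s2 = (inv(k) * (z2 + x * r)) % n      # signing equation, message 2

# Attack: s1 - s2 = k^(-1)(z1 - z2), so k and then x fall out directly.
k_recovered = ((z1 - z2) * inv(s1 - s2)) % n
x_recovered = ((s1 * k_recovered - z1) * inv(r)) % n
assert (k_recovered, x_recovered) == (k, x)
```

This is exactly why the text insists that a fresh k be drawn from a secure random or pseudo-random generator for every signature.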
This invention also embeds digital forensic readiness into the technique in order to be digitally forensic ready and to monitor, plan and prepare proactively before the occurrence of any potential security incidents. Moreover, the proposed combined solution ensures that the stakeholders are prepared to face a court of law at some point. Each digital forensics work or study must guarantee that the digital data is copied perfectly, without any piece missing; that digital data authenticity is conserved with apt cryptographic algorithms; that the circle of accountability is ensured; that the Chain of Custody (CoC) sustains integrity throughout the evidence's life cycle; and that the actions taken by people through the diverse fact-finding phases are chronicled. This ensures that the gap between criminal activity and technology is minimised, so as to reduce the potential for clashes and lessen the probability of such disputes.
In an embodiment, a system 100 for preserving data integrity with privacy and DFR in a cloud server comprises: a setup module 102 for generating a public private key pair, a signature and a tag from an input file, wherein the public key and private key are generated upon successful registration of a user for accessing data; a proactive cloud data forensic module 104 connected to the setup module, wherein the input file is initialised, scrutinized, scanned, monitored, inspected and digitally post-mortemed for performing the plurality of tasks; a data uploading module 106 connected to the proactive cloud data forensic module for uploading at least one inspected file upon completion of the plurality of tasks by subdividing the inspected files, wherein the inspected files are encoded and saved in the cloud server; a challenge module 108 connected to the cloud server for choosing a specific block arbitrarily using a pseudo-random permutation for checking the correctness of the stored data by auditing; a proof generation module 110 connected to the cloud server, which runs the challenge generation in order to validate a proof of possession and creates a proof of possession; and a signature and proof verification module 112 connected to the challenge module for returning the proof of possession to the client, who validates the proof.
In an embodiment, a method 200 for preserving data integrity with privacy and DFR in a cloud server comprises the following steps: at step 202, generating a public private key pair, a signature and a tag from an input file using a setup module, wherein the public key and private key are generated upon successful registration of a user for accessing data; at step 204, performing a plurality of tasks present in the input file using a proactive cloud data forensic module connected to the setup module, wherein the input file is initialised, scrutinized, scanned, monitored, inspected and digitally post-mortemed; at step 206, uploading at least one inspected file upon completion of the plurality of tasks by subdividing the inspected files using a data uploading module connected to the proactive cloud data forensic module, wherein the inspected files are encoded and saved in the cloud server; at step 208, choosing a specific block arbitrarily using a pseudo-random permutation for checking the correctness of the stored data by auditing, using a challenge module connected to the cloud server; at step 210, running the challenge generation in order to validate a proof of possession and creating a proof of possession using a proof generation module connected to the cloud server; and at step 212, returning the proof of possession to the client, who validates the proof using a signature and proof verification module connected to the challenge module.
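One hypothetical way to realize step 208's arbitrary block choice via a pseudo-random permutation is to rank the block indices by a keyed PRF, so that both the challenger and the cloud server derive the same unpredictable challenge set without transmitting it. This is an assumption for illustration only, not the patent's actual construction; all names are made up.

```python
# Keyed pseudo-random selection of challenge blocks: sorting indices by
# an HMAC value keyed on (audit key, round nonce) yields a pseudo-random
# permutation of the index space; the first n_challenge entries are the
# blocks the server must prove it still holds.
import hashlib
import hmac

def challenge_indices(key: bytes, nonce: bytes, n_blocks: int, n_challenge: int):
    """Return n_challenge distinct block indices, pseudo-randomly permuted."""
    prf = lambda i: hmac.new(key, nonce + i.to_bytes(8, "big"), hashlib.sha256).digest()
    return sorted(range(n_blocks), key=prf)[:n_challenge]

# usage: the same (key, nonce) always yields the same selection, but a
# server that cannot predict the nonce cannot guess which blocks to keep.
idx = challenge_indices(b"audit-key", b"epoch-42", n_blocks=100, n_challenge=5)
assert len(set(idx)) == 5 and all(0 <= i < 100 for i in idx)
```

A fresh nonce per audit round forces the server to retain every block, since any of them may be challenged next time.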
To further clarify advantages and features of the present disclosure, a more particular description of the invention will be rendered by reference to specific embodiments thereof, which is illustrated in the appended drawings. It is appreciated that these drawings depict only typical embodiments of the invention and are therefore not to be considered limiting of its scope. The invention will be described and explained with additional specificity and detail with the accompanying drawings.
BRIEF DESCRIPTION OF FIGURES
These and other features, aspects, and advantages of the present disclosure will become better understood when the following detailed description is read with reference to the accompanying drawings in which like characters represent like parts throughout the drawings, wherein:
Figure 1 illustrates a system for preserving data integrity in cloud storage for digital forensic readiness in accordance with an embodiment of the present disclosure. Figure 2 illustrates a method for preserving data integrity in cloud storage for digital forensic readiness for health care data in accordance with an embodiment of the present disclosure. Figure 3 depicts the generic system model in accordance with an embodiment of the present disclosure. Figure 4 depicts the initial setup phase, whose activities are User Key Generation, Signature Generation, Tag Generation and Privacy Conservation. Figure 5 depicts the tasks to be completed by the Signature Generation approach, wherein the user produces verification metadata, which includes IAC, signatures or other linked information. Figure 6 illustrates a flow diagram in accordance with an embodiment of the present disclosure. Figure 7 illustrates (a) data volume (number of records) versus efficiency; (b) attributes versus efficiency; (c) data volume versus running time; and (d) data volume versus reliability, in accordance with an embodiment of the present disclosure. Figure 8 illustrates (a) time taken for key generation (in ms); (b) time taken for signature generation (in ms); (c) time taken for signature generation (in ms); and (d) time taken for verification (in ms) for the ECGDSA approach, in accordance with an embodiment of the present disclosure.
Further, skilled artisans will appreciate that elements in the drawings are illustrated for simplicity and may not necessarily have been drawn to scale. For example, the flow charts illustrate the method in terms of the most prominent steps involved, to help improve understanding of aspects of the present disclosure. Furthermore, in terms of the construction of the device, one or more components of the device may have been represented in the drawings by conventional symbols, and the drawings may show only those specific details that are pertinent to understanding the embodiments of the present disclosure, so as not to obscure the drawings with details that will be readily apparent to those of ordinary skill in the art having the benefit of the description herein.
DETAILED DESCRIPTION
For the purpose of promoting an understanding of the principles of the invention, reference will now be made to the embodiment illustrated in the drawings and specific language will be used to describe the same. It will nevertheless be understood that no limitation of the scope of the invention is thereby intended, such alterations and further modifications in the illustrated system, and such further applications of the principles of the invention as illustrated therein being contemplated as would normally occur to one skilled in the art to which the invention relates.
It will be understood by those skilled in the art that the foregoing general description and the following detailed description are exemplary and explanatory of the invention and are not intended to be restrictive thereof.
Reference throughout this specification to "an aspect", "another aspect" or similar language means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the present disclosure. Thus, appearances of the phrases "in an embodiment", "in another embodiment" and similar language throughout this specification may, but do not necessarily, all refer to the same embodiment.
The terms "comprises", "comprising", or any other variations thereof, are intended to cover a non-exclusive inclusion, such that a process or method that comprises a list of steps does not include only those steps but may include other steps not expressly listed or inherent to such process or method. Similarly, one or more devices or sub-systems or elements or structures or components proceeded by "comprises...a" does not, without more constraints, preclude the existence of other devices or other sub-systems or other elements or other structures or other components or additional devices or additional sub-systems or additional elements or additional structures or additional components.
Unless otherwise defined, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs. The system, methods, and examples provided herein are illustrative only and not intended to be limiting.
Embodiments of the present disclosure will be described below in detail with reference to the accompanying drawings.
Referring to Figure 1, a system for preserving data integrity in cloud storage for digital forensic readiness is illustrated in accordance with an embodiment of the present disclosure. A system 100 for preserving data in a cloud server comprises: a setup module 102 for generating a public-private key pair, a signature and a tag from an input file, wherein the public key and private key are generated upon successful registration of a user for accessing data; a proactive cloud data forensic module 104 connected to the setup module, wherein the input file is initialized, scrutinized, scanned, monitored, inspected and digitally post-mortemed for performing the plurality of tasks; a data uploading module 106 connected to the proactive cloud data forensic module for uploading at least an inspected file upon completion of the plurality of tasks by subdividing the inspected files, wherein the inspected files are encoded and saved in the cloud server; a challenge module 108 connected to the cloud server for choosing a specific block arbitrarily using a pseudo-random permutation to check the correctness of the stored data by auditing; a proof generation module 110 connected to the cloud server, which runs the challenge generation in order to validate a proof of possession and creates a proof of possession; and a signature and proof verification module 112 connected to the challenge module for returning the proof of possession to the client, who validates the proof.
Figure 2 illustrates a method 200 for preserving data integrity with privacy and DFR in a cloud server in accordance with an embodiment of the present disclosure. In the embodiment, the method 200 comprises the following steps: at step 202, generating a public-private key pair, a signature and a tag from an input file using a setup module, wherein the public key and private key are generated upon successful registration of a user for accessing data; at step 204, performing a plurality of tasks present in the input file using a proactive cloud data forensic module connected to the setup module, wherein the input file is initialized, scrutinized, scanned, monitored, inspected and digitally post-mortemed; at step 206, uploading at least an inspected file upon completion of the plurality of tasks by subdividing the inspected files using a data uploading module connected to the proactive cloud data forensic module, wherein the inspected files are encoded and saved in the cloud server; at step 208, choosing a specific block arbitrarily using a pseudo-random permutation for checking the correctness of the stored data by auditing using a challenge module connected to the cloud server; at step 210, running the challenge generation in order to validate a proof of possession and creating the proof of possession using a proof generation module connected to the cloud server; and at step 212, returning the proof of possession to the client, who validates the proof using a signature and proof verification module connected to the challenge module.
Figure 3 depicts the generic system model in accordance with an embodiment of the present disclosure. Within this system model, our public cloud auditing scheme works as below:
• The raw health care data is blinded by the user. The blinded health care data is sent to the TPA and CSP. The user then erases the raw health care data and the blinded health care data from the local machine to save storage space.
• The TPA receives the blinded health care data. It creates an authentication meta-set for the health care data and drives it to the CSP.
• The CSP authorizes the suitability and accuracy of the authentication meta-set. The blinded health care data (sent by the user) and the authentication meta-set are sent by the TPA.
• On demand, the user sends a request to the TPA to verify the health care data integrity. The TPA then sends an evaluating challenge to the cloud. The CSP replies with an examining proof. Based on the proof, the TPA verifies the integrity of the health care data.
Figure 4 depicts the setup phase wherein the client C, the owner of the file (F), generates the public and private key pair. The client C tags the input file and then uploads the tagged input file to the cloud storage, eradicating it from native storage. The input data/file F is separated into n blocks, and the selected tag (metadata) for each block is calculated by means of a distinctive formula.
The major computation activities in the initial setup phase are User Key Generation, Signature Generation, Tag Generation and Privacy Conservation.
The domain parameters for ECDSA consist of a suitably chosen elliptic curve E defined over a finite field Fp of characteristic p, and a base point G ∈ Ep(a, b) with order n.
Figure 5 depicts the tasks to be completed by the Signature-Generation approach:
• the user produces verification metadata, and
• it then includes MAC, signatures or other linked information.
Figure 6 illustrates a flow diagram in accordance with an embodiment of the present disclosure. The raw data is blinded by the user. The blinded data is sent to the TPA and CSP. The user then erases the raw data and the blinded data from the local machine to save storage space. The TPA receives the blinded data, creates an authentication meta-set for the data and drives it to the CSP. The CSP authorizes the suitability and accuracy of the authentication meta-set; the blinded data (sent by the user) and the authentication meta-set are sent by the TPA. On demand, the user sends a request to the TPA to verify the data integrity. The TPA then sends an evaluating challenge to the cloud. The CSP replies with an examining proof. Based on the proof, the TPA verifies the integrity of the data.
Let us see how private keys can be recovered if secrets are reiterated, assuming that the same secret k was used to generate ECDSA signatures (r, s1) and (r, s2) on two different messages m1 and m2. Then

s1 = k^(-1)·(H(m1) + x·r) (mod n)    (1)
s2 = k^(-1)·(H(m2) + x·r) (mod n),   (2)

where H(m1) = SHA-1(m1) and H(m2) = SHA-1(m2). Then

k·s1 = H(m1) + x·r (mod n)    (3)
k·s2 = H(m2) + x·r (mod n).   (4)

Subtracting equation (4) from equation (3) gives

k·(s1 − s2) = H(m1) − H(m2) (mod n).    (5)

If s1 ≢ s2 (mod n), which occurs with overwhelming probability, then

k = (H(m1) − H(m2))·(s1 − s2)^(-1) (mod n).

Consequently, an antagonist can determine k, and then use this to recover x.

Variant 3 of ECDSA uses the same secrets k1, k2 to generate ECDSA signatures (r1, s1) and (r1, s2) on two different messages m1 and m2:

s1 = k1^(-1)·(H(m1)·k2 + x·(r1 + r2)) (mod n)    (6)
s2 = k1^(-1)·(H(m2)·k2 + x·(r1 + r2)) (mod n),   (7)

where H(m1) = SHA-1(m1) and H(m2) = SHA-1(m2). Then

k1·s1 = H(m1)·k2 + x·(r1 + r2) (mod n)    (8)
k1·s2 = H(m2)·k2 + x·(r1 + r2) (mod n).   (9)

Subtracting equation (9) from equation (8) gives

k1·(s1 − s2) = (H(m1) − H(m2))·k2 (mod n).    (10)

Even if s1 ≢ s2 (mod n), we obtain only the relation

k1·(s1 − s2) = (H(m1) − H(m2))·k2 (mod n).

Let us prove the security of Variant 3 by comparing equation (5) and equation (10):

k·(s1 − s2) = H(m1) − H(m2) (mod n)    (5)
k1·(s1 − s2) = (H(m1) − H(m2))·k2 (mod n).    (10)

Unlike equation (5), equation (10) contains two unknown secrets k1 and k2, so neither can be determined from it, and it cannot be used to recover x. Therefore, the Variant 3 ECDSA scheme is more secure.
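The key-recovery algebra in equations (1) through (5) can be checked numerically. Only modular arithmetic over the group order n is needed, so the sketch below omits the curve: r stands in for the x-coordinate of kG, and n is the secp256k1 group order, used here merely as a convenient 256-bit prime.

```python
import secrets

# n is a standard 256-bit prime group order (secp256k1), chosen only
# as a convenient modulus for illustrating the algebra.
n = 0xFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFEBAAEDCE6AF48A03BBFD25E8CD0364141

def sign_with_nonce(x, k, r, h):
    # s = k^(-1) * (H(m) + x*r) mod n, as in equations (1) and (2)
    return pow(k, -1, n) * (h + x * r) % n

def recover_private_key(r, s1, s2, h1, h2):
    # k = (H(m1) - H(m2)) * (s1 - s2)^(-1) mod n, from equation (5)
    k = (h1 - h2) * pow(s1 - s2, -1, n) % n
    # rearranging equation (3): x = (k*s1 - H(m1)) * r^(-1) mod n
    return (k * s1 - h1) * pow(r, -1, n) % n

x = secrets.randbelow(n - 1) + 1    # victim's private key
k = secrets.randbelow(n - 1) + 1    # secret nonce, reused twice
r = secrets.randbelow(n - 1) + 1    # stand-in for (kG).x mod n
h1, h2 = 0x1234, 0x5678             # hashes H(m1), H(m2) as integers
s1 = sign_with_nonce(x, k, r, h1)
s2 = sign_with_nonce(x, k, r, h2)
assert recover_private_key(r, s1, s2, h1, h2) == x
```

The final assertion confirms that two signatures under the same nonce leak the private key, which is exactly the weakness Variant 3 is designed to avoid.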
Implementation
The data set comprises 712 patient records and 231 non-patient records collected from the south east of Chennai, India. The "Dataset" column is a class label used to divide the groups into liver patient (liver disease) or not (no disease). This data set contains 321 male patient records, 242 female patient records, and the rest are transgender. The whole experimental system is implemented in the Python language on Ubuntu Server 14.04 with an Intel Core i7-6500 CPU processor at 2.50 GHz. We used OpenStack Swift to build the cloud storage. Based on the scope proposed in the architecture, the various techniques that have been deployed over the yesteryears are also tested.

Security assumption

The ECDSA approach is specified by a group G of order q produced by a point G on an elliptic curve over the finite field Zp of integers modulo a prime p. The ECDSA approach utilizes the hash function H.
Curve coordinates and scalars are characterized in κ = |q| bits, which is also the security parameter. A number of standard curves with various security parameters have been disseminated. A rigorous proof of security of the ECDSA approach was provided in the Generic Group Model, based upon the hardness of discrete logarithms and the notion that the hash function H is collision resistant and uniform. The security of the proposed auditing scheme rests on the same hardness of discrete logarithms and the same notion that the hash function H is collision resistant and uniform.
Discrete Logarithm Problem (DLP): Given a multiplicative cyclic group G of prime order p and given g, g^a ∈ G as inputs, find a ∈ Zp*. It is assumed to be computationally infeasible to solve the DLP in G within probabilistic polynomial time.
Novel PDP protocol
The novel PDP protocol comprises seven approaches, namely, the User Key Generation approach, Signature-Generation approach, Tag Generation approach, Sensitive Information sanitization approach, Challenge Generation approach, Proof Generation approach, and Proof Verification approach, denoted as {GenKey; GenSig; GenTag; Privacy Conservation; GenChal; GenProof; VerifyProof}. The novel PDP protocol system built from these seven approaches is classified into six phases, namely, the Setup Phase, Proactive Cloud Data Forensics Phase, Data Uploading Phase, Challenge Phase, Proof Generation Phase, and Signature and Proof Verification Phase. The flow diagram of the approaches and phases is described in Fig. 3. A file M is divided into n blocks, M = {m1, m2, ..., mn}. Let P denote the prover (server), V denote the verifier (client), i denote the file's identifier, and ω denote the local client state. We represent unspecified values with a "?" symbol. The generic PDP scheme is deliberated as a five-tuple of approaches, {KeyGen; Tag; Challenge; Proof; Verify}, each described in the subsequent section. The generic PDP mechanism generally comprises four main phases: setup, challenge, proof, and verification.
Phase 01: Setup phase
The client C, the owner of the file (F), generates the public and private key pair. The client C tags the input file and then uploads the tagged input file to the cloud storage, eradicating it from native storage. The input data/file F is separated into n blocks, and the selected tag (metadata) for each block is calculated by means of a distinctive formula. The major computation activities in the initial setup phase are User Key Generation, Signature Generation, Tag Generation and Privacy Conservation. The domain parameters for ECDSA consist of a suitably chosen elliptic curve E defined over a finite field Fp of characteristic p, and a base point G ∈ Ep(a, b) with order n.
Approach 01 - User Key-Generation approach
Select a random or pseudorandom integer dA such that 1 ≤ dA ≤ n−1, where dA is the private key, and the public key PA is computed by Q = dA·G.

GenKey(dA) → (dA, PA)
The tasks to be completed by the User Key Generation approach are: (1) the user registers in the registration process; (2) based on the user's identities (e.g., name, mail-id, contact number, etc.), a secret key is generated; (3) the user gets the permit to access the application once the key is received.
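The three GenKey tasks can be sketched as follows. This is a minimal illustration only: the random salt and the hash-then-reduce derivation of dA from the registered identities are assumptions, since the text does not fix a concrete formula.

```python
import hashlib
import secrets

# n is a standard 256-bit prime group order (secp256k1), used here
# only so that dA lands in a realistic interval [1, n-1].
n = 0xFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFEBAAEDCE6AF48A03BBFD25E8CD0364141

def gen_key(name: str, mail_id: str, contact: str):
    salt = secrets.token_bytes(16)                    # per-user randomness
    ident = "|".join((name, mail_id, contact)).encode()
    digest = hashlib.sha512(salt + ident).digest()
    dA = int.from_bytes(digest, "big") % (n - 1) + 1  # force 1 <= dA <= n-1
    return dA, salt

dA, salt = gen_key("Alice", "alice@example.org", "+91-0000000000")
assert 1 <= dA <= n - 1
```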
Approach 02 - Signature-Generation approach
To sign a message M, an entity A with domain parameters (p, Ep(a, b), G, n) and associated key pair (dA, PA) does the following:
(1) Select two integers k1 and k2 such that 1 ≤ k1, k2 ≤ n−1.
(2) Calculate k1·G = (x1, y1) and k2·G = (x2, y2), where x1, x2, y1, y2 are integers.
(3) Calculate r1 and r2 and check whether r1 and r2 are equal to 0. Here, r1 = x1 (mod n) and r2 = x2 (mod n). If r1 = 0 or r2 = 0, then go to step 1.
(4) Calculate h by hashing the file F using the SHA-512 hash function, h = H(F), where H is SHA-512. Thereby, the file F is converted to the integer h.
(5) Calculate s = k1^(-1)·(h·k2 + dA·(r1 + r2)) mod n; if s = 0, then go to step 1.
(6) The signature set of A for the message, namely r1 and s, is thus obtained and is denoted by the integer pair (r1, s).
The tasks to be completed by the Signature-Generation algorithm are: • the user produces verification metadata, and • it then includes MAC, signatures or other linked information.
Approach 03 - Tag-Generation approach
Given a hash function and keys as inputs for a given data block, the Tag Generation approach, which is probabilistic, produces tags as output. It is run by the client to create the verification metadata or verifiable tags for data. In this approach, the file block (bi), the private (secret) key (dA) and the public key (PA) are inputs, and the verification metadata (Tbi) is the output.
The Tag-Generation approach used to produce verification tag data/metadata or verifiable tags for data is represented as,

Tbi ← GenTag(PA, dA, bi) for all 1 ≤ i ≤ n
Input - File block (b), Private key (secret key) (dA), Public key (PA)
Output - Verification metadata (Tbi)
(1) Procedure: Tbi ← GenTag(PA, dA, bi) for all 1 ≤ i ≤ n
(2) Pick the file block (bi)
(3) Generate the tag for each block
(4) Concatenate the tag blocks
(5) End Procedure
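The procedure above can be sketched in a few lines. HMAC-SHA256 keyed with the secret key is an illustrative stand-in for the unspecified "distinctive formula"; mixing the block index into each tag (an assumption) prevents blocks from being reordered undetected.

```python
import hashlib
import hmac

def gen_tag(dA: bytes, blocks):
    # Steps (2)-(3): one verification tag per file block under dA.
    tags = [hmac.new(dA, i.to_bytes(4, "big") + bi, hashlib.sha256).digest()
            for i, bi in enumerate(blocks, start=1)]   # T_b1 .. T_bn
    # Step (4): concatenate the tag blocks into the verification metadata.
    return b"".join(tags)

meta = gen_tag(b"secret-key-dA", [b"block-1", b"block-2", b"block-3"])
assert len(meta) == 3 * 32            # one 32-byte tag per block
```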
Approach 04 - Sensitive Information sanitization approach
The user must blind the data blocks corresponding to the personal sensitive information of the original file F before sending it to the sanitizer, in order to preserve the personal sensitive information from the sanitizer. The blinded file F* and its signature set Φ are taken as input. It outputs the sanitized file F* and its matching signature set ΦS.
The tasks accomplished by the Sanitization approach are: • The user canopies the data blocks (enclosing the sensitive information) and creates the corresponding signatures. The signature is used to guarantee the validity of the file and to validate the integrity of the file. • The canopied file and its corresponding signatures are sent to the sanitizer. • The sanitizer sanitizes these canopied files and their equivalent signatures to create the sanitized data blocks. • The signatures of the sanitized data blocks are altered into lawful ones for the sanitized file. • The sanitizer uploads the altered sanitized file and its corresponding signatures to the cloud.
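The canopying step can be sketched as a keyed blinding of a sensitive block, so the sanitizer never sees the raw content. Using HMAC-SHA512 in counter mode as the keystream, and binding the block index into it, are illustrative assumptions; the text does not specify the blinding operation.

```python
import hashlib
import hmac

def canopy(block: bytes, blind_key: bytes, index: int) -> bytes:
    # XOR the block with a PRF keystream derived from (key, index, ctr).
    stream, ctr = b"", 0
    while len(stream) < len(block):
        msg = index.to_bytes(4, "big") + ctr.to_bytes(4, "big")
        stream += hmac.new(blind_key, msg, hashlib.sha512).digest()
        ctr += 1
    return bytes(x ^ y for x, y in zip(block, stream))

record = b"name: J. Doe; diagnosis: ..."
blinded = canopy(record, b"blind-key", 7)
assert blinded != record
assert canopy(blinded, b"blind-key", 7) == record   # XOR is an involution
```

Because XOR with the same keystream is an involution, the data owner (who holds the blinding key) can recover the original block, while the sanitizer works only on the canopied form.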
Phase 02: Proactive cloud data forensics phase
The proactive forensics technique is proposed on the client side and encompasses the following activities:
• Initialization - The client prepares for and starts activities of internal investigations that have to be fulfilled for forensic initialization requirements, with tasks such as: outline the context of risk valuation; describe the infrastructure; intimate the readiness for the processes; perceive sources of serious signs; and integrate the signals proactively (before the occurrence of the incident).
• Specification - The files and data of the client that are to be uploaded are scrutinized and maintained, with tasks such as: strategize collection; identify evidence sources (client, consumer, service provider); isolate the cloud instance; decide how to create a scenario on demand; decide how to diminish the data volume; and define the standards and practices adopted.
• Monitor - The files and data of the client that are uploaded are incessantly screened, with tasks such as: prearrangement of the varied choices and periods that the initial response should take for divergent incident control; and plot assessment and validation.
• Inspect - The files and data of the client that are uploaded are ceaselessly inspected, with tasks such as: planning for notification strategies; consistency with the various kinds of incident response that should be taken against the detected incident; incident detection and confirmation; deciding how to do incident detection and scheduling; and evaluating the loss and extent of damage.
• Postmortem - The files and data are subjected to tasks such as: digital investigation on demand; incident reconstruction; and presentation.
Phase 03: Data uploading phase
Data is subdivided and then uploaded. The tasks completed in the data uploading phase are: • The file is encoded. • Encoded information is created. • The last encoded record is split into numerous blocks using dynamic block generation, and the signatures are stored in the file system. • The files are shifted and saved in the cloud server.
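The encode-then-split tasks above can be sketched as follows. Base64 and the block size are illustrative stand-ins for the unspecified encoding and the dynamic block generation.

```python
import base64

def prepare_upload(data: bytes, block_size: int = 4096):
    # Task: the file is encoded (base64 as a stand-in encoding).
    encoded = base64.b64encode(data)
    # Task: the encoded record is split into fixed-size blocks
    # ready to be shifted to the cloud server.
    return [encoded[i:i + block_size]
            for i in range(0, len(encoded), block_size)]

blocks = prepare_upload(b"patient record contents", block_size=8)
assert base64.b64decode(b"".join(blocks)) == b"patient record contents"
```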
Phase 04: Challenge phase
The client creates a challenge for a specified number of file blocks by choosing some data blocks arbitrarily as a challenge using a pseudo-random permutation, and sends the challenge to the prover, with the intent of auditing the cloud and checking the correctness of the stored data.
The activities accomplished in the Challenge phase are: • The user/client (C) generates a challenge (chal) that, among other things, indicates the specific blocks for which the user/client (C) wants a proof of possession. • The user/client then sends (chal) to the server (S). • The server (S) runs Proof of Possession (V) ← GenProof(dA, F, chal, Σ) and sends the user/client (C) the Proof of Possession (V). • Finally, the user/client (C) can check the validity of the Proof of Possession (V) by running CheckProof(PA, dA, chal, V).
Approach 05 - Challenge-Generation approach
The Challenge-Generation approach is a probabilistic polynomial-time approach and is run by the client. The client/user uses the Challenge Generation approach to produce a challenge chal. By selecting random values, the Challenge Generation approach generates the challenge (chal) as output, which is then sent to the prover during an audit. In this approach, the input is the security parameter (k) and the output is a challenge (chal).

GenChal(k) → chal
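A GenChal sketch: choose c challenge blocks via a keyed pseudo-random permutation, approximated here by sorting block indices under a PRF (a common PRP stand-in). The HMAC key and per-audit nonce are illustrative assumptions; they make the selection unpredictable to the server yet reproducible by the challenger.

```python
import hashlib
import hmac

def gen_chal(key: bytes, nonce: bytes, n_blocks: int, c: int):
    # Rank every block index by a keyed PRF value, then take the
    # first c indices of the resulting pseudo-random permutation.
    def prf(i):
        return hmac.new(key, nonce + i.to_bytes(4, "big"),
                        hashlib.sha256).digest()
    return sorted(range(n_blocks), key=prf)[:c]

chal = gen_chal(b"audit-key", b"nonce-01", n_blocks=100, c=10)
assert len(set(chal)) == 10 and all(0 <= i < 100 for i in chal)
assert chal == gen_chal(b"audit-key", b"nonce-01", 100, 10)   # reproducible
```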
Phase 05: Proof generation phase
Some data blocks are chosen arbitrarily as a challenge by using a pseudo-random permutation, in order to audit the cloud and check the correctness of the stored data. The TPA delivers a challenge or an audit message to ensure that the cloud server has retained the data file F appropriately. Upon executing GenProof, the cloud server will derive a response message from a function of the stored data file F. Using the verification metadata, the TPA then verifies the response via VerifyProof.
Approach 06 - Proof Generation approach
The Proof Generation approach, run by the server in response to the challenge generation, validates a proof of possession and creates the proof of possession. The prover creates a short integrity check over the customary challenge message as a proof message, which usually includes the aggregation of the blocks and the tags, and sends it to the verifier. (Integrity checking: two signatures, the first being a block's signature recovered from the file system and the second a new signature created to audit the block, are cross-checked for block-level integrity.) In this approach, the inputs are the public key (PA), the private key (dA) and the challenge (chal); the output is either "correct" or the Proof of Data Possession (V) for the blocks determined by the challenge (chal).

GenProof(dA, F, chal, Σ) → V
The tasks accomplished by the Proof Generation approach are: • The client sends a request to the auditor to endorse the trustworthiness of the information. • The auditor completes remote data integrity checking on the cloud data by auditing the file block by block (checking one by one).
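The GenProof/CheckProof pair can be sketched as follows: the server's proof is a digest over the challenged blocks and their tags, and the auditor, who can recreate the expected tags, recomputes and compares. This aggregate-hash construction is an illustrative stand-in for the block-level integrity cross-check described above.

```python
import hashlib
import hmac

def gen_proof(chal, blocks, tags):
    # Aggregate the challenged blocks and their tags into one digest.
    h = hashlib.sha256()
    for i in chal:                 # audit the challenged blocks one by one
        h.update(blocks[i])
        h.update(tags[i])
    return h.digest()

def check_proof(chal, blocks, tags, proof):
    # Constant-time comparison of the recomputed and received proofs.
    return hmac.compare_digest(gen_proof(chal, blocks, tags), proof)

blocks = [b"b0", b"b1", b"b2"]
tags = [hashlib.sha256(bi).digest() for bi in blocks]   # stand-in tags
proof = gen_proof([0, 2], blocks, tags)
assert check_proof([0, 2], blocks, tags, proof)
assert not check_proof([0, 2], [b"b0", b"b1", b"bX"], tags, proof)
```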
Phase 06: Signature and proof verification phase
The proof of possession is returned to the client, who validates the proof. The verifier authenticates the proof message with respect to the proof and challenge messages.
Approach 07 - Signature Verification Approach
To verify A's signature (r1, s) on F, B obtains an authentic copy of A's domain parameters (p, Ep(a, b), G, n) and associated public key PA. B then does the following:
(1) Verify that the values r1, r2 and s are in the interval [1, n−1].
(2) Calculate w = s^(-1) (mod n).
(3) Calculate h by hashing the file F using the SHA-512 hash function, h = H(F), where H is SHA-512. Thereby, the file F is converted to the integer h.
(4) Calculate u1 = h·w·k2 (mod n) and u2 = (r1 + r2)·w (mod n).
(5) Calculate X = (x0, y0) = u1·G + u2·Q.
(6) If X = O (the point at infinity), then reject the signature. Otherwise, compute v = x0 (mod n).
(7) The signature for the file F is thus verified only if v = r1.
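The signing steps of Approach 02 and the verification steps above can be exercised end to end on the textbook toy curve y² = x³ + 2x + 2 over F_17 with base point G = (5, 1) of prime order n = 19. These parameters are far too small for real use and serve only to check the algebra. One assumption is made explicit: as written, verification step (4) needs r2 and k2, so this sketch transmits them alongside (r1, s).

```python
import hashlib
import secrets

p, a, b = 17, 2, 2            # toy curve y^2 = x^3 + 2x + 2 over F_17
G, n = (5, 1), 19             # base point and its (prime) order
O = None                      # point at infinity

def ec_add(P, Q):
    if P is O: return Q
    if Q is O: return P
    if P[0] == Q[0] and (P[1] + Q[1]) % p == 0:
        return O
    if P == Q:
        lam = (3 * P[0] ** 2 + a) * pow(2 * P[1], -1, p) % p
    else:
        lam = (Q[1] - P[1]) * pow(Q[0] - P[0], -1, p) % p
    x3 = (lam ** 2 - P[0] - Q[0]) % p
    return (x3, (lam * (P[0] - x3) - P[1]) % p)

def ec_mul(k, P):             # double-and-add scalar multiplication
    R = O
    while k:
        if k & 1:
            R = ec_add(R, P)
        P = ec_add(P, P)
        k >>= 1
    return R

def h_int(F):                 # h = H(F) with SHA-512, reduced mod n
    return int.from_bytes(hashlib.sha512(F).digest(), "big") % n

def sign(dA, F):              # steps (1)-(6) of Approach 02
    while True:
        k1 = secrets.randbelow(n - 1) + 1
        k2 = secrets.randbelow(n - 1) + 1
        r1 = ec_mul(k1, G)[0] % n
        r2 = ec_mul(k2, G)[0] % n
        if r1 == 0 or r2 == 0:
            continue          # step (3): retry on zero r values
        s = pow(k1, -1, n) * (h_int(F) * k2 + dA * (r1 + r2)) % n
        if s:                 # step (5): retry if s = 0
            return (r1, r2, k2, s)

def verify(PA, F, sig):       # steps (1)-(7) of the verification approach
    r1, r2, k2, s = sig
    if not all(1 <= v <= n - 1 for v in (r1, r2, s)):
        return False
    w = pow(s, -1, n)
    u1 = h_int(F) * w * k2 % n
    u2 = (r1 + r2) * w % n
    X = ec_add(ec_mul(u1, G), ec_mul(u2, PA))   # X = u1*G + u2*Q
    return X is not O and X[0] % n == r1        # accept only if v = r1

dA = secrets.randbelow(n - 1) + 1   # private key
PA = ec_mul(dA, G)                  # public key Q = dA*G
sig = sign(dA, b"health record")
assert verify(PA, b"health record", sig)
```

The verification works because u1·G + u2·Q = w·(h·k2 + dA·(r1 + r2))·G = w·s·k1·G = k1·G, whose x-coordinate reduced mod n is exactly r1.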
Approach 08 - Proof Verification approach or Checkproof approach
The client/user uses the Proof Verification approach to validate the proof of possession (V). In this approach, the inputs are the public and private key pair {PA, dA}, the challenge (chal) and the proof of possession (V).
Given the generated proof, the challenge (chal) and the secret key as inputs, this approach verifies the proof of data possession and produces accept or reject as output. Therefore, upon successful validation it returns 1/success; otherwise it returns 0/failure.
VerifyProof(PA, dA, chal, V) → {"success", "failure"}
Figure 7 illustrates (a) data volume (number of records) versus efficiency; (b) attributes versus efficiency; (c) data volume versus running time; and (d) data volume versus reliability, in accordance with an embodiment of the present disclosure.
Efficiency: To evaluate the efficiency of a searchable encryption scheme, the following complexity aspects are considered:
• The complexity to create the searchable ciphertext, the trapdoor and to perform the search (Computational complexity).
• The complexity to send the trapdoor and the searchable ciphertext from the client to the server (communication complexity). • The complexity to store the public key, secret key, searchable ciphertext and the trapdoor (storage complexity).
Figure 7a shows the efficiency obtained by the proposed PDP technique in comparison with the conventional techniques. As the number of records increases, the efficiency is more or less sustained in both the earlier and proposed techniques. The number of attributes in a record is kept constant and the number of records is gradually increased to gauge the efficiency of the technique. It is observed that as we increase the number of records, the proposed PDP technique more efficiently and proficiently achieves secure auditing, privacy and integrity, and outclasses the erstwhile generic PDP technique.
Figure 7b shows the efficiency obtained by the proposed PDP technique in comparison with the conventional techniques. As the number of attributes increases, the efficiency is more or less sustained in both the earlier and proposed techniques. The number of records is kept constant and the number of attributes is gradually increased to gauge the efficiency of the technique. It is observed that as we increase the number of attributes, the proposed PDP technique more efficiently and proficiently achieves secure auditing, privacy and integrity, and outclasses the erstwhile generic PDP technique.
The running time increases with a rise in data volume and is more or less the same for both techniques (the proposed PDP technique and the generic ECDSA PDP technique), as can be seen from Figure 7c. The amount of time the CSP takes to retrieve the user data, i.e., the running time, is just a few seconds. After the execution of our project, we calculated the time taken for the challenge and response. We infer that as the file size increases, the time taken also increases. The time duration for different file sizes varies by only a few milliseconds. Since the time duration varies by only a few milliseconds, the user/verifier cannot perceive the difference between challenge and response.
Figure 7d shows the reliability obtained by the proposed techniques. As the number of files increases, the reliability is more or less persistent in both the earlier and proposed techniques.
Figure 8 illustrates (a) time taken for key generation (in ms); (b) time taken for signature generation (in ms); (c) time taken for verification (in ms); and (d) time taken for verification (in ms) for the ECGDSA approach, in accordance with an embodiment of the present disclosure.
Figure 8a shows the time taken for key generation (in ms) for all the variants of the ECDSA, in comparison with the conventional techniques. For variants 2, 3 and 5 the time taken is greater than that for the ECGDSA.
Figure 8b shows that the time taken for signature generation (in ms) for the ECGDSA approach is the least among all the variants of the ECDSA. Variant 3 shows a steep increase in the time taken for signature generation.
Figure 8c shows that the time taken for verification (in ms) for the ECGDSA approach is the least among all the variants of the ECDSA. Variant 5 shows a steep increase in the time taken for verification.
Figure 8d shows that the time taken for verification (in ms) for the ECGDSA approach is the least among all the variants of the ECDSA. Variant 5 shows a steep increase in the time taken for verification. The generic techniques of five ECDSA PDP schemes have been implemented in the PDP to infer the future cost. It is established that the audit costs of certain sophisticated PDP schemes (ECDSA, Variant 1, Variant 2, Variant 3, Variant 5) are nearly identical to those of the simple ECDSA PDP scheme; however, tag preprocessing in schemes utilizing asymmetric key operations, together with the storage costs, has a noteworthy influence on the total cost differences among the schemes. It is thus determined that the total costs of schemes utilizing the ECDSA asymmetric key primitives are analogous, whereas schemes utilizing advanced ECDSA public key primitives are more costly than the other schemes at large file sizes; the technique using ECGDSA, by contrast, shows an overall improvement in security with a short key, a short signature and less cost.
The drawings and the forgoing description give examples of embodiments. Those skilled in the art will appreciate that one or more of the described elements may well be combined into a single functional element. Alternatively, certain elements may be split into multiple functional elements. Elements from one embodiment may be added to another embodiment. For example, orders of processes described herein may be changed and are not limited to the manner described herein. Moreover, the actions of any flow diagram need not be implemented in the order shown; nor do all of the acts necessarily need to be performed. Also, those acts that are not dependent on other acts may be performed in parallel with the other acts. The scope of embodiments is by no means limited by these specific examples. Numerous variations, whether explicitly given in the specification or not, such as differences in structure, dimension, and use of material, are possible. The scope of embodiments is at least as broad as given by the following claims.
Benefits, other advantages, and solutions to problems have been described above with regard to specific embodiments. However, the benefits, advantages, solutions to problems, and any component(s) that may cause any benefit, advantage, or solution to occur or become more pronounced are not to be construed as a critical, required, or essential feature or component of any or all the claims.

Claims (10)

WE CLAIM
1. A system for cloud-based digital forensic readiness with integrity and privacy preservation of health care data storage, the system comprising:
a setup module for generating a public-private key pair, a signature and a tag from an input file, wherein the public key and private key are generated upon successful registration of a user for accessing data;
a proactive cloud data forensic module connected to the setup module, wherein the input file is initialized, scrutinized, scanned, monitored, inspected and digitally post-mortemed for performing the plurality of tasks;
a data uploading module connected to the proactive cloud data forensic module for uploading at least an inspected file upon completion of the plurality of tasks by subdividing the inspected files, wherein the inspected files are encoded and saved in the cloud server;
a challenge module connected to the cloud server for choosing a specific block arbitrarily using a pseudo random permutation for checking the correctness of the stored data by auditing;
a proof generation module connected to the cloud server which runs the challenge generation in order to validate a proof of possession and creates a proof of possession; and
a signature and proof verification module connected to the challenge module for returning proof of possession to the client, who validates the proof.
2. The system as claimed in claim 1, wherein the input data/file F is separated into n blocks and the selected tag (metadata) for each block is calculated by means of a distinctive formula; wherein the user key generation comprises of a registration process for registering the user; wherein the registration process is based on a user identity such as a name, a mail id, a contact number and generating a secret key; and wherein the application is accessed by the user using the key.
3. The system as claimed in claim 1, wherein the signature generation process comprises the user producing verification metadata, which then includes MAC, signatures or other linked information, and wherein the tag generation process generates a tag as output for each data block based on two inputs, namely the hash function and the keys, and concatenates the tag blocks.
4. The system as claimed in claim 1, wherein a sanitization process performed by the setup module comprises the user canopying the data blocks (enclosing the sensitive information) and creating the corresponding signatures, wherein the signature is used to guarantee the validity of the file and to validate the integrity of the file, wherein the canopied file and its corresponding signatures are sent to the sanitizer, in which the sanitizer sanitizes these canopied files and their equivalent signatures to create the sanitized data blocks, in which the signatures of the sanitized data blocks are altered into lawful ones for the sanitized file, and in which the sanitizer uploads the altered sanitized file and its corresponding signatures to the cloud.
5. The system as claimed in claim 1, wherein the proactive cloud data forensic module comprises steps of: initialization of files for performing internal investigation of the file, wherein the initialization of files comprises steps of: outlining a context of the file for risk valuation; describing an infrastructure of the investigation; intimate a readiness for a process for performing the investigation of the task; and perceiving sources of serious signs to integrate a plurality of signals proactively.
6. The system as claimed in claim 1, wherein the files and data of the client that are to be uploaded are scrutinized and maintained, starting tasks comprising at least one of: strategize collection, identify evidence sources (client, consumer, service provider), isolate the cloud instance, how to create a scenario on demand, how to diminish the data volume, and define the standards and practices adopted.
7. The system as claimed in claim 1, wherein the files and data of the client that are uploaded are incessantly screened, comprising at least one of the tasks: prearrangement of the varied choices and periods that the initial response should take for divergent incident control, and plot assessment and validation; wherein the files and data of the client that are uploaded are ceaselessly inspected, comprising tasks including at least one of: planning for notification strategies, consistency with the various kinds of incident response that should be taken against the detected incident, incident detection and confirmation, how to do incident detection, scheduling, and evaluating the loss and extent of damage; and wherein the files and data are subjected to tasks comprising at least one of: digital investigation on demand, incident reconstruction, and presentation.
8. The system as claimed in claim 1, wherein the files are scrutinized for uploading to the cloud server for performing a plurality of tasks, and wherein the steps of the data uploading module comprise at least one of: the file is encoded; encoded information is created; the final encoded record is split into numerous blocks using dynamic block generation and the signatures are stored in the file system; and the files are shifted and saved in the cloud server.
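The encode, split, and tag pipeline of claim 8 can be sketched as below. The patent names neither the encoding nor the block-generation rule, so this assumes base64 for the encoding step, a fixed block size in place of "dynamic block generation", and SHA-256 digests as the stored per-block signatures; all names are hypothetical.

```python
import base64
import hashlib

def upload_pipeline(data: bytes, block_size: int = 8):
    """Encode the file, split the encoded record into blocks,
    and create a signature (tag) per block for later auditing."""
    encoded = base64.b64encode(data)                     # encoding step
    blocks = [encoded[i:i + block_size]                  # block generation
              for i in range(0, len(encoded), block_size)]
    tags = [hashlib.sha256(b).hexdigest() for b in blocks]
    return blocks, tags
```

The blocks would then be shifted to the cloud server while the tags remain available for the challenge phase of claim 9.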
9. The system as claimed in claim 1, wherein the challenge module comprises: the client creates a challenge for a specified number of file blocks by choosing some data blocks arbitrarily using a pseudo-random permutation, and sends the challenge to the prover, with the intent of auditing the cloud and checking the correctness of the stored data; wherein a proof generation module runs the challenge generation in order to validate a proof of possession and creates the proof of possession; and wherein a signature and proof verification module returns the proof of possession to the client, who validates the proof using the signature and proof verification module.
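The challenge, proof-generation, and verification round of claim 9 can be illustrated with a minimal provable-data-possession-style sketch. This is an assumption-laden toy, not the patented scheme: a seeded `random.Random` stands in for the pseudo-random permutation, and per-block SHA-256 digests stand in for the unspecified signatures and proof of possession.

```python
import hashlib
import random

def make_tags(blocks):
    """Client-side tags kept locally before upload, for later verification."""
    return [hashlib.sha256(b).digest() for b in blocks]

def gen_challenge(n_blocks, c, seed):
    """Client picks c block indices via a seeded pseudo-random permutation."""
    rng = random.Random(seed)
    return rng.sample(range(n_blocks), c)

def gen_proof(stored_blocks, idx):
    """Prover (cloud) answers with a proof of possession for the challenged blocks."""
    return [hashlib.sha256(stored_blocks[i]).digest() for i in idx]

def verify_proof(tags, idx, proof):
    """Client checks the returned proof against its locally kept tags."""
    return all(tags[i] == p for i, p in zip(idx, proof))
```

Any tampering with a challenged block changes its digest, so the audit fails without the client ever re-downloading the file.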
10. A method for preserving data integrity in cloud storage for digital forensic readiness, the method comprising: generating a public-private key pair, a signature, and a tag from an input file using a setup module, wherein the public key and private key are generated upon successful registration of a user for accessing data; performing a plurality of tasks on the input file using a proactive cloud data forensic module connected to the setup module, wherein the input file is initialized, scrutinized, scanned, monitored, inspected and digitally examined post-mortem; uploading at least an inspected file upon completion of the plurality of tasks by subdividing the inspected files using a data uploading module connected to the proactive cloud data forensic module, wherein the inspected files are encoded and saved in the cloud server; choosing a specific block arbitrarily using a pseudo-random permutation for checking the correctness of the stored data by auditing using a challenge module connected to the cloud server; running the challenge generation in order to validate a proof of possession and creating the proof of possession using a proof generation module connected to the cloud server; and returning the proof of possession to the client, who validates the proof using a signature and proof verification module connected to the challenge module.
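The setup step of the method, generating a key pair and per-block signatures, can be sketched with textbook RSA. The patent leaves the signature scheme unspecified, so the toy parameters below (tiny fixed primes, hash reduced modulo n) are purely illustrative; a real deployment would use 2048-bit keys and a padded signature standard.

```python
import hashlib

# Toy textbook-RSA parameters (illustrative only).
P, Q = 61, 53
N = P * Q            # modulus n = 3233
E = 17               # public exponent
D = 2753             # private exponent: E * D = 1 (mod lcm(P-1, Q-1))

def keygen():
    """Setup module: return (public key, private key) for a registered user."""
    return (N, E), (N, D)

def sign(priv, block: bytes) -> int:
    """Create a signature (tag) over a data block."""
    n, d = priv
    m = int.from_bytes(hashlib.sha256(block).digest(), "big") % n
    return pow(m, d, n)

def verify(pub, block: bytes, sig: int) -> bool:
    """Validate a block's signature with the public key."""
    n, e = pub
    m = int.from_bytes(hashlib.sha256(block).digest(), "big") % n
    return pow(sig, e, n) == m
```

These per-block signatures are the tags that the challenge and verification modules of the method later audit against.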
AU2021103828A 2021-07-02 2021-07-02 A novel system and auditing technique for cloud based digital forensic readiness with integrity and privacy preservation of health care data Ceased AU2021103828A4 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
AU2021103828A AU2021103828A4 (en) 2021-07-02 2021-07-02 A novel system and auditing technique for cloud based digital forensic readiness with integrity and privacy preservation of health care data

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
AU2021103828A AU2021103828A4 (en) 2021-07-02 2021-07-02 A novel system and auditing technique for cloud based digital forensic readiness with integrity and privacy preservation of health care data

Publications (1)

Publication Number Publication Date
AU2021103828A4 true AU2021103828A4 (en) 2021-08-26

Family

ID=77369698

Family Applications (1)

Application Number Title Priority Date Filing Date
AU2021103828A Ceased AU2021103828A4 (en) 2021-07-02 2021-07-02 A novel system and auditing technique for cloud based digital forensic readiness with integrity and privacy preservation of health care data

Country Status (1)

Country Link
AU (1) AU2021103828A4 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113992389A (en) * 2021-10-26 2022-01-28 东北大学秦皇岛分校 SGX data integrity auditing method based on dynamic frequency table


Similar Documents

Publication Publication Date Title
Shi et al. Applications of blockchain in ensuring the security and privacy of electronic health record systems: A survey
US10404455B2 (en) Multiple-phase rewritable blockchain
Hussien et al. A systematic review for enabling of develop a blockchain technology in healthcare application: taxonomy, substantially analysis, motivations, challenges, recommendations and future direction
Wang et al. Blockchain-based personal health records sharing scheme with data integrity verifiable
Hossain et al. FIF-IoT: A forensic investigation framework for IoT using a public digital ledger
Zhu et al. Dynamic audit services for outsourced storages in clouds
Hardin et al. Amanuensis: Information provenance for health-data systems
Li et al. Eunomia: Anonymous and secure vehicular digital forensics based on blockchain
Jayaraman et al. RETRACTED ARTICLE: A novel privacy preserving digital forensic readiness provable data possession technique for health care data in cloud
Naresh et al. Blockchain‐based patient centric health care communication system
Marichamy et al. Blockchain based securing medical records in big data analytics
Shin et al. A Survey of Public Provable Data Possession Schemes with Batch Verification in Cloud Storage.
Ametepe et al. Data provenance collection and security in a distributed environment: a survey
Yoosuf Lightweight fog‐centric auditing scheme to verify integrity of IoT healthcare data in the cloud environment
Narayanan et al. Improved Security for Cloud Storage Using Elgamal Algorithms Authentication Key Validation
AU2021103828A4 (en) A novel system and auditing technique for cloud based digital forensic readiness with integrity and privacy preservation of health care data
Xu et al. A privacy-preserving and efficient data sharing scheme with trust authentication based on blockchain for mHealth
Li et al. An accountable decryption system based on privacy-preserving smart contracts
Jin et al. Blockchain-based secure and privacy-preserving clinical data sharing and integration
Tian et al. Identity-based public auditing for cloud storage of internet-of-vehicles data
Nie et al. Time‐enabled and verifiable secure search for blockchain‐empowered electronic health record sharing in IoT
Thompson et al. Multifactor IoT Authentication System for Smart Homes Using Visual Cryptography, Digital Memory, and Blockchain Technologies
Vulapula et al. Review on Privacy Preserving of Medical Data in Cloud Computing System.
Mahapatra et al. A secure health management framework with anti-fraud healthcare insurance using blockchain
Li et al. In-Vehicle Digital Forensics for Connected and Automated Vehicles With Public Auditing

Legal Events

Date Code Title Description
FGI Letters patent sealed or granted (innovation patent)
MK22 Patent ceased section 143a(d), or expired - non payment of renewal fee or expiry