GB2596037A - Data anonymisation - Google Patents

Data anonymisation

Info

Publication number
GB2596037A
GB2596037A GB2002450.1A GB202002450A GB2596037A GB 2596037 A GB2596037 A GB 2596037A GB 202002450 A GB202002450 A GB 202002450A GB 2596037 A GB2596037 A GB 2596037A
Authority
GB
United Kingdom
Prior art keywords
data
captured
operable
anonymised
personal
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
GB2002450.1A
Other versions
GB202002450D0 (en)
Inventor
Maniak Tomasz
Iqbal Rahat
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Interactive Coventry Ltd
Original Assignee
Interactive Coventry Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Interactive Coventry Ltd filed Critical Interactive Coventry Ltd
Priority to GB2002450.1A priority Critical patent/GB2596037A/en
Publication of GB202002450D0 publication Critical patent/GB202002450D0/en
Publication of GB2596037A publication Critical patent/GB2596037A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/60Protecting data
    • G06F21/62Protecting access to data via a platform, e.g. using keys or access control rules
    • G06F21/6218Protecting access to data via a platform, e.g. using keys or access control rules to a system of files or objects, e.g. local or distributed file system or database
    • G06F21/6245Protecting personal data, e.g. for financial or medical purposes
    • G06F21/6254Protecting personal data, e.g. for financial or medical purposes by anonymising data, e.g. decorrelating personal data from the owner's identification
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/60Protecting data
    • G06F21/62Protecting access to data via a platform, e.g. using keys or access control rules
    • G06F21/6218Protecting access to data via a platform, e.g. using keys or access control rules to a system of files or objects, e.g. local or distributed file system or database
    • G06F21/6245Protecting personal data, e.g. for financial or medical purposes

Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Bioethics (AREA)
  • General Health & Medical Sciences (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • Databases & Information Systems (AREA)
  • Computer Security & Cryptography (AREA)
  • Software Systems (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Medical Informatics (AREA)
  • Storage Device Security (AREA)

Abstract

A device and method for anonymising data prior to storage. Captured data is received and an identification unit processes the captured data to identify personal data within it. The captured data may be image data, such as still or moving images from cameras. The personal data may be faces, number plates, ID badges, street names etc. An anonymisation unit obscures the identified personal data within the captured data, providing anonymised data for output. The personal data may be obscured via blurring, deleting or replacing the personal data with a pattern or alternate image. An encryption unit may be used to encrypt the captured data and output this alongside the anonymised data, or may also encrypt the anonymised data. This allows data to be anonymised without the need to store or transmit any personal data captured, increasing the security of the personal data as it is not transmitted or stored in raw form.

Description

DATA ANONYMISATION
Technical Field of the Invention
The present invention relates to data anonymisation. Particularly, but not exclusively, the present invention relates to a device and method for anonymising captured data prior to storage.
Background to the Invention
With the increased usage of surveillance cameras and similar devices, the amount of data captured is ever increasing. This has created significant concern over the privacy of people's data, and an increased demand for data privacy rights. One example of this is the General Data Protection Regulation 2016/679 (GDPR), which was introduced by the European Union (EU) in 2016, and relates to ensuring that personal and/or identifying data of the citizens of the EU is protected using appropriate technical measures. In relation to data collected by surveillance cameras and the like, there is thus a need to anonymise personal information, such as images which can be used to identify people, or other relevant information such as vehicle number plates.
Conventionally, this can be achieved by the use of anonymisation software. This can be applied directly on locally stored data, or by transmitting the relevant data to a cloud-based server for anonymisation.
These software-based systems contain several vulnerabilities, notably that the data is transmitted from the camera to the storage device, and subsequently stored (at least temporarily) in non-anonymised form. Further, for cloud-based anonymisation, the raw data is also transmitted to the cloud. This means data breaches or hacking incidents could allow unauthorised access to non-anonymised data. Further, these devices are complex and processing-heavy, requiring storage and transmission of at least two sets of data and access to suitable anonymisation software or a server.
It is an object of the present invention to provide a device and method for data anonymisation to at least partially overcome or alleviate the above issues in existing data anonymisation devices.
Summary of the Invention
According to a first aspect of the present invention, there is provided a data anonymisation device comprising: an input for receiving captured data; an identification unit operable to process said captured data and thereby identify personal data within the captured data; an anonymisation unit operable to obscure the identified personal data within the captured data so as to thereby provide anonymised data for output; and an output means for outputting the anonymised data.
According to a second aspect of the present invention, there is provided a method of anonymising data, comprising the steps of: receiving captured data; processing the data in order to identify personal data within said captured data; and anonymising the identified personal data so as to provide an anonymised data output.
The present invention therefore provides a device (and method) by which data can be anonymised without the need to store or transmit any personal data captured. This increases the security of said personal data by removing vulnerabilities in the path of the data.
The input may be connected to a camera. The input may be connected directly to an image sensor. The captured data may be image data. The data may be in the form of still or moving images. The identification unit may be operable to identify faces, number plates, street names and house numbers, taxi license numbers, ID badges, or the like. The identification unit may be operable using hardware such as Field Programmable Gate Arrays (FPGAs), Application Specific Integrated Circuits (ASICs), Complex Programmable Logic Devices (CPLDs) or a System on Chip (SoC). The identification unit may be operable using software based on Verilog, VHDL, Assembler, C/C++, Java, Python, .NET, TypeScript, TensorFlow, PyTorch, Torch, Apache Spark or CNTK. The identification unit may be operable to perform algorithms such as Deep Neural Networks (DNNs), Convolutional Neural Networks (CNNs), Recurrent Neural Networks (RNNs), Region-based CNNs (R-CNNs), variations of the YOLO algorithm, Fully Connected Neural Networks, Histogram of Oriented Gradients (HOG) and data clustering techniques. The skilled person will appreciate that there are many suitable hardware, software and algorithmic techniques possible to identify any personal data in the captured data. The identification unit may be operable to recognise personal data that is identified multiple times in the captured data. The identification unit may be operable to link any personal data which is identified multiple times in the captured data. The identification unit may be operable to link matching personal data across multiple sets of captured data. For example, if a particular number plate is detected multiple times within the captured data, such as entering and leaving a car park, the identification unit may be operable to link the separate detections of the same number plate with an appropriate tag.
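By way of illustration only (the specification prescribes no particular implementation), the linking step described above can be sketched in pure Python; `link_detections`, the `(value, frame)` tuples and the `link-N` tag format are all hypothetical names introduced for this example:

```python
from collections import defaultdict

def link_detections(detections):
    """Group detections sharing the same value (e.g. a number plate seen
    both entering and leaving a car park) under one shared link tag."""
    by_value = defaultdict(list)
    for value, frame in detections:
        by_value[value].append(frame)
    # every distinct value gets a tag; detections sharing a tag are linked
    return {value: {"link_tag": f"link-{i}", "frames": frames}
            for i, (value, frames) in enumerate(by_value.items())}

# the same plate detected at frames 10 and 300 ends up under one tag
tags = link_detections([("AB12 CDE", 10), ("XY99 ZZZ", 42), ("AB12 CDE", 300)])
```

In a real device the match key would come from the recognition algorithm's output rather than an exact string comparison.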
The anonymisation unit may be operable to obscure personal data via blurring, deleting or replacing the personal data. The anonymisation unit may replace the personal data with a pattern or an alternate image. The blurring could take the form of pixelating the personal data until it is not identifiable. The pattern may be a blank or plain, evenly coloured region. The pattern may be a simple geometric pattern, or a static or pseudo-static pattern. The alternate image may be a placeholder, such as a simulated version of the personal data, or a randomly selected alternative piece of personal data. For example, all faces identified by the identification unit may be replaced with the same placeholder face. Alternatively, each face may be replaced with a generic face randomly selected from a set of multiple generic faces. The generic faces may be randomly generated using a generative algorithm. One suitable example of such an algorithm is a generative adversarial network (GAN).
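The pixelating form of blurring can be illustrated with a minimal, library-free sketch operating on a greyscale image held as a list of rows; `pixelate` and its block-averaging behaviour are one plausible reading of the technique, not the patented method itself:

```python
def pixelate(image, x0, y0, x1, y1, block=2):
    """Obscure the region x0..x1, y0..y1 of a greyscale image (a list of
    rows of ints) by replacing each block x block tile with its mean."""
    out = [row[:] for row in image]          # leave the input untouched
    for by in range(y0, y1, block):
        for bx in range(x0, x1, block):
            tile = [image[y][x]
                    for y in range(by, min(by + block, y1))
                    for x in range(bx, min(bx + block, x1))]
            mean = sum(tile) // len(tile)
            for y in range(by, min(by + block, y1)):
                for x in range(bx, min(bx + block, x1)):
                    out[y][x] = mean
    return out

img = [[0, 10, 20, 30],
       [40, 50, 60, 70],
       [80, 90, 100, 110],
       [120, 130, 140, 150]]
anon = pixelate(img, 0, 0, 2, 2)  # obscure a hypothetical 2x2 "face" region
```

The top-left tile (values 0, 10, 40, 50) collapses to its mean of 25, while pixels outside the identified region are untouched; a production system would of course operate on real image buffers (e.g. NumPy arrays) with larger blocks.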
The device may comprise an encryption unit operable to encrypt the captured data and output this alongside the anonymised data. The encryption unit may also be operable to encrypt the anonymised data output by the anonymisation unit. The encrypted raw data may be stored as a digital watermark in the anonymised data, or as an extension to the file containing the anonymised data. The encryption unit may be operable to encrypt the entirety of the captured data. Alternately, the encryption unit may be operable to encrypt only the identified personal data after it has been identified by the identification unit. The encryption unit may be operable to encrypt each piece of personal data separately.
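One way the encrypted raw data could travel as an extension of the anonymised file is sketched below. Everything here is an assumption for illustration: the `MAGIC` separator, the function names, and especially the XOR keystream, which is a toy stand-in for a real symmetric cipher such as AES and must not be treated as secure:

```python
import hashlib

MAGIC = b"--ENC--"  # hypothetical marker; a real format would use container metadata

def _keystream(key: bytes, n: int) -> bytes:
    # expand the key into n pseudo-random bytes via a hash counter
    out, counter = b"", 0
    while len(out) < n:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:n]

def xor_crypt(data: bytes, key: bytes) -> bytes:
    # symmetric: applying it twice with the same key recovers the input
    return bytes(a ^ b for a, b in zip(data, _keystream(key, len(data))))

def append_encrypted_original(anonymised: bytes, original: bytes, key: bytes) -> bytes:
    """Store the encrypted raw capture as an appendix to the anonymised file."""
    return anonymised + MAGIC + xor_crypt(original, key)

def recover_original(blob: bytes, key: bytes) -> bytes:
    anonymised, encrypted = blob.split(MAGIC, 1)
    return xor_crypt(encrypted, key)

key = b"secret"
blob = append_encrypted_original(b"ANONYMISED", b"RAW FRAME", key)
```

Any viewer without the key sees only the anonymised prefix; a key holder can recover the raw capture from the appendix.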
The encryption unit may be operable to produce a decryption key. The decryption key may be operable to decrypt the encrypted data when used.
The decryption key may be applied to the encrypted data on any suitable device. After decryption, the entirety of the captured data, or alternately individual pieces of personal data, may be viewed, depending on the decryption key used. The decryption key may be operable to only decrypt certain subsets of the encrypted data. For example, these subsets could include the personal data captured by a particular data capture device, or the personal data captured in a specified time period.
The decryption key may be operable to leave any data which is not to be decrypted in an encrypted form, or may be operable to decrypt the entirety of the personal data, whilst leaving the data which is not for decryption obscured following the anonymisation process. It will be understood that the decryption key should only be made available to people with suitable authorisation to decrypt the data, thus ensuring the personal data is protected. The encryption unit may be operable to systematically encrypt each linked piece of personal data using an identical technique, such that only a single decryption key is required to decrypt all linked personal data. Alternatively, the encryption unit may be operable to encrypt certain subsets of the linked personal data. For example, these subsets could include the personal data captured by a particular data capture device, or the personal data captured in a specified time period.
To use the example above, the encryption unit would be operable, using only a single decryption key, to access the captured data of the number plate entering and leaving the car park, as opposed to requiring two separate keys. The encryption unit of the above example may only be operable to decrypt the personal data captured on a specific day with the single decryption key. The encryption unit may be operable to encrypt any linked personal data across multiple sets of captured data.
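The subset-scoped keys described above could be realised by key derivation: one master key held by the device, with per-subset keys derived from the link tag and the capture date. The specification does not fix a scheme, so `subset_key`, the separator byte and the tag/day scoping are all assumptions; the derived key would then feed a proper symmetric cipher:

```python
import hashlib

def subset_key(master_key: bytes, link_tag: str, day: str) -> bytes:
    """Derive a key scoped to one linked piece of personal data on one day.
    Handing out this key reveals every capture of that datum on that day
    and nothing else; the master key itself is never shared."""
    return hashlib.sha256(
        master_key + link_tag.encode() + b"|" + day.encode()
    ).digest()

# the same plate entering and leaving the car park on the same day yields
# one key; a different day (or a different linked datum) yields another
k_in  = subset_key(b"master", "plate-AB12", "2020-02-21")
k_out = subset_key(b"master", "plate-AB12", "2020-02-21")
k_next_day = subset_key(b"master", "plate-AB12", "2020-02-22")
```

Because derivation is deterministic, every instance of the linked plate encrypted under `k_in` on that day is decryptable with the single issued key, matching the single-key behaviour described in the example.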
The device may comprise a storage unit connected to the output of the device. The storage unit may be operable to store the anonymised data. The storage unit may be operable to store the encrypted data produced by the encryption unit. The skilled person will understand that there are many suitable types of storage unit and storage formatting that could be used. The device may comprise a communication unit connected to the output of the device. The communication unit may be operable to transmit the anonymised data to a suitable receiver.
In embodiments where the device comprises a communication unit in addition to a storage unit, the communication unit may be placed before the storage unit in the path of the data, or vice versa. The storage unit may be operable to store only the encrypted data, or alternatively only the anonymised data. Similarly, the communication unit may be operable to transmit the data which is not stored by the storage unit. Optionally, the device may be operable to enable both the encrypted and anonymised data to be both stored in the storage unit and transmitted by the communication unit. The transmission unit may be operable, for example, to transmit the anonymised data to a display unit where the anonymised data can be displayed for monitoring. The device may be integrated into a data capture device. An example of a suitable data capture device is a camera. The skilled person will understand that there are many suitable data capture devices into which the data anonymisation device could be integrated.
The anonymisation and identification units may be integrated into a single master unit. The master unit may take the form of an integrated circuit. Any of the data capture device, encryption, storage and communication units may also comprise part of the master unit. Further, when implemented as a hardware-based solution, this reduces the need to use software that requires intense processing and/or transmission of the data to a cloud-based server for anonymisation.
The skilled person will understand that the method of anonymising data of the second aspect of the present invention may comprise the step of performing any of the actions performed by any of the above mentioned features of the device of the first aspect of the present invention.
According to a third aspect of the present invention, there is provided a data capture device comprising a data anonymisation device according to the first aspect of the present invention or operable according to the method of the second aspect.
According to a fourth aspect of the present invention, there is provided a data acquisition system comprising: one or more data capture devices, said data capture devices each in communication with one or more servers for storing/processing captured data, characterised in that at least one data capture device is a data capture device according to the third aspect of the present invention, and/or the system comprises a device according to the first aspect of the present invention, or a device operable according to the method of the second aspect of the present invention, placed in the data path between at least one data capture device and at least one server.
Detailed Description of the Invention
In order that the invention may be more clearly understood, one or more embodiments thereof will now be described, by way of example only, with reference to the accompanying drawings, of which: Figure 1 is a schematic illustration of a data anonymisation device according to the present invention.
Figure 2 is a schematic illustration of an alternate embodiment of the data anonymisation device of the present invention. Figure 3 is a flow chart illustrating a method of anonymising data according to the second aspect of the present invention.
Figure 4 is a flow chart illustrating a method of decrypting personal data according to the present invention.
Figure 5 is a schematic illustration of an embodiment of a data capture device according to the third aspect of the present invention.
Figure 6A is a schematic illustration of an embodiment of a data acquisition system according to the fourth aspect of the present invention. Figure 6B is a schematic illustration of an alternate embodiment of a data acquisition system according to the fourth aspect of the present invention.
Figures 7A-C show example options for the format of the output data from the data anonymisation device, method, data capture device or data acquisition system of the first, second, third and fourth aspects of the present invention respectively.
Figures 8A&B show state-of-the-art data acquisition systems. In existing prior art systems, as exemplified in Figures 8A&B, data is captured using cameras 42 and transmitted to a storage unit 51 for temporary storage. The captured data CD, which may contain personal data PD, is stored prior to anonymisation and/or encryption. The captured data CD is then transmitted, through a router 52, to a server 41 for permanent storage. From the server 41, the captured data can then either be anonymised directly by anonymisation software, as shown in Figure 8A, or transmitted to yet another server 41 via the cloud 54. The captured data CD is anonymised on said server 41, and transmitted back to the original server 41 for permanent storage, as shown in Figure 8B.
Turning now to Figure 1, there is provided a schematic illustration of a data anonymisation device 1 of the present invention. The device 1 comprises a data input 2. The data input 2 is connected to an identification unit 3. The identification unit 3 is operable to process any captured data CD received from the input 2 in order to assess whether the captured data CD contains any personal data PD. The identification unit 3 is operable to tag any identified personal data PD within the captured data CD. The identification unit 3 is operable to recognise particular pieces of personal data PD that occur multiple times within the captured data CD and link the personal data PD accordingly. The identification unit 3 is connected to an anonymisation unit 4, to which it transmits the captured data CD after the identification of the personal data PD is complete.
The anonymisation unit 4 is operable to obscure any personal data PD identified in the captured data CD by the identification unit 3. Thus, the anonymisation unit 4 is operable to produce anonymised data AD from the captured data CD. The anonymisation unit 4 is operable to obscure the personal data PD using various techniques, as illustrated in Figures 7A, B and C. The anonymisation unit 4 is connected to an output means 5 operable to output the anonymised data AD. Figure 2 shows a schematic illustration of an alternate embodiment of a data anonymisation device 1 according to the present invention. In addition to the features shown in Figure 1 and described above, there is provided an encryption unit 4a operable to encrypt the captured data CD. In this embodiment, the encryption unit 4a is connected between the anonymisation unit 4 and the output 5.
The encryption unit 4a is operable to encrypt the personal data PD identified by the identification unit 3, or to encrypt the entirety of the captured data CD. The encryption unit 4a is operable to act in conjunction with or in place of the anonymisation unit 4. As such, it will be understood that the encryption unit 4a could be placed between the identification unit 3 and the anonymisation unit 4. In this alternate embodiment, the output means 5 is connected to both a storage unit 6 and a communication unit 7.
The encryption unit 4a is operable to encrypt the personal data PD such that an appropriate key can be provided to decrypt and thus display the personal data PD. In cases where there exists personal data PD that has been linked to other instances of that same personal data PD being identified, the encryption unit 4a is operable to encrypt each linked piece of personal data PD using a common key. This technique produces encrypted data ED that requires just a single key to unlock all instances of that particular piece of linked personal data PD.
The storage unit 6 is operable to store any encrypted data ED or anonymised data AD. The communication unit 7 is operable to transmit any anonymised data AD or encrypted data ED to a suitable external device. The storage unit 6 and communication unit 7 are connected such that any data stored by the storage unit 6 can then be transmitted by the communication unit 7, and vice versa.
Turning to Figure 3, there is provided a flow chart showing an example method of anonymising data according to a second aspect of the present invention. Firstly, at step 11, data is captured or received from an external device. This captured data CD may take the form of still or moving images, among other alternatives.
The captured data CD is then processed, at step 12, to identify any personal data PD which is contained in the captured data CD. Any personal data PD identified in this step is tagged as being personal data PD. Further, any particular piece of personal data PD which is identified multiple times as that same piece of personal data PD is assigned a tag to denote that the instances are linked.
At step 13, the captured data CD is anonymised by obscuring any personal data PD contained within the captured data CD. Alternately, the anonymisation step 13 may comprise anonymising the entirety of the captured data CD.
The anonymisation is followed by the step 14 of encrypting the identified personal data PD to form encrypted data ED. This step 14 optionally comprises encrypting the entirety of the captured data CD. This step 14 also comprises the encryption of any linked personal data PD such that any linked personal data PD can be decrypted by a single key. Step 15 comprises outputting the encrypted data ED or anonymised data AD. The data ED, AD may be stored (step 16), or communicated (step 17), or both stored and communicated.
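Steps 12 to 15 can be sketched as a toy pipeline. The `PD:` token markers stand in for a real detector, the `###` pattern is one of the obscuring options from step 13, and the SHA-256 hash in step 14 is a placeholder only: a real device needs a reversible cipher so the key holder can recover the data:

```python
import hashlib

def identify(captured: str) -> list:
    # step 12 stand-in: treat any token marked "PD:" as identified personal data
    return [t[3:] for t in captured.split() if t.startswith("PD:")]

def anonymise(captured: str, personal: list) -> str:
    # step 13: obscure each identified item with a fixed pattern
    for item in personal:
        captured = captured.replace("PD:" + item, "###")
    return captured

def encrypt(personal: list, key: bytes) -> list:
    # step 14 stand-in: hashing is one-way; a real device uses reversible encryption
    return [hashlib.sha256(key + p.encode()).hexdigest() for p in personal]

def process(captured: str, key: bytes = b"demo-key"):
    personal = identify(captured)                  # step 12
    anonymised = anonymise(captured, personal)     # step 13
    encrypted = encrypt(personal, key)             # step 14
    return anonymised, encrypted                   # step 15: both streams to output

anon, enc = process("car PD:AB12CDE enters car-park")
```

Note that no raw personal data survives in the anonymised output stream, which is the property the invention relies on.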
Figure 4 shows an example decryption method when applied to encrypted data ED produced via the method of the second aspect of the present invention. At step 21, a decryption key is input into a suitable device on which the encrypted data ED is stored. At step 22, the decryption key identifies the personal data PD within the encrypted data ED for decryption, and at step 23, identifies any personal data PD linked to the personal data to be decrypted (i.e. other instances of the same personal data PD within the encrypted data ED, or within the relevant subset of the encrypted data ED).
After all the relevant personal data PD is identified by the decryption key, the relevant personal data PD is decrypted at step 24. This decryption process ends with the outputting of the decrypted data DD at step 25, which it will be understood may not be identical to the captured data CD, depending upon the contents of the captured data CD and the decryption key used.
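The selective decryption of steps 22 to 25 can be illustrated as follows. The `crypt` function repeats the toy XOR-keystream construction (again, a stand-in for a real cipher), and the tagged store, function names and key scoping are assumptions for this sketch:

```python
import hashlib

def _keystream(key: bytes, n: int) -> bytes:
    out, counter = b"", 0
    while len(out) < n:
        out += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:n]

def crypt(data: bytes, key: bytes) -> bytes:
    # symmetric XOR: encrypting and decrypting are the same operation
    return bytes(a ^ b for a, b in zip(data, _keystream(key, len(data))))

def decrypt_linked(store, link_tag: str, key: bytes):
    """Steps 22-25: decrypt every instance carrying link_tag with the one
    supplied key; instances with other tags remain encrypted."""
    return [(tag, crypt(blob, key) if tag == link_tag else blob)
            for tag, blob in store]

key_a, key_b = b"key-a", b"key-b"
store = [("plate-1", crypt(b"AB12 CDE", key_a)),
         ("face-7",  crypt(b"face bytes", key_b)),
         ("plate-1", crypt(b"AB12 CDE", key_a))]  # same plate, second sighting
out = decrypt_linked(store, "plate-1", key_a)
```

Both sightings of the linked plate come back in clear with the single key, while the unrelated face record stays obscured, matching the described output of decrypted data DD that need not equal the full captured data CD.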
Figure 5 illustrates a data capture device according to the third aspect of the present invention. This particular embodiment is a camera fitted with the data anonymisation device of the first aspect of the present invention. The skilled person would understand basic camera architecture, but this is briefly explained below for clarity.
A camera 30 comprises a lens 31 operable to focus incoming light onto a charge-coupled device (CCD) 32. The CCD 32 is operable to convert the incoming light from the lens 31 into an analogue signal. This signal is output to an analogue to digital converter 33, which outputs a digital signal to an image processor 34. The incoming digital signal is processed to form an image, which is the captured data CD referred to above. The captured data CD is then output from the image processor 34 and input into the data input 2 of a data anonymisation device 1, as described above.
The data anonymisation device 1 operates to anonymise (and optionally encrypt) any personal data PD identified within the captured data CD, and output this anonymised data AD and/or encrypted data ED. The output means 5 of the device 1 is connected to a compressor 35, which is operable to compress the anonymised data AD and/or encrypted data ED, and transmit and/or store the compressed data as required. Figure 6A shows an embodiment of a system according to the fourth aspect of the present invention. An array of data capture devices, in this embodiment three video cameras 30, are arranged to capture, anonymise and optionally encrypt data as detailed above. These cameras are connected to a server 41, such that any data output by the cameras is operable to be stored, or otherwise operated on, by the server 41.
Figure 6B shows an alternate embodiment of the system according to the fourth aspect of the present invention. In this embodiment, an array of three cameras 42 (not containing data anonymisation devices) are each connected individually to separate data anonymisation devices 1. The captured data CD output from the cameras is input to the devices 1, which operate on the data as described above. Each of the devices 1 connects to a server 41. The server is operable to receive, store and perform operations on the data output by the devices 1. Figures 7A, B and C show the various types of anonymisation that can be performed by the above described devices, method and system. In each figure, identical captured data CD is input into a device 1 as described above. The device is operable to identify any personal data PD present and produce totally anonymised data AD1, data anonymised through insertion of a selective blurring effect AD2, and anonymisation via replacement of the personal data PD with a placeholder face AD3, as shown in Figures 7A, B and C respectively. The devices, method and system are also operable to provide an alternative pattern, rather than blurring or the insertion of placeholder faces, as will be understood.
The one or more embodiments are described above by way of example only. Many variations are possible without departing from the scope of protection afforded by the appended claims.

Claims (25)

  1. A data anonymisation device comprising: an input for receiving captured data; an identification unit operable to process said captured data and thereby identify personal data within the captured data; an anonymisation unit operable to obscure the identified personal data within the captured data so as to thereby provide anonymised data for output; and an output means for outputting the anonymised data.
  2. A device as claimed in claim 1 wherein the input is connected to a camera.
  3. A device as claimed in either claim 1 or 2 wherein the identification unit is operable to identify any one or more of faces, number plates, street names and house numbers, taxi license numbers, ID badges and other features that can uniquely identify a person.
  4. A device as claimed in any preceding claim wherein the identification unit is operable to link any personal data which is identified multiple times in the captured data.
  5. A device as claimed in any preceding claim wherein the identification unit is operable to link matching personal data across multiple sets of captured data.
  6. A device as claimed in any preceding claim wherein the anonymisation unit is operable to obscure personal data via blurring, deleting or replacing the personal data with a pattern or an alternate image.
  7. A device as claimed in any preceding claim wherein the device comprises an encryption unit operable to encrypt the captured data and/or the anonymised data.
  8. A device as claimed in claim 7 wherein the encrypted data is stored as a digital watermark in the anonymised data, or as an extension to the file containing the anonymised data.
  9. A device as claimed in either claim 7 or 8 wherein the encryption unit is operable to systematically encrypt each linked piece of personal data using an identical technique, such that only a single decryption key is required to decrypt all linked personal data.
  10. A device as claimed in any of claims 7 to 9 wherein the encryption unit is operable to encrypt certain subsets of the linked personal data.
  11. A device as claimed in any preceding claim wherein the device comprises a storage unit connected to the output of the device; the storage unit operable to store the anonymised data.
  12. A device as claimed in any preceding claim wherein the device comprises a communication unit connected to the output of the device; the communication unit operable to transmit the anonymised and/or encrypted data to a suitable receiver.
  13. A method of anonymising data, comprising the steps of: receiving captured data; processing the data in order to identify personal data within said captured data; and anonymising the identified personal data so as to provide an anonymised data output.
  14. A method as claimed in claim 13 comprising the step of linking any personal data which is identified multiple times in the captured data.
  15. A method as claimed in either claim 13 or 14 comprising the step of linking matching personal data across multiple sets of captured data.
  16. A method as claimed in any of claims 13-15 comprising the step of obscuring personal data via blurring, deleting or replacing the personal data with a pattern or an alternate image.
  17. A method as claimed in any of claims 13-16 comprising the step of encrypting the captured data and/or the anonymised data.
  18. A method as claimed in any of claims 13-17 comprising the step of storing the encrypted data as a digital watermark in the anonymised data, or as an extension to the file containing the anonymised data.
  19. A method as claimed in any of claims 13-18 comprising the step of systematically encrypting each linked piece of personal data using an identical technique, such that only a single decryption key is required to decrypt all linked personal data.
  20. A method as claimed in any of claims 13-19 comprising the step of encrypting certain subsets of the linked personal data.
  21. A method as claimed in any of claims 13-20 comprising the step of storing the anonymised data.
  22. A method as claimed in any of claims 13-21 comprising the step of transmitting the anonymised and/or encrypted data to a suitable receiver.
  23. A data capture device comprising a data anonymisation device according to any of claims 1-12 or operable according to the method of any of claims 13-22.
  24. A device as claimed in claim 23 wherein the data capture device is a camera.
  25. A data acquisition system comprising: one or more data capture devices, said data capture devices each in communication with one or more servers for storing/processing captured data, characterised in that: at least one data capture device is a data capture device according to either claim 23 or 24; and/or the system comprises a device according to any of claims 1-12 or a device operable according to the method of any of claims 13-22 placed in the data path between at least one data capture device and at least one server.
GB2002450.1A 2020-02-21 2020-02-21 Data anonymisation Pending GB2596037A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
GB2002450.1A GB2596037A (en) 2020-02-21 2020-02-21 Data anonymisation


Publications (2)

Publication Number Publication Date
GB202002450D0 GB202002450D0 (en) 2020-04-08
GB2596037A true GB2596037A (en) 2021-12-22

Family

ID=70108267

Family Applications (1)

Application Number Title Priority Date Filing Date
GB2002450.1A Pending GB2596037A (en) 2020-02-21 2020-02-21 Data anonymisation

Country Status (1)

Country Link
GB (1) GB2596037A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023129055A1 (en) * 2021-12-28 2023-07-06 Havelsan Hava Elektronik San. Ve Tic. A.S. Reliable in-camera anonymization method for machine learning/deep learning
EP4276771A1 (en) * 2022-05-11 2023-11-15 Sick Ag Method and device for anonymized image acquisition in an industrial installation

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20020188187A1 (en) * 2001-06-07 2002-12-12 Jordan Sarah E. System and method for removing sensitive data from diagnostic images
US20130004090A1 (en) * 2011-06-28 2013-01-03 Malay Kundu Image processing to prevent access to private information
EP2945098A1 (en) * 2014-05-13 2015-11-18 Xiaomi Inc. Method and device for hiding privacy information
US20160148019A1 (en) * 2014-11-26 2016-05-26 Ncr Corporation Secure image processing
US20190050592A1 (en) * 2018-09-27 2019-02-14 Intel IP Corporation Systems and methods for processing and handling privacy-sensitive image data
US20190377901A1 (en) * 2018-06-08 2019-12-12 Microsoft Technology Licensing, Llc Obfuscating information related to personally identifiable information (pii)



Also Published As

Publication number Publication date
GB202002450D0 (en) 2020-04-08

Similar Documents

Publication Publication Date Title
CN110663047B (en) Safe Convolutional Neural Network (CNN) accelerator
US11155725B2 (en) Method and apparatus for redacting video for compression and identification of releasing party
JP2011151770A (en) Image encrypting system for output of encrypted images subjected to undefining treatment of degree according to authorized browsing person
Honda et al. Hierarchical image-scrambling method with scramble-level controllability for privacy protection
GB2596037A (en) Data anonymisation
KR20120035299A (en) Image protection processing apparatus for privacy protection, and image security system and method using the same
US11082731B1 (en) Privacy-preserving video analytics
CN109635576B (en) Method and system for hiding data in image
JP2023534417A (en) Image delivery using synthetic re-encrypted images
JP2003319158A (en) Image processing system
CN111526325A (en) Privacy masking method and recording medium using morphological preserving encryption technique
WO2021084944A1 (en) Information processing system, information processing method, imaging device, and information processing device
US11463240B2 (en) Methods and image processing devices for encoding and decoding private data
JP5162732B2 (en) Image browsing system
TW202044797A (en) Sensor device and encryption method
WO2021175714A1 (en) Scrambling of regions of interest in an image to preserve privacy
JP5840804B2 (en) An image encryption system that outputs an encrypted image subjected to a blurring process having a strength corresponding to a viewing right holder.
KR101228480B1 (en) Apparatus and method for capturing multimedia data along with identifiable information
KR101731012B1 (en) System for managing transfer of personal image information
WO2022090738A1 (en) Selective video modification
US11159323B1 (en) Pseudonymous video data capture and query system
EP1386489B1 (en) Monitoring apparatus, computer program and network for secure data storage
KR101677111B1 (en) Device and method for privacy protection of dynamic image objects based on pedestrian face detection
Priya et al. Reversible Information Hiding in Videos
CN114710649B (en) Pollution source video monitoring method and system

Legal Events

Date Code Title Description
R108 Alteration of time limits (patents rules 1995)

Free format text: EXTENSION ALLOWED

Effective date: 20210824

Free format text: EXTENSION APPLICATION

Effective date: 20210820