WO2021129146A1 - 一种图片中隐私信息保护的处理方法及装置 - Google Patents

一种图片中隐私信息保护的处理方法及装置 Download PDF

Info

Publication number
WO2021129146A1
WO2021129146A1 (PCT/CN2020/125306)
Authority
WO
WIPO (PCT)
Prior art keywords
sensitive
picture
noise
processed
pictures
Prior art date
Application number
PCT/CN2020/125306
Other languages
English (en)
French (fr)
Inventor
宗志远
Original Assignee
支付宝(杭州)信息技术有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 支付宝(杭州)信息技术有限公司 filed Critical 支付宝(杭州)信息技术有限公司
Publication of WO2021129146A1 publication Critical patent/WO2021129146A1/zh

Links

Images

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/60Protecting data
    • G06F21/602Providing cryptographic facilities or services
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2221/00Indexing scheme relating to security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F2221/21Indexing scheme relating to G06F21/00 and subgroups addressing additional information or applications relating to security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F2221/2107File encryption

Definitions

  • This specification belongs to the field of computer technology, and in particular relates to a processing method and device for protecting private information in pictures.
  • the purpose of the embodiments of this specification is to provide a processing method and device for protecting private information in pictures, which reduces the ability of illegal users to identify sensitive information.
  • One aspect of this specification provides a processing method for protecting private information in pictures, including: obtaining the sensitive picture to be processed; locating the sensitive position of the sensitive information in the sensitive picture to be processed; generating adversarial noise using an adversarial sample generation method; and synthesizing the generated adversarial noise into the picture at the sensitive position of the sensitive picture to be processed and storing the synthesized picture.
  • Another aspect of this specification provides a processing device for protecting private information in pictures, including: a picture acquisition module for acquiring the sensitive picture to be processed; a sensitive position locating module for locating the sensitive position of the sensitive information in the sensitive picture to be processed; a noise generation module for generating adversarial noise using an adversarial sample generation method; and a noise synthesis module for synthesizing the generated adversarial noise into the picture at the sensitive position of the sensitive picture to be processed and storing the synthesized picture.
  • A further aspect of this specification provides a processing device for protecting private information in pictures, including at least one processor and a memory for storing processor-executable instructions; when the processor executes the instructions, the above processing method for protecting private information in pictures is implemented.
  • The processing methods, devices, and processing equipment for protecting private information in pictures provided in this specification generate adversarial noise through an adversarial sample generation method, locate the sensitive position where the sensitive information in the sensitive picture to be processed is located, and add the adversarial noise at that sensitive position to obtain a synthesized picture. As a result, picture recognition algorithms or models (such as OCR algorithms) cannot recognize the sensitive information in the processed picture, while the visual quality of the picture itself, and thus the human viewing experience, is not affected. Normal business use requires no additional processing such as decryption, so normal business is not disrupted, while the automated ability of illegal users to identify sensitive information is reduced and the privacy of users is protected.
  • Fig. 1 is a schematic flowchart of a processing method for protecting private information in a picture in an embodiment of this specification.
  • Fig. 2 is a schematic diagram of a sensitive position in a scanned ID card in an embodiment of this specification.
  • Fig. 3 is a schematic structural diagram of a processing method for protecting private information in a picture in another embodiment of this specification.
  • Fig. 4 is a schematic diagram of the module structure of an embodiment of a processing device for protecting private information in pictures provided in this specification.
  • Fig. 5 is a block diagram of the hardware structure of a processing server for protecting private information in pictures in an embodiment of this specification.
  • Some data is stored or exchanged in the form of pictures, such as ID card scans, invoice photos, and scans of medical records. Such pictures may carry personally sensitive information such as name, ID number, date of birth, photo, company name, company address, medical condition, and so on; protecting this picture-type sensitive data is an important task, since illegal users typically use OCR (Optical Character Recognition) technology to automatically extract the sensitive text from such pictures.
  • OCR can represent a process in which an electronic device (such as a scanner or a digital camera) examines characters printed on paper, determines their shapes by detecting patterns of dark and light, and then translates those shapes into computer text using character recognition methods. That is, for printed characters, the text in a paper document is optically converted into a black-and-white dot-matrix image file, and recognition software converts the text in the image into a text format for further editing by word-processing software.
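  • As a minimal illustration of the kind of automated extraction the method defends against, the following sketch runs OCR over a scanned document (assuming the pytesseract wrapper and a local Tesseract install; the file name is illustrative):

```python
# Minimal sketch: automated OCR extraction of text from a scanned document.
# Assumes pytesseract and Pillow are installed; the file name is hypothetical.
from PIL import Image
import pytesseract

scan = Image.open("id_card_scan.png")       # hypothetical scanned ID card
text = pytesseract.image_to_string(scan)    # sensitive fields come out as plain text
print(text)
```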
  • The embodiments of this specification provide a processing method for protecting private information in pictures, which adds adversarial noise to pictures containing sensitive information so as to reduce the recognition ability of OCR algorithms: the OCR algorithm can no longer accurately identify, or incorrectly identifies, the sensitive information in the picture. This prevents illegal users from profiting automatically, buys valuable time for investigating data leaks, and raises the cost for illegal users to monetize the data.
  • The processing method for protecting private information in pictures described in this specification can be applied to a client or a server.
  • the client can be a smart phone, a tablet, a smart wearable device (smart watch, etc.), a smart car device, and other electronic devices.
  • Fig. 1 is a schematic flowchart of a processing method for protecting private information in a picture in an embodiment of this specification. As shown in Fig. 1, the processing method for protecting private information in a picture provided in an embodiment of this specification may include steps 102-108.
  • Step 102 Acquire sensitive pictures to be processed.
  • The sensitive picture to be processed is a picture containing sensitive information (which may include private information of individuals or enterprises), and may be a photo or an electronic scan, such as the ID card scans, invoice photos, and medical record scans mentioned above. It can be obtained from a database storing sensitive pictures, or obtained directly when the user uploads it; the embodiments of this specification do not specifically limit the source. Normally, sensitive pictures containing sensitive information are stored encrypted in the database and must be decrypted before use, so a sensitive picture obtained from the database may be encrypted.
  • In some embodiments of this specification, if the obtained picture is encrypted, it is first decrypted to obtain the original sensitive picture, and the decrypted original picture is used as the sensitive picture to be processed in the embodiments of this specification.
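  • A minimal sketch of this preprocessing step, assuming the pictures are stored with symmetric encryption via the cryptography library's Fernet (key management is hypothetical and out of scope):

```python
# Sketch: decrypt an encrypted picture fetched from the database before processing.
from io import BytesIO
from cryptography.fernet import Fernet
from PIL import Image

def load_sensitive_picture(encrypted_bytes: bytes, key: bytes) -> Image.Image:
    plaintext = Fernet(key).decrypt(encrypted_bytes)  # raises InvalidToken if the data is corrupt
    return Image.open(BytesIO(plaintext))             # the original sensitive picture to be processed
```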
  • Step 104 Locate the sensitive position of the sensitive information in the sensitive image to be processed.
  • The sensitive position indicates where the sensitive information is located in the picture; usually, the positions of sensitive information in such pictures are relatively fixed.
  • Figure 2 is a schematic diagram of sensitive locations in a scanned ID card in an embodiment of this specification.
  • As shown in Fig. 2, the sensitive information in a scanned ID card generally consists of the portrait photo, the ID number, and the home address: the portrait photo is usually located in the upper-right part of the scan, the ID number at the bottom, and the home address at the middle left.
  • The positions of sensitive information in sensitive pictures such as medical record scans and invoice photos are generally also relatively fixed, so the sensitive position of the sensitive information in the sensitive picture to be processed can be located according to the type of the picture.
  • If the sensitive picture to be processed is a scanned ID card, the upper-right, bottom-right, and middle-left areas of the picture can be used as the sensitive positions.
  • If required, the name and date of birth on the scanned ID card can also be treated as sensitive information, and their positions located as well.
  • The size of each sensitive region can be set according to the size of the corresponding sensitive information on the physical document and adjusted according to the size of the picture: for a relatively large picture, a relatively large area can be designated as the sensitive position; for a relatively small picture, the sensitive region can be reduced accordingly. The exact sizes can be set according to actual needs.
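  • As an illustration only, fixed regions per document type can be expressed as fractions of the picture size so that they scale with resolution; the coordinate values below are assumptions, not taken from this specification:

```python
# Illustrative sensitive regions for an ID-card scan, as (x1, y1, x2, y2) fractions of width/height.
SENSITIVE_REGIONS = {
    "id_card_scan": {
        "photo":   (0.62, 0.10, 0.95, 0.55),   # upper right
        "id_no":   (0.30, 0.80, 0.95, 0.95),   # bottom, toward the right
        "address": (0.05, 0.45, 0.60, 0.75),   # middle left
    },
}

def to_pixels(region, width, height):
    """Convert a fractional region to pixel coordinates for a given picture size."""
    x1, y1, x2, y2 = region
    return int(x1 * width), int(y1 * height), int(x2 * width), int(y2 * height)
```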
  • In some embodiments of this specification, locating the sensitive position of the sensitive information in the sensitive picture to be processed may include: training and constructing a sensitive position locating model in advance using historical sensitive pictures and the sensitive positions labeled in them; and using the sensitive position locating model to locate the sensitive position of the sensitive information in the sensitive picture to be processed.
  • In a specific implementation, the sensitive positions where sensitive information appears can first be labeled in a number of historical sensitive pictures; the labeled pictures are used as training inputs and the labeled positions as training targets to construct the sensitive position locating model. After the sensitive picture to be processed is obtained, it can be fed into the trained model, which automatically locates the sensitive position of the sensitive information in it.
  • the sensitive location positioning model can use the Faster RCNN model, and the Faster RCNN model can be understood as a target detection model based on deep learning.
  • In the embodiments of this specification, the model is trained in advance on historical sensitive pictures with labeled sensitive positions, and the sensitive positions of the sensitive information in pictures to be processed are then recognized automatically by the model, enabling fast, automated identification of sensitive positions and improving picture processing efficiency.
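  • A minimal sketch of such a locating model, assuming torchvision's Faster R-CNN implementation; the label set and score threshold are illustrative assumptions:

```python
import torch
import torchvision
from torchvision.models.detection.faster_rcnn import FastRCNNPredictor

NUM_CLASSES = 4  # background + photo + id_number + address (assumed label set)

def build_locator():
    # Start from a pretrained detector and replace the head with the sensitive-information classes.
    model = torchvision.models.detection.fasterrcnn_resnet50_fpn(weights="DEFAULT")
    in_features = model.roi_heads.box_predictor.cls_score.in_features
    model.roi_heads.box_predictor = FastRCNNPredictor(in_features, NUM_CLASSES)
    return model  # then fine-tune on historical sensitive pictures with labeled boxes

def locate_sensitive_positions(model, image_tensor, score_threshold=0.7):
    """Return bounding boxes and labels of sensitive regions for one (C, H, W) image tensor."""
    model.eval()
    with torch.no_grad():
        pred = model([image_tensor])[0]
    keep = pred["scores"] >= score_threshold
    return pred["boxes"][keep], pred["labels"][keep]
```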
  • Step 106 Generate adversarial noise using an adversarial sample generation method.
  • Adversarial samples can be understood as input samples formed by deliberately adding subtle interference in the data set, which causes the model to give an incorrect output with high confidence.
  • An adversarial sample generation method can generate adversarial noise targeted at a specific model, for example adversarial noise aimed at an OCR algorithm.
  • The adversarial noise can itself be understood as a kind of adversarial sample, used to interfere with the recognition results of sensitive-information recognition algorithms such as OCR.
  • The adversarial sample generation method can be selected according to actual needs, for example FGSM (Fast Gradient Sign Method). FGSM can be understood as a method that induces the network to misclassify the generated image by adding a perturbation in the direction of the gradient.
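  • A minimal white-box FGSM sketch (PyTorch assumed; `model` and `loss_fn` stand for whatever differentiable recognizer is being attacked), for contrast with the black-box methods preferred below:

```python
import torch

def fgsm_noise(model, loss_fn, image, target, epsilon=0.02):
    """Return an additive perturbation of size epsilon in the sign of the input gradient."""
    image = image.clone().detach().requires_grad_(True)
    loss = loss_fn(model(image), target)
    loss.backward()
    return epsilon * image.grad.sign()   # adversarial noise to add to the image
```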
  • In some embodiments of this specification, a black-box adversarial sample generation method may be used to generate the adversarial noise.
  • the adversarial sample generation method for black box attacks can be understood as a method of generating adversarial samples that are effective against various algorithms without knowing the specific model used (such as the OCR algorithm).
  • The embodiments of this specification adopt a black-box adversarial sample generation method, which can generate adversarial samples, i.e. adversarial noise, applicable to various algorithms, providing the data basis for the subsequent processing of the sensitive picture to be processed.
  • In some embodiments of this specification, the black-box adversarial sample generation method can be the boundary attack (which can be understood as first adding a large perturbation and then exploring along the decision boundary) or the one-pixel attack (which can be understood as achieving an attack by changing the value of a single image pixel). Either the boundary attack or the one-pixel attack can generate adversarial noise against different algorithms and models and reduce their recognition ability.
  • The principle of the boundary attack can be understood as follows: initialize from a point that is already adversarial, then perform a random walk along the boundary between the adversarial region (where the model misclassifies) and the non-adversarial region (where the model classifies correctly).
  • The specific algorithm proceeds roughly as follows:
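  • A simplified sketch of that random walk; the plain Gaussian proposal and fixed step size below are assumptions standing in for the original proposal distribution (an orthogonal step followed by a step toward the original image):

```python
import numpy as np

def boundary_attack(original, is_adversarial, max_steps=1000, step_scale=0.01, rng=None):
    """original: float array in [0, 1]; is_adversarial: callable implementing the adversarial criterion."""
    rng = rng or np.random.default_rng()
    # Initialization: a random image that is already adversarial.
    candidate = rng.uniform(0.0, 1.0, size=original.shape)
    while not is_adversarial(candidate):
        candidate = rng.uniform(0.0, 1.0, size=original.shape)
    for _ in range(max_steps):
        # Random perturbation plus a small pull toward the original image.
        eta = step_scale * rng.standard_normal(original.shape)
        proposal = np.clip(candidate + eta + step_scale * (original - candidate), 0.0, 1.0)
        if is_adversarial(proposal):   # only accept steps that stay on the adversarial side
            candidate = proposal
    return candidate                    # adversarial example that stays close to the original
```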
  • The principle of the one-pixel attack can be understood as follows: suppose an input image is represented by a vector in which each scalar element is one pixel. Let f be the target image classifier receiving n-dimensional inputs, and let x = (x1, ..., xn) be the original natural image correctly classified as class t, so that the probability of x belonging to class t is f_t(x). The vector e(x) = (e1, ..., en) is an additive adversarial perturbation with respect to the target class adv, limited by a maximum modification L, where L is measured by the length of the vector e(x). In a targeted attack, the adversary's goal is to find the optimized solution e(x)* for the following problem:
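  • The formula itself is not reproduced in this text; written in the notation just introduced, the standard one-pixel-attack objective (an assumption consistent with that notation) is:

```latex
\max_{e(x)^{*}} \; f_{adv}\bigl(x + e(x)\bigr) \quad \text{subject to} \quad \lVert e(x) \rVert_{0} \le L
```

  • For the one-pixel attack, only a single element of e(x) is allowed to be non-zero, i.e. L = 1.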
  • Step 108 Synthesize the generated adversarial noise into the picture at the sensitive position of the sensitive picture to be processed, and store the synthesized picture.
  • After the adversarial noise is obtained, it can be combined with the sensitive picture to be processed to obtain a new picture.
  • Specifically, the adversarial noise can be synthesized into the picture content at the located sensitive position of the sensitive picture to be processed, so as to interfere with the recognition of its sensitive information by a model or algorithm.
  • For example, the pixel values of the picture at the sensitive position can be changed according to the adversarial noise; the adversarial noise is normally very subtle and generally changes only a small fraction of the pixel values.
  • The synthesized picture is then stored for later use; specifically, it can be stored in a designated database or another data storage device.
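  • A minimal sketch of this synthesis step, assuming pictures and noise are float arrays in [0, 1] and that the sensitive position is given as a pixel box (x1, y1, x2, y2), which is an assumed format:

```python
import numpy as np

def synthesize_noise(image, noise, box):
    """Add adversarial noise only inside the sensitive region and clip to the valid range."""
    x1, y1, x2, y2 = box
    protected = image.copy()
    region = protected[y1:y2, x1:x2]
    protected[y1:y2, x1:x2] = np.clip(region + noise[: y2 - y1, : x2 - x1], 0.0, 1.0)
    return protected
```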
  • The processing method for protecting private information in pictures provided by the embodiments of this specification generates adversarial noise through an adversarial sample generation method and adds it at the sensitive position where the sensitive information of the sensitive picture to be processed is located, so that picture recognition algorithms or models such as OCR cannot identify the sensitive information in the processed picture, while the visual quality of the picture itself, and thus the human viewing experience, is not affected. Normal business use requires no additional processing such as decryption, so normal business is not disrupted, while the automated ability of illegal users to identify sensitive information is reduced and the security of users' private information in the picture is improved.
  • On the basis of the foregoing embodiments, in some embodiments of this specification the method may further include: obtaining the clarity of the synthesized picture; if the clarity is less than a preset threshold, adjusting the adversarial noise and synthesizing the adjusted adversarial noise into the sensitive position of the sensitive picture to be processed to obtain a processed sensitive picture, repeating this until the clarity of the processed sensitive picture is greater than or equal to the preset threshold; and storing the resulting sensitive picture whose clarity is greater than or equal to the preset threshold.
  • In a specific implementation, the clarity of the picture after the noise has been added can be obtained; clarity can be determined from the picture's resolution, bit rate, pixels, and so on. If the clarity of the synthesized picture is below the preset threshold, the synthesized picture may be considered to affect visual quality, and the adversarial noise can be adjusted: for example, a different adversarial sample generation method can be used to generate new noise, or the parameters of the existing noise can be fine-tuned. The regenerated adversarial noise is then synthesized into the sensitive position to obtain a new processed sensitive picture, its clarity is measured again, and the process repeats until the clarity is greater than or equal to the preset threshold, which can itself be set according to actual needs.
  • The clarity of the picture can also be verified manually: for example, staff can check whether the sensitive picture synthesized with the adversarial noise is still clear and whether the visual quality is affected, i.e. whether the information in the synthesized picture remains legible. If the visual quality is unaffected, the picture is stored; otherwise the adversarial noise is adjusted and a new sensitive picture is synthesized.
  • In the embodiments of this specification, when the clarity of the picture synthesized with adversarial noise does not meet the requirement, the adversarial noise is adjusted and a new picture is synthesized, ensuring that the sensitive picture with adversarial noise added does not degrade visual quality or interfere with normal business use of sensitive pictures, while still protecting the privacy of the users in the pictures.
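  • A sketch of this adjust-and-recheck loop; the variance-of-Laplacian clarity score, the threshold value, and the `generate_noise` callable are assumptions (the specification leaves the metric open), and `synthesize_noise` is the helper from the earlier sketch:

```python
import cv2
import numpy as np

def clarity(image):
    """Variance of the Laplacian as a simple clarity/sharpness score."""
    gray = cv2.cvtColor(image.astype(np.float32), cv2.COLOR_BGR2GRAY)
    return cv2.Laplacian(gray, cv2.CV_64F).var()

def protect_until_clear(image, box, generate_noise, threshold=100.0, max_tries=10):
    for _ in range(max_tries):
        candidate = synthesize_noise(image, generate_noise(), box)
        if clarity(candidate) >= threshold:
            return candidate                # clear enough: store this picture
    raise RuntimeError("clarity threshold not met; adjust the noise generation parameters")
```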
  • FIG. 3 is a schematic structural diagram of a processing method for protecting private information in a picture in another embodiment of this specification.
  • the process of the processing method for protecting private information in a picture in an embodiment of this specification will be specifically introduced below in conjunction with FIG. 3.
  • 1) The originally encrypted sensitive data is decrypted to obtain the original picture, i.e. the sensitive picture to be processed in the embodiments above.
  • 2) The sensitive position of the sensitive information is located in the original picture, i.e. the picture region that needs noise added is marked.
  • 3) An adversarial sample generator is used to generate adversarial noise in a targeted manner; the purpose of this noise is to perturb the sensitive information. The embodiments of this specification adopt a black-box adversarial sample generation approach, with boundary attack and one-pixel attack as specific generation algorithms.
  • 4) The original picture and the adversarial noise are synthesized into a new picture. Note that the goal of adding the noise is to make the OCR algorithm unable to recognize the content while the change remains indistinguishable to the human eye, so this step must not noticeably affect the visual quality of the picture itself.
  • 5) The noised picture is stored in a new database for the business system to call. In this way, even if illegal users steal these noised sensitive pictures, they still cannot automatically and accurately identify the sensitive information in them, which greatly raises the cost of profiting from the data.
  • Alternatively, Gaussian noise or salt-and-pepper noise could be added directly to the original picture, i.e. the sensitive picture to be processed, to impair the recognition ability of image recognition algorithms such as OCR, or parts of the sensitive-information image could be deformed as a countermeasure; however, these approaches may impair visual recognition by normal users and interfere with normal business use.
  • The embodiments of this specification protect sensitive picture information by adding adversarial sample noise, avoiding the problems of encryption-based approaches, where normal business cannot read the pictures and decryption incurs significant overhead; compared with behavior- and permission-anomaly detection, it also guards against unknown risks in advance.
  • The processing method for protecting private information in pictures provided in the embodiments of this specification can be deployed in data-security sensitive-data protection systems and user behavior analysis systems to reduce the recognition ability of OCR algorithms and thus the automated recognition ability on which illegal users rely for profit, while still meeting the needs of normal business (with no loss of the human visual experience).
  • one or more embodiments of this specification also provide a processing device for protecting private information in pictures.
  • the described devices may include systems (including distributed systems), software (applications), modules, components, servers, clients, etc., which use the methods described in the embodiments of this specification, combined with necessary implementation hardware devices.
  • the devices in one or more embodiments provided in the embodiments of this specification are as described in the following embodiments. Since the implementation scheme of the device to solve the problem is similar to the method, the implementation of the specific device in the embodiment of this specification can refer to the implementation of the aforementioned method, and the repetition will not be repeated.
  • As used below, the term “unit” or “module” refers to a combination of software and/or hardware that implements a predetermined function. Although the devices described in the following embodiments are preferably implemented in software, implementation in hardware, or in a combination of software and hardware, is also possible and contemplated.
  • FIG. 4 is a schematic diagram of the module structure of an embodiment of the processing device for privacy information protection in pictures provided in this specification.
  • As shown in Fig. 4, the processing device for protecting private information in pictures provided in this specification may include: a picture acquisition module 41, a sensitive position locating module 42, a noise generation module 43, and a noise synthesis module 44.
  • the picture acquisition module 41 may be used to acquire sensitive pictures to be processed.
  • the sensitive position locating module 42 may be used to locate the sensitive position of the sensitive information in the sensitive picture to be processed.
  • The noise generation module 43 can be used to generate adversarial noise using an adversarial sample generation method.
  • The noise synthesis module 44 may be used to synthesize the generated adversarial noise into the picture at the sensitive position of the sensitive picture to be processed, and to store the synthesized picture.
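  • An illustrative composition of the four modules of Fig. 4 into a single pipeline; the wrapped callables are assumptions rather than the specification's API:

```python
class PrivacyProtectionDevice:
    def __init__(self, acquire, locate, generate_noise, synthesize, store):
        self.acquire = acquire                  # picture acquisition module 41
        self.locate = locate                    # sensitive position locating module 42
        self.generate_noise = generate_noise    # noise generation module 43
        self.synthesize = synthesize            # noise synthesis module 44
        self.store = store                      # storage of the synthesized picture

    def process(self, picture_id):
        picture = self.acquire(picture_id)
        box = self.locate(picture)
        noise = self.generate_noise(picture, box)
        protected = self.synthesize(picture, noise, box)
        self.store(picture_id, protected)
        return protected
```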
  • The processing device for protecting private information in pictures provided in the embodiments of this specification generates adversarial noise through an adversarial sample generation method and adds it at the sensitive position where the sensitive information of the sensitive picture to be processed is located, so that picture recognition algorithms or models such as OCR cannot identify the sensitive information in the processed picture, while the visual quality of the picture itself, and thus the human viewing experience, is not affected. Normal business use requires no additional processing such as decryption, so normal business is not disrupted, while the automated ability of illegal users to identify sensitive information is reduced.
  • On the basis of the foregoing embodiments, in some embodiments of this specification the sensitive position locating module includes: a model construction unit for training a sensitive position locating model in advance using historical sensitive pictures and the sensitive positions labeled in them; and a position locating unit for locating the sensitive position of the sensitive information in the sensitive picture to be processed using the sensitive position locating model.
  • In the embodiments of this specification, the model is trained in advance on historical sensitive pictures with labeled sensitive positions, and the sensitive positions of the sensitive information in pictures to be processed are then recognized automatically by the model, enabling fast, automated identification of sensitive positions and improving picture processing efficiency.
  • the noise generation module is specifically used for:
  • generating the adversarial noise using a black-box adversarial sample generation method.
  • In the embodiments of this specification, the black-box adversarial sample generation method can generate adversarial samples, i.e. adversarial noise, applicable to various algorithms, providing the data basis for the subsequent processing of the sensitive picture to be processed.
  • On the basis of the foregoing embodiments, in some embodiments of this specification the black-box adversarial sample generation method used by the noise generation module includes the boundary attack or the one-pixel attack. Using the boundary attack or the one-pixel attack can generate adversarial noise against different algorithms and models and reduce their recognition ability.
  • On the basis of the foregoing embodiments, in some embodiments of this specification the device further includes an image adjustment module, configured to: obtain the clarity of the synthesized picture; if the clarity is less than a preset threshold, adjust the adversarial noise and synthesize the adjusted adversarial noise into the sensitive position of the sensitive picture to be processed to obtain a processed sensitive picture, until the clarity of the processed sensitive picture is greater than or equal to the preset threshold; and store the sensitive picture whose clarity is greater than or equal to the preset threshold.
  • In the embodiments of this specification, the adversarial noise is adjusted and a new picture is synthesized when needed, ensuring that the sensitive picture with adversarial noise added does not degrade visual quality or interfere with normal business use of sensitive pictures.
  • the above-mentioned device may also include other implementation manners according to the description of the method embodiment.
  • the specific implementation manner reference may be made to the description of the corresponding method embodiment above, which will not be repeated here.
  • The embodiments of this specification also provide a processing device for protecting private information in pictures, including at least one processor and a memory for storing processor-executable instructions; when the processor executes the instructions, the processing method for protecting private information in pictures of the above embodiments is implemented, for example: obtaining the sensitive picture to be processed; locating the sensitive position of the sensitive information in the sensitive picture to be processed; generating adversarial noise using an adversarial sample generation method; and synthesizing the generated adversarial noise into the picture at the sensitive position of the sensitive picture to be processed and storing the synthesized picture.
  • processing device may also include other implementation manners according to the description of the method embodiment.
  • specific implementation manner reference may be made to the description of the corresponding method embodiment above, which will not be repeated here.
  • The processing device or processing equipment for protecting private information in pictures provided in this specification can also be applied to a variety of data analysis and processing systems.
  • the device or processing device may include a processing device for protecting privacy information in any picture in the foregoing embodiments.
  • The device or processing equipment may be a separate server, or it may include a server cluster, a system (including a distributed system), software (an application), an actual operating device, a logic gate circuit device, a quantum computer, or the like that uses one or more of the methods or devices of the embodiments of this specification, combined with the necessary terminal hardware.
  • the detection system for checking difference data may include at least one processor and a memory storing computer-executable instructions, and the processor implements the steps of the method in any one or more of the foregoing embodiments when executing the instructions.
  • FIG. 5 is a hardware structural block diagram of a processing server for protecting private information in a picture in an embodiment of this specification.
  • The server may be the processing device or the processing equipment for protecting private information in pictures of the above embodiments. As shown in Fig. 5, the server 10 may include one or more processors 100 (only one is shown in the figure; the processor 100 may include, but is not limited to, a processing device such as a microprocessor (MCU) or a programmable logic device (FPGA)), a memory 200 for storing data, and a transmission module 300 for communication functions.
  • FIG. 5 is only for illustration, and it does not limit the structure of the above-mentioned electronic device.
  • The server 10 may also include more or fewer components than shown in Fig. 5; for example, it may include other processing hardware, such as a database, a multi-level cache, or a GPU, or it may have a configuration different from that shown in Fig. 5.
  • The memory 200 can be used to store software programs and modules of application software, such as the program instructions/modules corresponding to the processing method for protecting private information in pictures in the embodiments of this specification; the processor 100 runs the software programs and modules stored in the memory 200 so as to perform various functional applications and resource data updates.
  • the memory 200 may include a high-speed random access memory, and may also include a non-volatile memory, such as one or more magnetic storage devices, flash memory, or other non-volatile solid-state memory.
  • the memory 200 may further include a memory remotely provided with respect to the processor 100, and these remote memories may be connected to a computer terminal through a network. Examples of the aforementioned networks include, but are not limited to, the Internet, corporate intranets, local area networks, mobile communication networks, and combinations thereof.
  • the transmission module 300 is used to receive or send data via a network.
  • the above-mentioned specific examples of the network may include a wireless network provided by a communication provider of a computer terminal.
  • the transmission module 300 includes a network adapter (Network Interface Controller, NIC), which can be connected to other network devices through a base station so as to communicate with the Internet.
  • the transmission module 300 may be a radio frequency (RF) module, which is used to communicate with the Internet in a wireless manner.
  • the methods or devices described in the above embodiments provided in this specification can implement business logic through computer programs and are recorded on a storage medium, and the storage medium can be read and executed by a computer to achieve the effects of the solutions described in the embodiments of this specification.
  • the storage medium may include a physical device for storing information, and the information is usually digitized and then stored in an electric, magnetic, or optical medium.
  • The storage medium may include: devices that store information using electrical energy, such as various types of memory, e.g. RAM and ROM; devices that store information using magnetic energy, such as hard disks, floppy disks, magnetic tapes, magnetic core memories, bubble memories, and USB flash drives; and devices that store information optically, such as CDs or DVDs.
  • Of course, there are also other forms of readable storage media, such as quantum memory, graphene memory, and so on.
  • The processing method or device for protecting private information in pictures provided in the embodiments of this specification can be implemented by a processor executing corresponding program instructions on a computer, for example implemented on a PC in C++ under the Windows operating system or under Linux, implemented on smart terminals using the Android or iOS programming languages, or implemented as processing logic based on a quantum computer.
  • It should be noted that the devices, computer storage media, and systems described above in this specification may also include other implementations, as described in the related method embodiments; for specific implementations, refer to the description of the corresponding method embodiments, which will not be repeated here.
  • In the 1990s, an improvement to a technology could be clearly distinguished as either a hardware improvement (for example, an improvement to a circuit structure such as a diode, transistor, or switch) or a software improvement (an improvement to a method flow).
  • the improvement of many methods and processes of today can be regarded as a direct improvement of the hardware circuit structure.
  • Designers almost always get the corresponding hardware circuit structure by programming the improved method flow into the hardware circuit. Therefore, it cannot be said that the improvement of a method flow cannot be realized by the hardware entity module.
  • For example, a programmable logic device (PLD), such as a Field Programmable Gate Array (FPGA), is an integrated circuit whose logic functions are determined by the user's programming of the device.
  • Such programming is written in a hardware description language (HDL); there is not just one HDL but many, such as ABEL (Advanced Boolean Expression Language), AHDL (Altera Hardware Description Language), Confluence, CUPL (Cornell University Programming Language), HDCal, JHDL (Java Hardware Description Language), Lava, Lola, MyHDL, PALASM, and RHDL (Ruby Hardware Description Language), with VHDL (Very-High-Speed Integrated Circuit Hardware Description Language) and Verilog being the most commonly used at present.
  • the controller can be implemented in any suitable manner.
  • For example, the controller can take the form of a microprocessor or processor together with a computer-readable medium storing computer-readable program code (such as software or firmware) executable by the (micro)processor, logic gates, switches, an application-specific integrated circuit (ASIC), a programmable logic controller, or an embedded microcontroller. Examples of controllers include, but are not limited to, the following microcontrollers: ARC 625D, Atmel AT91SAM, Microchip PIC18F26K20, and Silicon Labs C8051F320; a memory controller can also be implemented as part of the memory's control logic.
  • Those skilled in the art also know that, in addition to implementing the controller purely as computer-readable program code, the method steps can be logically programmed so that the controller achieves the same functions in the form of logic gates, switches, application-specific integrated circuits, programmable logic controllers, embedded microcontrollers, and the like. Such a controller can therefore be regarded as a hardware component, and the means it contains for realizing various functions can also be regarded as structures within the hardware component; or the means for realizing various functions can even be regarded as both software modules implementing the method and structures within the hardware component.
  • The systems, devices, modules, or units described in the above embodiments may be implemented by a computer chip or entity, or by a product having certain functions. A typical implementation device is a computer, which may be, for example, a personal computer, a laptop computer, an in-vehicle human-computer interaction device, a cellular phone, a camera phone, a smart phone, a personal digital assistant, a media player, a navigation device, an email device, a game console, a tablet computer, a wearable device, or any combination of these devices.
  • the functions are divided into various modules and described separately.
  • the function of each module can be realized in the same one or more software and/or hardware, or the module that realizes the same function can be realized by a combination of multiple sub-modules or sub-units, etc. .
  • the device embodiments described above are merely illustrative.
  • The division into units is only a division by logical function; in actual implementation there may be other ways of dividing them, for example multiple units or components may be combined or integrated into another system, or some features may be ignored or not implemented.
  • the displayed or discussed mutual coupling or direct coupling or communication connection may be indirect coupling or communication connection through some interfaces, devices or units, and may be in electrical, mechanical or other forms.
  • These computer program instructions can also be stored in a computer-readable memory that can guide a computer or other programmable resource data update equipment to work in a specific manner, so that the instructions stored in the computer-readable memory produce an article of manufacture including the instruction device.
  • the instruction device implements the functions specified in one process or multiple processes in the flowchart and/or one block or multiple blocks in the block diagram.
  • These computer program instructions can also be loaded on a computer or other programmable resource data update equipment, so that a series of operation steps are executed on the computer or other programmable equipment to produce computer-implemented processing, which can be executed on the computer or other programmable equipment.
  • the instructions provide steps for implementing the functions specified in one process or multiple processes in the flowchart and/or one block or multiple blocks in the block diagram.
  • the computing device includes one or more processors (CPUs), input/output interfaces, network interfaces, and memory.
  • the memory may include non-permanent memory in a computer-readable medium, random access memory (RAM) and/or non-volatile memory, such as read-only memory (ROM) or flash memory (flash RAM).
  • Computer-readable media include permanent and non-permanent, removable and non-removable media, and information storage can be realized by any method or technology.
  • the information can be computer-readable instructions, data structures, program modules, or other data.
  • Examples of computer storage media include, but are not limited to, phase change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technology, CD-ROM, digital versatile disc (DVD) or other optical storage, Magnetic cassettes, magnetic tape magnetic disk storage, graphene storage or other magnetic storage devices or any other non-transmission media can be used to store information that can be accessed by computing devices. According to the definition in this article, computer-readable media does not include transitory media, such as modulated data signals and carrier waves.
  • Those skilled in the art should understand that one or more embodiments of this specification can be provided as a method, a system, or a computer program product. Therefore, one or more embodiments of this specification may take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment combining software and hardware. Moreover, one or more embodiments of this specification may take the form of a computer program product implemented on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, and optical storage) containing computer-usable program code.
  • One or more embodiments of this specification may be described in the general context of computer-executable instructions executed by a computer, such as program modules.
  • program modules include routines, programs, objects, components, data structures, etc. that perform specific tasks or implement specific abstract data types.
  • One or more embodiments of this specification can also be practiced in a distributed computing environment. In these distributed computing environments, tasks are performed by remote processing devices connected through a communication network.
  • program modules can be located in local and remote computer storage media including storage devices.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Bioethics (AREA)
  • General Health & Medical Sciences (AREA)
  • Computer Hardware Design (AREA)
  • Computer Security & Cryptography (AREA)
  • Software Systems (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Image Processing (AREA)

Abstract

A processing method and device for protecting private information in pictures. The method includes: generating adversarial noise by an adversarial sample generation method; locating the sensitive position where the sensitive information in the sensitive picture to be processed is located; and adding the adversarial noise at that sensitive position to obtain a synthesized picture. With the adversarial noise added, picture recognition algorithms or models (such as OCR algorithms) cannot recognize the sensitive information in the processed picture, which protects the users' private information, while the visual quality of the picture itself, and thus the human viewing experience, is not affected.

Description

一种图片中隐私信息保护的处理方法及装置 技术领域
本说明书属于计算机技术领域,尤其涉及一种图片中隐私信息保护的处理方法及装置。
背景技术
随着计算机和互联网技术的发展,数据电子化越来越普遍,很多信息都需要转换成电子图片进行保存,如:身份证扫描件、发票照片、病历扫描件等,方便查看,也使得隐私数据的保护越来越重要,特别是针对敏感的图片数据(如身份证扫描件、发票照片、病历扫描件等)。不法用户通常会使用OCR(Optical Character Recognition,光学字符识别)算法自动化获得图片中敏感的文本信息,得到用户的敏感信息。
发明内容
本说明书实施例的目的在于提供一种图片中隐私信息保护的处理方法及装置,降低了不法用户对敏感信息识别的能力。
一方面,本说明书提供了一种图片中隐私信息保护的处理方法,包括:获取待处理敏感图片;定位出所述待处理敏感图片中的敏感信息的敏感位置;利用对抗样本生成法生成对抗噪声;将生成的所述对抗噪声合成到所述待处理敏感图片的敏感位置处的图片中,并将合成后的图片存储。
另一方面,本说明书提供了一种图片中隐私信息保护的处理装置,包括:图片获取模块,用于获取待处理敏感图片;敏感位置定位模块,用于定位出所述待处理敏感图片中的敏感信息的敏感位置;噪声生成模块,用于利用对抗样本生成法生成对抗噪声;噪声合成模块,用于将生成的所述对抗噪声合成到所述待处理敏感图片的敏感位置处的图片中,并将合成后的图片存储。
还一方面,本说明书提供了一种图片中隐私信息保护的处理设备,包括:至少一个处理器以及用于存储处理器可执行指令的存储器,所述处理器执行所述指令时实现上述图片中隐私信息保护的处理方法。
本说明书提供的图片中隐私信息保护的处理方法、装置、处理设备,通过对抗样 本生成方法生成对抗噪声,定位出待处理敏感图片中敏感信息所在的敏感位置后,在待处理敏感图片的敏感信息所在的敏感位置处加入对抗噪声,获得合成后的图片,使得图片识别算法或模型(如:OCR算法)无法识别出处理后的待处理敏感图片中的敏感信息,但是不影响图片本身的视觉质量,即不影响人的视觉体验。正常业务使用时不需要做其他的处理如解密等,满足了正常业务的使用,同时降低了不法用户识别敏感信息的自动化识别能力,确保了用户的隐私安全。
附图说明
为了更清楚地说明本说明书实施例中的技术方案,下面将对实施例描述中所需要使用的附图作简单地介绍,显而易见地,下面描述中的附图仅仅是本说明书中记载的一些实施例,对于本领域普通技术人员来讲,在不付出创造性劳动性的前提下,还可以根据这些附图获得其他的附图。
图1是本说明书一个实施例中图片中隐私信息保护的处理方法的流程示意图。
图2是本说明书一个实施例中身份证扫描件中敏感位置的示意图。
图3是本说明书又一个实施例中图片中隐私信息保护的处理方法的结构示意图。
图4是本说明书提供的图片中隐私信息保护的处理装置一个实施例的模块结构示意图。
图5是本说明书一个实施例中图片中隐私信息保护的处理服务器的硬件结构框图。
具体实施方式
为了使本技术领域的人员更好地理解本说明书中的技术方案,下面将结合本说明书实施例中的附图,对本说明书实施例中的技术方案进行清楚、完整地描述,显然,所描述的实施例仅仅是本说明书一部分实施例,而不是全部的实施例。基于本说明书中的实施例,本领域普通技术人员在没有作出创造性劳动前提下所获得的所有其他实施例,都应当属于本说明书保护的范围。
电子化办公越来越普及,许多敏感信息也需要在计算机上存储,一些数据会以图片的形式进行保存或进行交互,如:身份证扫描件、发票照片、病例扫描件等,这些图片上会有些个人的敏感信息如:姓名、身份证号码、出生日期、照片、公司名称、公司地址、病情等等。对于图片类的敏感数据的安全保护,是一项重要的工作。一些不法用 户可能会利用计算机技术,获取到带有敏感信息的图片,获取到用户的敏感信息,用于不法用途,给用户带来不必要的损失。例如:对于带有图片类的敏感数据,不法用户通常会使用OCR(Optical Character Recognition,光学字符识别)技术,来获得敏感的文本信息。OCR可以表示一种电子设备(例如扫描仪或数码相机)检查纸上打印的字符,通过检测暗、亮的模式确定其形状,然后用字符识别方法将形状翻译成计算机文字的过程。即针对印刷体字符,采用光学的方式将纸质文档中的文字转换成为黑白点阵的图像文件,并通过识别软件将图像中的文字转换成文本格式,供文字处理软件进一步编辑加工的技术。
本说明书实施例提供一种图片中隐私信息保护的处理方法,可以将带有敏感信息的图片中加入对抗噪声,降低OCR算法的识别能力,使得OCR算法无法准确识别或者错误识别出图片中的敏感信息,达到不法用户无法自动化获利的目的,为数据泄漏排查争取宝贵的时间,提升了不法用户变现的成本。
本说明书中图片中隐私信息保护的处理方法可以应用在客户端或服务器中,客户端可以是智能手机、平板电脑、智能可穿戴设备(智能手表等)、智能车载设备等电子设备。
图1是本说明书一个实施例中图片中隐私信息保护的处理方法的流程示意图,如图1所示,本说明书一个实施例中提供的图片中隐私信息保护的处理方法可以包括步骤102-108。
步骤102、获取待处理敏感图片。
待处理敏感图片可以表示带有敏感信息(敏感信息可以包括个人或企业的隐私信息等)的图片,可以是照片或电子扫描件,如:上述实施例记载的身份证扫描件、发票照片、病例扫描件等。可以从存储敏感图片的数据库中获取,或者在用户上传时直接获取,本说明书实施例不作具体限定。通常情况下,带有敏感信息的待处理敏感图片会进行加密保存在数据库中,用户使用时,需要进行解密。若待处理敏感图片从数据库中获取,则获取到的待处理敏感图片可能会被加密,本说明书一些实施例中,若获取到的待处理敏感图片是被加密的图片,则先对加密图片进行解密,获得原始的敏感图片,将解密后的原始图片作为本书明书实施例中的待处理敏感图片。
步骤104、定位出所述待处理敏感图片中的敏感信息的敏感位置。
敏感位置可以表示图片中敏感信息所在的位置,通常情况下,图片中带有敏感信 息的位置比较固定。图2是本说明书一个实施例中身份证扫描件中敏感位置的示意图,如图2所示,身份证扫描件中的敏感信息一般是人物照片、身份证号码、家庭住址,人物照片一般位于扫描件的右侧偏上,身份证号码位于扫描件的底部,家庭住址则位于扫描件的中间偏左。病例扫描件以及发票照片等敏感图片中敏感信息的位置一般也比较固定,可以根据待处理敏感图片的类型,定位出待处理敏感图片中的敏感信息的敏感位置。如图2所示,若待处理敏感图片是身份证扫描件,则可以将图片的右侧偏上、底部偏右、中间偏左区域作为敏感位置。根据需要,身份证扫描件上的姓名和出生日期也可以作为敏感信息,定位出姓名和出生日期的位置。其中敏感位置的区域的面积大小,可以根据实物中敏感信息的大小进行设置,并可以根据图片尺寸大小进行调整,若图片尺寸比较大,则可以在图片中指定方位比较大的区域作为敏感位置,若图片尺寸比较小,则可以适当的缩小敏感位置的区域,具体可以根据实际需要进行设置。
本说明书一些实施例中,所述定位出所述待处理敏感图片中的敏感信息的敏感位置,可以包括:预先利用历史敏感图片和所述历史敏感图片中标记的敏感位置,训练构建敏感位置定位模型;利用所述敏感位置定位模型定位出所述待处理敏感图片中的敏感信息的敏感位置。
在具体的实施过程中,可以先在多张历史敏感图片中标记出敏感信息所在的敏感位置,利用标记后的历史敏感图片进行模型训练,将标记好的历史敏感图片作为输入,标记的敏感位置作为训练标签,构建出敏感位置定位模型。获取到待处理敏感图片后,可以将待处理敏感图片输入到构建好的敏感位置定位模型,利用敏感位置定位模型自动定位出待处理敏感图片中的敏感信息的敏感位置。其中,敏感位置定位模型可以使用Faster RCNN模型,Faster RCNN模型可以理解为一种基于深度学习的目标检测模型。
本说明书实施例,预先通过在历史敏感图片标记敏感位置,进行模型训练,通过模型自动的识别出待处理敏感图片中的敏感信息的敏感位置,实现了敏感位置的自动化快速识别,提高了图片的处理效率。
步骤106、利用对抗样本生成法生成对抗噪声。
对抗样本可以理解为在数据集中通过故意添加细微的干扰所形成的输入样本,导致模型以高置信度给出一个错误的输出。利用对抗样本生成法可以针对模型生成具有针对性的对抗噪声,如:生成针对OCR算法的对抗噪声。对抗噪声也可以理解为一种对抗样本,可以用来干扰OCR算法等图片敏感信息识别算法的识别结果。其中,对抗样本生成方法可以根据实际需要进行选择,如:FGSM(Fast Gradient Sign Method)法, FGSM可以理解为一种通过在梯度方向上进行添加增量来诱导网络对生成的图片进行误分类的方法。
本说明书一些实施例中,可以采用黑盒攻击的对抗样本生成法,生成所述对抗噪声。黑盒攻击的对抗样本生成法可以理解为不知道具体使用的模型(如:OCR算法),生成针对各种算法都有效的对抗样本的方法。本说明书实施例采用黑盒攻击的对抗样本生成法,可以生成适用于各种算法的对抗样本即对抗噪声,为后续待处理敏感图片的处理提供了数据基础。
本说明书一些实施例中,黑盒攻击的对抗样本生成法可以采用:边界攻击法(即boundary attack可以理解为一种先加大规模扰动,再进行边界探索的方法),或一次像素攻击法(即one pixel attack可以理解为一种通过改变一个图像像素值实现攻击的方法)。采用边界攻击法或一次像素攻击法可以生成针对不同算法模型的对抗噪声,降低模型的识别能力。
其中,边界攻击法的原理可以理解为从一个已经是对抗性的点初始化,然后沿着边界执行随机漫步在对抗性区域(使得模型错误分类)和非对抗性区域(使得模型正确分类)之间。其具体的算法过程可以参考如下:
数据：原始图像 o，对抗性判据 c(·)，模型 d(·)
结果：对抗的例子 õ，使距离 d(o, õ) = ‖o − õ‖ 最小化
初始化：k = 0，õ^0 ~ U(0, 1)，且 õ^0 是对抗性的
while k < 最大步长 do
    从建议分布中随机采样扰动 η_k ~ P(õ^(k−1))
    if õ^(k−1) + η_k 是对抗的 then
        õ^k = õ^(k−1) + η_k
    else
        õ^k = õ^(k−1)
    end
    k = k + 1
end
一次像素攻击法原理可以理解为:假设一个输入图像可以用一个向量表示,其中每个标量元素表示一个像素。设f为接收n维输入的目标图像分类器,x=(x1,…,xn)为正确分类为t类的原始自然图像,因此x属于t类的概率为ft(x)。向量e(x)=(e1,…, en)是根据目标类adv的x和最大修正L的限制,对目标类adv进行的加性对抗性扰动。注意,L总是由向量e(x)的长度来度量的。在目标攻击情况下,对手的目标是找到针对以下问题的优化解e(x):
max_{e(x)*} f_adv(x + e(x))，s.t. ‖e(x)‖_0 ≤ L
步骤108、将生成的所述对抗噪声合成到所述待处理敏感图片的敏感位置处的图片中,并将合成后的图片存储。
获得对抗噪声后,可以将对抗噪声与待处理敏感图片一起合成,获得新的图片。具体可以将对抗噪声合成到定位出的待处理敏感图片中的敏感位置处的图片信息中,以干扰模型或算法对带有敏感信息的待处理敏感图片的敏感信息的识别结果。如:可以根据对抗噪声改变敏感位置处图片的像素值,通常情况下对抗噪声是非常细微的,一般只改变少部分像素的值。将合成后的图片进行存储,以供用户使用,具体可以存储到指定的数据库或其他数据存储设备中。
本说明书实施例提供的图片中隐私信息保护的处理方法,通过对抗样本生成方法生成对抗噪声,在待处理敏感图片的敏感信息所在的敏感位置处加入对抗噪声,使得图片识别算法或模型如:OCR算法无法识别出处理后的待处理敏感图片中的敏感信息,但是不影响图片本身的视觉质量,即不影响人的视觉体验。正常业务使用时不需要做其他的处理如解密等,满足了正常业务的使用,同时降低了不法用户识别敏感信息的自动化识别能力,提高了图片中用户隐私信息的安全性。
在上述实施例的基础上,本说明书一些实施例中,所述方法还可以包括:获取合成后的图片的清晰度;若所述清晰度小于预设阈值,则调整所述对抗噪声,将调整后的对抗噪声合成到所述待处理敏感图片的敏感位置中,获得处理后敏感图片,直至所述处理后敏感图片的清晰度大于或等于预设阈值;将获得的清晰度大于或等于预设阈值的敏感图片进行存储。
在具体的实施过程中,可以获取加入噪声合成后的图片的清晰度,清晰度可以通过图片的分辨率、码率、像素等来确定。若合成后的图片的清晰度小于预设阈值,则可以认为合成后的图片可能会影响视觉质量,可以调整对抗噪声,如:可以更换对抗样本生成法生成新的对抗噪声,或者对对抗噪声的参数进行微调,再将重新获得的对抗噪声合成到待处理敏感图片的敏感位置中,获得处理后敏感图片。再获取调整后的处理后敏 感图片的清晰度,判断清晰度是否大于预设阈值,若不大于,则继续调整对抗噪声,合成新的敏感图片,直至处理后的敏感图片的清晰度大于等于预设阈值。将最终获得的清晰度大于或等于预设阈值的敏感图片进行存储,其中预设阈值的大小可以根据实际需要进行设置,本说明书实施例不作具体限定。
图片的清洗度也可以人工核验,如:由工作人员观察合成对抗噪声后的敏感图片是否清晰,是否影响视觉质量,即合成后的敏感图片中的信息是否清楚,若不影响视觉质量,则存储,若影响视觉质量,则调整对抗噪声,重新合成新的敏感图片。
本说明书实施例,根据合成对抗噪声后的图片的清晰度是否满足要求,在不满足要求的情况下,通过调整对抗噪声,合成新的图片,确保加入对抗噪声后的敏感图片不影响视觉质量,不影响正常业务使用敏感图片,同时保护了图片中用户的隐私安全。
图3是本说明书又一个实施例中图片中隐私信息保护的处理方法的结构示意图,下面结合图3具体介绍本说明书实施例中图片中隐私信息保护的处理方法的过程。
1)将原始加密的敏感数据进行解密,获得原始图片即上述实施例中的待处理敏感图片。
2)在原始图片中定位敏感信息的敏感位置,即标注需要加噪声的图片位置。
3)利用对抗样本生成器针对性的生成对抗噪声,该对抗噪声目的是对敏感信息进行加噪。本说明书实施例采用黑盒攻击的对抗样本生成方式,生成对抗噪声,具体的生成算法有boundary attack和one pixel attack等。
4)将原始图片与对抗噪声一起合成新的图片。需要注意的是,加噪声的目的是为了使得OCR算法不可识别,但是人眼无法区分,所以这一步骤不能对图片本身的视觉质量带来太大的影响。
5)将加噪后的图片存储在新的数据库中,方便业务系统调用。这样,即使不法用户盗取了这些加噪的敏感图片信息,仍然无法自动化准确识别其中的敏感信息,大大提升了其获利的成本。
此外,也可以直接在原始图片即待处理敏感图片中加高斯噪声或椒盐噪声的方法实现影响OCR等图片识别算法的识别能力,或者通过对敏感信息的部分图像进行形变进行对抗,但是这些方法可能会影响正常用户的视觉识别,影响正常业务的使用。
本说明书实施例通过采用加对抗样本噪声的方法来进行敏感图片信息的保护,避 免了加密方法导致正常业务不可读并且有大量解密开销的问题,同时相比与行为和权限异常检测的方法,可以防范于未然应对未知风险。本说明书实施例提供的图片中隐私信息保护的处理方法,可以部署在数据安全敏感数据保护系统以及用户行为分析系统中,降低OCR算法的识别能力,进一步降低不法用户获利的自动化识别能力,同时,满足正常业务的使用(不损失人类的视觉体验)。
本说明书中上述方法的各个实施例均采用递进的方式描述,各个实施例之间相同相似的部分互相参考即可,每个实施例重点说明的都是与其他实施例的不同之处。相关之处参考方法实施例的部分说明即可。
基于上述所述的图片中隐私信息保护的处理方法,本说明书一个或多个实施例还提供一种图片中隐私信息保护的处理装置。所述的装置可以包括使用了本说明书实施例所述方法的系统(包括分布式系统)、软件(应用)、模块、组件、服务器、客户端等并结合必要的实施硬件的装置。基于同一创新构思,本说明书实施例提供的一个或多个实施例中的装置如下面的实施例所述。由于装置解决问题的实现方案与方法相似,因此本说明书实施例具体的装置的实施可以参考前述方法的实施,重复之处不再赘述。以下所使用的,术语“单元”或者“模块”可以实现预定功能的软件和/或硬件的组合。尽管以下实施例所描述的装置较佳地以软件来实现,但是硬件,或者软件和硬件的组合的实现也是可能并被构想的。
具体地,图4是本说明书提供的图片中隐私信息保护的处理装置一个实施例的模块结构示意图,如图4所示,本说明书中提供的图片中隐私信息保护的处理装置可以包括:图片获取模块41、敏感位置定位模块42、噪声生成模块43、噪声合成模块44。
图片获取模块41,可以用于获取待处理敏感图片。
敏感位置定位模块42,可以用于定位出所述待处理敏感图片中的敏感信息的敏感位置。
噪声生成模块43,可以用于利用对抗样本生成法生成对抗噪声。
噪声合成模块44,可以用于将生成的所述对抗噪声合成到所述待处理敏感图片的敏感位置处的图片中,并将合成后的图片存储。
本说明书实施例提供的图片中隐私信息保护的处理装置,通过对抗样本生成方法生成对抗噪声,在待处理敏感图片的敏感信息所在的敏感位置处加入对抗噪声,使得图片识别算法或模型如:OCR算法无法识别出处理后的待处理敏感图片中的敏感信息,但 是不影响图片本身的视觉质量,即不影响人的视觉体验。正常业务使用时不需要做其他的处理如解密等,满足了正常业务的使用,同时降低了不法用户识别敏感信息的自动化识别能力。
在上述实施例的基础上,本说明书一些实施例中,所述敏感位置定位模块中包括:模型构建单元,用于预先利用历史敏感图片和所述历史敏感图片中标记的敏感位置,训练构建敏感位置定位模型;位置定位单元,用于利用所述敏感位置定位模型定位出所述待处理敏感图片中的敏感信息的敏感位置。
本说明书实施例,预先通过在历史敏感图片标记敏感位置,进行模型训练,通过模型自动的识别出待处理敏感图片中的敏感信息的敏感位置,实现了敏感位置的自动化快速识别,提高了图片的处理效率。
在上述实施例的基础上,本说明书一些实施例中,所述噪声生成模块具体用于:
采用黑盒攻击的对抗样本生成法,生成所述对抗噪声。
本说明书实施例,采用黑盒攻击的对抗样本生成法,可以生成适用于各种算法的对抗样本即对抗噪声,为后续待处理敏感图片的处理提供了数据基础。
在上述实施例的基础上,本说明书一些实施例中,所述噪声生成模块中的黑盒攻击的对抗样本生成法包括:边界攻击法或一次像素攻击法。
本说明书实施例,采用边界攻击法或一次像素攻击法可以生成针对不同算法模型的对抗噪声,降低模型的识别能力。
在上述实施例的基础上,本说明书一些实施例中,所述装置还包括图像调整模块,用于:获取合成后的图片的清晰度;若所述清晰度小于预设阈值,则调整所述对抗噪声,将调整后的对抗噪声合成到所述待处理敏感图片的敏感位置中,获得处理后敏感图片,直至所述处理后敏感图片的清晰度大于或等于所述预设阈值;将获得的清晰度大于或等于所述预设阈值的敏感图片进行存储。
本说明书实施例,根据合成对抗噪声后的图片的清晰度是否满足要求,在不满足要求的情况下,通过调整对抗噪声,合成新的图片,确保加入对抗噪声后的敏感图片不影响视觉质量,不影响正常业务使用敏感图片。
需要说明的,上述所述的装置根据方法实施例的描述还可以包括其他的实施方式。具体的实现方式可以参照上述对应的方法实施例的描述,在此不作一一赘述。
本说明书实施例还提供一种图片中隐私信息保护的处理设备,包括:至少一个处理器以及用于存储处理器可执行指令的存储器,所述处理器执行所述指令时实现上述实施例中图片中隐私信息保护的处理方法,如:获取待处理敏感图片;定位出所述待处理敏感图片中的敏感信息的敏感位置;利用对抗样本生成法生成对抗噪声;将生成的所述对抗噪声合成到所述待处理敏感图片的敏感位置处的图片中,并将合成后的图片存储。
需要说明的,上述所述的处理设备,根据方法实施例的描述还可以包括其他的实施方式。具体的实现方式可以参照上述对应的方法实施例的描述,在此不作一一赘述。
本说明书提供的图片中隐私信息保护的处理装置或处理设备,也可以应用在多种数据分析处理系统中。所述装置或处理设备可以包括上述实施例中任意一个图片中隐私信息保护的处理装置。所述装置或处理设备可以为单独的服务器,也可以包括使用了本说明书的一个或多个所述方法或一个或多个实施例装置的服务器集群、系统(包括分布式系统)、软件(应用)、实际操作装置、逻辑门电路装置、量子计算机等并结合必要的实施硬件的终端装置。所述核对差异数据的检测系统可以包括至少一个处理器以及存储计算机可执行指令的存储器,所述处理器执行所述指令时实现上述任意一个或者多个实施例中所述方法的步骤。
本说明书实施例所提供的方法实施例可以在移动终端、计算机终端、服务器或者类似的运算装置中执行。以运行在服务器上为例,图5是本说明书一个实施例中图片中隐私信息保护的处理服务器的硬件结构框图,该服务器可以是上述实施例中的图片中隐私信息保护的处理装置、图片中隐私信息保护的处理设备。如图5所示,服务器10可以包括一个或多个(图中仅示出一个)处理器100(处理器100可以包括但不限于微处理器MCU或可编程逻辑器件FPGA等的处理装置)、用于存储数据的存储器200、以及用于通信功能的传输模块300。本邻域普通技术人员可以理解,图5所示的结构仅为示意,其并不对上述电子装置的结构造成限定。例如,服务器10还可包括比图5中所示更多或者更少的组件,例如还可以包括其他的处理硬件,如数据库或多级缓存、GPU,或者具有与图5所示不同的配置。
存储器200可用于存储应用软件的软件程序以及模块,如本说明书实施例中的图片中隐私信息保护的处理方法对应的程序指令/模块,处理器100通过运行存储在存储器200内的软件程序以及模块,从而执行各种功能应用以及资源数据更新。存储器200可包括高速随机存储器,还可包括非易失性存储器,如一个或者多个磁性存储装置、闪存、或者其他非易失性固态存储器。在一些实例中,存储器200可进一步包括相对于处理器 100远程设置的存储器,这些远程存储器可以通过网络连接至计算机终端。上述网络的实例包括但不限于互联网、企业内部网、局域网、移动通信网及其组合。
传输模块300用于经由一个网络接收或者发送数据。上述的网络具体实例可包括计算机终端的通信供应商提供的无线网络。在一个实例中,传输模块300包括一个网络适配器(Network Interface Controller,NIC),其可通过基站与其他网络设备相连从而可与互联网进行通讯。在一个实例中,传输模块300可以为射频(Radio Frequency,RF)模块,其用于通过无线方式与互联网进行通讯。
上述对本说明书特定实施例进行了描述。其它实施例在所附权利要求书的范围内。在一些情况下,在权利要求书中记载的动作或步骤可以按照不同于实施例中的顺序来执行并且仍然可以实现期望的结果。另外,在附图中描绘的过程不一定要求示出的特定顺序或者连续顺序才能实现期望的结果。在某些实施方式中,多任务处理和并行处理也是可以的或者可能是有利的。
本说明书提供的上述实施例所述的方法或装置可以通过计算机程序实现业务逻辑并记录在存储介质上,所述的存储介质可以计算机读取并执行,实现本说明书实施例所描述方案的效果。
所述存储介质可以包括用于存储信息的物理装置,通常是将信息数字化后再以利用电、磁或者光学等方式的媒体加以存储。所述存储介质有可以包括:利用电能方式存储信息的装置如,各式存储器,如RAM、ROM等;利用磁能方式存储信息的装置如,硬盘、软盘、磁带、磁芯存储器、磁泡存储器、U盘;利用光学方式存储信息的装置如,CD或DVD。当然,还有其他方式的可读存储介质,例如量子存储器、石墨烯存储器等等。
本说明书实施例提供的上述图片中隐私信息保护的处理方法或装置可以在计算机中由处理器执行相应的程序指令来实现,如使用windows操作系统的c++语言在PC端实现、linux系统实现,或其他例如使用android、iOS系统程序设计语言在智能终端实现,以及基于量子计算机的处理逻辑实现等。
需要说明的是说明书上述所述的装置、计算机存储介质、系统根据相关方法实施例的描述还可以包括其他的实施方式,具体的实现方式可以参照对应方法实施例的描述,在此不作一一赘述。
本说明书中的各个实施例均采用递进的方式描述,各个实施例之间相同相似的部 分互相参考即可,每个实施例重点说明的都是与其他实施例的不同之处。尤其,对于硬件+程序类实施例而言,由于其基本相似于方法实施例,所以描述的比较简单,相关之处参考方法实施例的部分说明即可。
本说明书实施例并不局限于必须是符合行业通信标准、标准计算机资源数据更新和数据存储规则或本说明书一个或多个实施例所描述的情况。某些行业标准或者使用自定义方式或实施例描述的实施基础上略加修改后的实施方案也可以实现上述实施例相同、等同或相近、或变形后可预料的实施效果。应用这些修改或变形后的数据获取、存储、判断、处理方式等获取的实施例,仍然可以属于本说明书实施例的可选实施方案范围之内。
在20世纪90年代,对于一个技术的改进可以很明显地区分是硬件上的改进(例如,对二极管、晶体管、开关等电路结构的改进)还是软件上的改进(对于方法流程的改进)。然而,随着技术的发展,当今的很多方法流程的改进已经可以视为硬件电路结构的直接改进。设计人员几乎都通过将改进的方法流程编程到硬件电路中来得到相应的硬件电路结构。因此,不能说一个方法流程的改进就不能用硬件实体模块来实现。例如,可编程逻辑器件(Programmable Logic Device,PLD)(例如现场可编程门阵列(Field Programmable Gate Array,FPGA))就是这样一种集成电路,其逻辑功能由用户对器件编程来确定。由设计人员自行编程来把一个数字系统“集成”在一片PLD上,而不需要请芯片制造厂商来设计和制作专用的集成电路芯片。而且,如今,取代手工地制作集成电路芯片,这种编程也多半改用“逻辑编译器(logic compiler)”软件来实现,它与程序开发撰写时所用的软件编译器相类似,而要编译之前的原始代码也得用特定的编程语言来撰写,此称之为硬件描述语言(Hardware Description Language,HDL),而HDL也并非仅有一种,而是有许多种,如ABEL(Advanced Boolean Expression Language)、AHDL(Altera Hardware Description Language)、Confluence、CUPL(Cornell University Programming Language)、HDCal、JHDL(Java Hardware Description Language)、Lava、Lola、MyHDL、PALASM、RHDL(Ruby Hardware Description Language)等,目前最普遍使用的是VHDL(Very-High-Speed Integrated Circuit Hardware Description Language)与Verilog。本领域技术人员也应该清楚,只需要将方法流程用上述几种硬件描述语言稍作逻辑编程并编程到集成电路中,就可以很容易得到实现该逻辑方法流程的硬件电路。
控制器可以按任何适当的方式实现,例如,控制器可以采取例如微处理器或处理器以及存储可由该(微)处理器执行的计算机可读程序代码(例如软件或固件)的计算 机可读介质、逻辑门、开关、专用集成电路(Application Specific Integrated Circuit,ASIC)、可编程逻辑控制器和嵌入微控制器的形式,控制器的例子包括但不限于以下微控制器:ARC 625D、Atmel AT91SAM、Microchip PIC18F26K20以及Silicone Labs C8051F320,存储器控制器还可以被实现为存储器的控制逻辑的一部分。本领域技术人员也知道,除了以纯计算机可读程序代码方式实现控制器以外,完全可以通过将方法步骤进行逻辑编程来使得控制器以逻辑门、开关、专用集成电路、可编程逻辑控制器和嵌入微控制器等的形式来实现相同功能。因此这种控制器可以被认为是一种硬件部件,而对其内包括的用于实现各种功能的装置也可以视为硬件部件内的结构。或者甚至,可以将用于实现各种功能的装置视为既可以是实现方法的软件模块又可以是硬件部件内的结构。
上述实施例阐明的系统、装置、模块或单元,具体可以由计算机芯片或实体实现,或者由具有某种功能的产品来实现。一种典型的实现设备为计算机。具体的,计算机例如可以为个人计算机、膝上型计算机、车载人机交互设备、蜂窝电话、相机电话、智能电话、个人数字助理、媒体播放器、导航设备、电子邮件设备、游戏控制台、平板计算机、可穿戴设备或者这些设备中的任何设备的组合。
虽然本说明书一个或多个实施例提供了如实施例或流程图所述的方法操作步骤,但基于常规或者无创造性的手段可以包括更多或者更少的操作步骤。实施例中列举的步骤顺序仅仅为众多步骤执行顺序中的一种方式,不代表唯一的执行顺序。在实际中的装置或终端产品执行时,可以按照实施例或者附图所示的方法顺序执行或者并行执行(例如并行处理器或者多线程处理的环境,甚至为分布式资源数据更新环境)。术语“包括”、“包含”或者其任何其他变体意在涵盖非排他性的包含,从而使得包括一系列要素的过程、方法、产品或者设备不仅包括那些要素,而且还包括没有明确列出的其他要素,或者是还包括为这种过程、方法、产品或者设备所固有的要素。在没有更多限制的情况下,并不排除在包括所述要素的过程、方法、产品或者设备中还存在另外的相同或等同要素。第一,第二等词语用来表示名称,而并不表示任何特定的顺序。
为了描述的方便,描述以上装置时以功能分为各种模块分别描述。当然,在实施本说明书一个或多个时可以把各模块的功能在同一个或多个软件和/或硬件中实现,也可以将实现同一功能的模块由多个子模块或子单元的组合实现等。以上所描述的装置实施例仅仅是示意性的,例如,所述单元的划分,仅仅为一种逻辑功能划分,实际实现时可以有另外的划分方式,例如多个单元或组件可以结合或者可以集成到另一个系统,或一些特征可以忽略,或不执行。另一点,所显示或讨论的相互之间的耦合或直接耦合或通 信连接可以是通过一些接口,装置或单元的间接耦合或通信连接,可以是电性,机械或其它的形式。
本发明是参照根据本发明实施例的方法、装置(系统)、和计算机程序产品的流程图和/或方框图来描述的。应理解可由计算机程序指令实现流程图和/或方框图中的每一流程和/或方框、以及流程图和/或方框图中的流程和/或方框的结合。可提供这些计算机程序指令到通用计算机、专用计算机、嵌入式处理机或其他可编程资源数据更新设备的处理器以产生一个机器,使得通过计算机或其他可编程资源数据更新设备的处理器执行的指令产生用于实现在流程图一个流程或多个流程和/或方框图一个方框或多个方框中指定的功能的装置。
这些计算机程序指令也可存储在能引导计算机或其他可编程资源数据更新设备以特定方式工作的计算机可读存储器中,使得存储在该计算机可读存储器中的指令产生包括指令装置的制造品,该指令装置实现在流程图一个流程或多个流程和/或方框图一个方框或多个方框中指定的功能。
这些计算机程序指令也可装载到计算机或其他可编程资源数据更新设备上,使得在计算机或其他可编程设备上执行一系列操作步骤以产生计算机实现的处理,从而在计算机或其他可编程设备上执行的指令提供用于实现在流程图一个流程或多个流程和/或方框图一个方框或多个方框中指定的功能的步骤。
在一个典型的配置中,计算设备包括一个或多个处理器(CPU)、输入/输出接口、网络接口和内存。
内存可能包括计算机可读介质中的非永久性存储器,随机存取存储器(RAM)和/或非易失性内存等形式,如只读存储器(ROM)或闪存(flash RAM)。内存是计算机可读介质的示例。
计算机可读介质包括永久性和非永久性、可移动和非可移动媒体可以由任何方法或技术来实现信息存储。信息可以是计算机可读指令、数据结构、程序的模块或其他数据。计算机的存储介质的例子包括,但不限于相变内存(PRAM)、静态随机存取存储器(SRAM)、动态随机存取存储器(DRAM)、其他类型的随机存取存储器(RAM)、只读存储器(ROM)、电可擦除可编程只读存储器(EEPROM)、快闪记忆体或其他内存技术、只读光盘只读存储器(CD-ROM)、数字多功能光盘(DVD)或其他光学存储、磁盒式磁带,磁带磁磁盘存储、石墨烯存储或其他磁性存储设备或任何其他非传输介质, 可用于存储可以被计算设备访问的信息。按照本文中的界定,计算机可读介质不包括暂存电脑可读媒体(transitory media),如调制的数据信号和载波。
本领域技术人员应明白,本说明书一个或多个实施例可提供为方法、系统或计算机程序产品。因此,本说明书一个或多个实施例可采用完全硬件实施例、完全软件实施例或结合软件和硬件方面的实施例的形式。而且,本说明书一个或多个实施例可采用在一个或多个其中包含有计算机可用程序代码的计算机可用存储介质(包括但不限于磁盘存储器、CD-ROM、光学存储器等)上实施的计算机程序产品的形式。
本说明书一个或多个实施例可以在由计算机执行的计算机可执行指令的一般上下文中描述,例如程序模块。一般地,程序模块包括执行特定任务或实现特定抽象数据类型的例程、程序、对象、组件、数据结构等等。也可以在分布式计算环境中实践本本说明书一个或多个实施例,在这些分布式计算环境中,由通过通信网络而被连接的远程处理设备来执行任务。在分布式计算环境中,程序模块可以位于包括存储设备在内的本地和远程计算机存储介质中。
本说明书中的各个实施例均采用递进的方式描述,各个实施例之间相同相似的部分互相参考即可,每个实施例重点说明的都是与其他实施例的不同之处。尤其,对于系统实施例而言,由于其基本相似于方法实施例,所以描述的比较简单,相关之处参考方法实施例的部分说明即可。在本说明书的描述中,参考术语“一个实施例”、“一些实施例”、“示例”、“具体示例”、或“一些示例”等的描述意指结合该实施例或示例描述的具体特征、结构、材料或者特点包含于本说明书的至少一个实施例或示例中。在本说明书中,对上述术语的示意性表述不必须针对的是相同的实施例或示例。而且,描述的具体特征、结构、材料或者特点可以在任一个或多个实施例或示例中以合适的方式结合。此外,在不相互矛盾的情况下,本领域的技术人员可以将本说明书中描述的不同实施例或示例以及不同实施例或示例的特征进行结合和组合。
以上所述仅为本说明书一个或多个实施例的实施例而已,并不用于限制本说明书一个或多个实施例。对于本领域技术人员来说,本说明书一个或多个实施例可以有各种更改和变化。凡在本说明书的精神和原理之内所作的任何修改、等同替换、改进等,均应包含在权利要求范围之内。

Claims (11)

  1. A processing method for protecting private information in a picture, comprising:
    obtaining a to-be-processed sensitive picture;
    locating a sensitive position of sensitive information in the to-be-processed sensitive picture;
    generating adversarial noise by using an adversarial example generation method;
    synthesizing the generated adversarial noise into the picture at the sensitive position of the to-be-processed sensitive picture, and storing the synthesized picture.
  2. The method according to claim 1, wherein the locating a sensitive position of sensitive information in the to-be-processed sensitive picture comprises:
    training and constructing a sensitive position localization model in advance by using historical sensitive pictures and sensitive positions marked in the historical sensitive pictures;
    locating the sensitive position of the sensitive information in the to-be-processed sensitive picture by using the sensitive position localization model.
  3. The method according to claim 1, wherein the generating adversarial noise by using an adversarial example generation method comprises:
    generating the adversarial noise by using a black-box-attack adversarial example generation method.
  4. The method according to claim 3, wherein the black-box-attack adversarial example generation method comprises: a boundary attack method or a one-pixel attack method.
  5. The method according to claim 1, further comprising:
    obtaining a sharpness of the synthesized picture;
    if the sharpness is less than a preset threshold, adjusting the adversarial noise and synthesizing the adjusted adversarial noise into the sensitive position of the to-be-processed sensitive picture to obtain a processed sensitive picture, until the sharpness of the processed sensitive picture is greater than or equal to the preset threshold;
    storing the obtained sensitive picture whose sharpness is greater than or equal to the preset threshold.
  6. A processing device for protecting private information in a picture, comprising:
    a picture obtaining module, configured to obtain a to-be-processed sensitive picture;
    a sensitive position localization module, configured to locate a sensitive position of sensitive information in the to-be-processed sensitive picture;
    a noise generation module, configured to generate adversarial noise by using an adversarial example generation method;
    a noise synthesis module, configured to synthesize the generated adversarial noise into the picture at the sensitive position of the to-be-processed sensitive picture, and store the synthesized picture.
  7. The device according to claim 6, wherein the sensitive position localization module comprises:
    a model construction unit, configured to train and construct a sensitive position localization model in advance by using historical sensitive pictures and sensitive positions marked in the historical sensitive pictures;
    a position localization unit, configured to locate the sensitive position of the sensitive information in the to-be-processed sensitive picture by using the sensitive position localization model.
  8. The device according to claim 6, wherein the noise generation module is specifically configured to:
    generate the adversarial noise by using a black-box-attack adversarial example generation method.
  9. The device according to claim 8, wherein the black-box-attack adversarial example generation method in the noise generation module comprises: a boundary attack method or a one-pixel attack method.
  10. The device according to claim 6, further comprising an image adjustment module configured to:
    obtain a sharpness of the synthesized picture;
    if the sharpness is less than a preset threshold, adjust the adversarial noise and synthesize the adjusted adversarial noise into the sensitive position of the to-be-processed sensitive picture to obtain a processed sensitive picture, until the sharpness of the processed sensitive picture is greater than or equal to the preset threshold;
    store the obtained sensitive picture whose sharpness is greater than or equal to the preset threshold.
  11. A processing apparatus for protecting private information in a picture, comprising: at least one processor and a memory for storing processor-executable instructions, wherein when the processor executes the instructions, the method according to any one of claims 1 to 5 is implemented.
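The following Python sketch is included only to make the claimed processing flow concrete; it is a minimal illustration of claims 1 and 5, not the patented implementation. Everything in it is an assumption introduced for illustration: OpenCV/NumPy as the image toolkit, a fixed bounding box standing in for the output of the sensitive-position localization model of claim 2, uniform random noise standing in for the output of a genuine black-box adversarial example generator (such as the boundary attack or one-pixel attack of claims 3 and 4, which would query the recognition model being defended against), and the variance of the Laplacian as one possible sharpness measure, which the claims leave unspecified.

# Hypothetical sketch of the claimed flow: take a sensitive region, synthesize
# adversarial noise into it, and keep adjusting the noise until the result
# stays above a preset sharpness threshold (claims 1 and 5).
import cv2
import numpy as np


def sharpness(img: np.ndarray) -> float:
    """Variance of the Laplacian, used here as a simple sharpness proxy (assumption)."""
    gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
    return cv2.Laplacian(gray, cv2.CV_64F).var()


def generate_adversarial_noise(shape: tuple, epsilon: float) -> np.ndarray:
    """Stand-in for a black-box adversarial example generator.

    A real boundary attack or one-pixel attack would iteratively query the
    attacked recognition model; this placeholder only returns bounded random
    noise to illustrate the data flow.
    """
    return np.random.uniform(-epsilon, epsilon, size=shape).astype(np.float32)


def protect_sensitive_region(img: np.ndarray, box: tuple,
                             epsilon: float = 16.0,
                             min_sharpness: float = 100.0,
                             max_iters: int = 10) -> np.ndarray:
    """Synthesize adversarial noise into the sensitive region of `img`.

    `box` = (x, y, w, h) is assumed to come from a sensitive-position
    localization model; `min_sharpness` plays the role of the preset
    sharpness threshold of claim 5.
    """
    x, y, w, h = box
    out = img.copy()
    for _ in range(max_iters):
        noise = generate_adversarial_noise((h, w, 3), epsilon)
        candidate = img.astype(np.float32)
        candidate[y:y + h, x:x + w] += noise          # synthesize noise at the sensitive position
        out = np.clip(candidate, 0, 255).astype(np.uint8)
        if sharpness(out) >= min_sharpness:           # picture is still readable: keep this version
            break
        epsilon *= 0.5                                # adjust (weaken) the noise and try again
    return out


if __name__ == "__main__":
    picture = cv2.imread("sensitive.jpg")             # hypothetical to-be-processed sensitive picture
    if picture is None:
        raise FileNotFoundError("sensitive.jpg is only a placeholder path")
    protected = protect_sensitive_region(picture, box=(50, 80, 200, 60))
    cv2.imwrite("protected.jpg", protected)           # store the synthesized picture

In practice, the noise adjustment of claim 5 would be driven by the attack itself, for example by shrinking the perturbation radius of a boundary attack or by limiting the number of perturbed pixels in a one-pixel attack, rather than by simply halving the amplitude of random noise as in this sketch.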
PCT/CN2020/125306 2019-12-27 2020-10-30 Processing method and device for protecting private information in pictures WO2021129146A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201911374421.X 2019-12-27
CN201911374421.XA CN111177757A (zh) 2019-12-27 2019-12-27 Processing method and device for protecting private information in pictures

Publications (1)

Publication Number Publication Date
WO2021129146A1 true WO2021129146A1 (zh) 2021-07-01

Family

ID=70655820

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/125306 WO2021129146A1 (zh) 2019-12-27 2020-10-30 Processing method and device for protecting private information in pictures

Country Status (3)

Country Link
CN (1) CN111177757A (zh)
TW (1) TW202125298A (zh)
WO (1) WO2021129146A1 (zh)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113536374A (zh) * 2021-07-15 2021-10-22 荣耀终端有限公司 Image privacy protection method and electronic device
CN113628150A (zh) * 2021-07-05 2021-11-09 深圳大学 Attack image generation method, electronic device, and readable storage medium
CN114419719A (zh) * 2022-03-29 2022-04-29 北京爱笔科技有限公司 Biometric feature processing method and device

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111177757A (zh) 2019-12-27 2020-05-19 支付宝(杭州)信息技术有限公司 Processing method and device for protecting private information in pictures
CN111753275B (zh) * 2020-06-04 2024-03-26 支付宝(杭州)信息技术有限公司 Image-based user privacy protection method, apparatus, device, and storage medium
CN112347512A (zh) * 2020-11-13 2021-02-09 支付宝(杭州)信息技术有限公司 Image processing method, apparatus, device, and storage medium
CN113450271B (zh) * 2021-06-10 2024-02-27 南京信息工程大学 Robust adaptive adversarial example generation method based on a human visual model
CN113609507A (zh) * 2021-08-19 2021-11-05 上海明略人工智能(集团)有限公司 Data ethics method, system, electronic device, and medium
CN115223010A (zh) * 2022-07-08 2022-10-21 广东省智能网联汽车创新中心有限公司 Adversarial example generation method and system for intelligent-driving object detection scenarios
CN115223011A (zh) * 2022-07-08 2022-10-21 广东省智能网联汽车创新中心有限公司 Adversarial example generation method and system for intelligent driving scenarios

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106600525A (zh) * 2016-12-09 2017-04-26 宇龙计算机通信科技(深圳)有限公司 Picture blurring processing method and system
CN107368752A (zh) * 2017-07-25 2017-11-21 北京工商大学 Deep differential privacy protection method based on generative adversarial networks
CN108366196A (zh) * 2018-01-25 2018-08-03 西安中科创达软件有限公司 Method for protecting picture privacy
CN111177757A (zh) * 2019-12-27 2020-05-19 支付宝(杭州)信息技术有限公司 Processing method and device for protecting private information in pictures

Family Cites Families (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104021350B (zh) * 2014-05-13 2016-07-06 小米科技有限责任公司 Privacy information hiding method and device
CN107704877B (zh) * 2017-10-09 2020-05-29 哈尔滨工业大学深圳研究生院 Image privacy perception method based on deep learning
CN108257116A (zh) * 2017-12-30 2018-07-06 清华大学 Method for generating adversarial images
CN108364018A (zh) * 2018-01-25 2018-08-03 北京墨丘科技有限公司 Protection method for labeled data, terminal device, and system
CN109214973B (zh) * 2018-08-24 2020-10-27 中国科学技术大学 Adversarial secure carrier generation method against steganalysis neural networks
CN109815765A (zh) * 2019-01-21 2019-05-28 东南大学 Method and device for extracting business license information containing a QR code
CN109993212B (zh) * 2019-03-06 2023-06-20 西安电子科技大学 Location privacy protection method for picture sharing in social networks, and social network platform
CN110189253B (zh) * 2019-04-16 2023-03-31 浙江工业大学 Image super-resolution reconstruction method based on an improved generative adversarial network
CN110287720A (zh) * 2019-07-01 2019-09-27 国网内蒙古东部电力有限公司 Access control method based on image recognition and user levels
CN110516812A (zh) * 2019-07-19 2019-11-29 南京航空航天大学 AI model privacy protection method against membership inference attacks based on adversarial examples
CN110363183B (zh) * 2019-07-30 2020-05-08 贵州大学 Privacy protection method for service robot visual pictures based on generative adversarial networks
CN110473135B (zh) * 2019-07-31 2022-12-27 哈尔滨工业大学(深圳) Image processing method, system, readable storage medium, and intelligent device

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106600525A (zh) * 2016-12-09 2017-04-26 宇龙计算机通信科技(深圳)有限公司 Picture blurring processing method and system
CN107368752A (zh) * 2017-07-25 2017-11-21 北京工商大学 Deep differential privacy protection method based on generative adversarial networks
CN108366196A (zh) * 2018-01-25 2018-08-03 西安中科创达软件有限公司 Method for protecting picture privacy
CN111177757A (zh) * 2019-12-27 2020-05-19 支付宝(杭州)信息技术有限公司 Processing method and device for protecting private information in pictures

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
ANONYMOUS: "AI Frontline: Anti-Facial Recognition Systems Reduce the Effectiveness of Facial Detection to 0.5%", 2 June 2018 (2018-06-02), XP055824715, Retrieved from the Internet <URL:https://www.secrss.com/articles/3081> *
BOSE AVISHEK JOEY; AARABI PARHAM: "Adversarial Attacks on Face Detectors Using Neural Net Based Constrained Optimization", 2018 IEEE 20TH INTERNATIONAL WORKSHOP ON MULTIMEDIA SIGNAL PROCESSING (MMSP), IEEE, 29 August 2018 (2018-08-29), pages 1 - 6, XP033457792, DOI: 10.1109/MMSP.2018.8547128 *

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113628150A (zh) * 2021-07-05 2021-11-09 深圳大学 Attack image generation method, electronic device, and readable storage medium
CN113628150B (zh) * 2021-07-05 2023-08-08 深圳大学 Attack image generation method, electronic device, and readable storage medium
CN113536374A (zh) * 2021-07-15 2021-10-22 荣耀终端有限公司 Image privacy protection method and electronic device
CN114419719A (zh) * 2022-03-29 2022-04-29 北京爱笔科技有限公司 Biometric feature processing method and device

Also Published As

Publication number Publication date
CN111177757A (zh) 2020-05-19
TW202125298A (zh) 2021-07-01

Similar Documents

Publication Publication Date Title
WO2021129146A1 (zh) Processing method and device for protecting private information in pictures
CN109117831B (zh) Training method and device for an object detection network
US11481869B2 (en) Cross-domain image translation
US9928836B2 (en) Natural language processing utilizing grammar templates
WO2022089360A1 (zh) Face detection neural network and training methods, face detection method, and storage medium
JP5843207B2 (ja) Intuitive computing method and system
TW201923707A (zh) Image processing method and processing device
US11132800B2 (en) Real time perspective correction on faces
CN110019912A (zh) Shape-based graphics search
WO2022022043A1 (zh) Face image generation method and apparatus, server, and storage medium
WO2023035531A1 (zh) Text image super-resolution reconstruction method and related devices
EP3791356B1 (en) Perspective distortion correction on faces
CN111275784A (zh) Method and device for generating images
US20230137378A1 (en) Generating private synthetic training data for training machine-learning models
JP5832656B2 (ja) Method and apparatus for facilitating detection of text in an image
US10909357B1 (en) Image landmark detection
CN105096353A (zh) Image processing method and device
US11954883B2 (en) Long distance QR code decoding
US11823433B1 (en) Shadow removal for local feature detector and descriptor learning using a camera sensor sensitivity model
US10074033B2 (en) Using labels to track high-frequency offsets for patch-matching algorithms
CN111144466B (zh) Image-sample-adaptive deep metric learning method
CN115392216B (zh) Virtual avatar generation method and apparatus, electronic device, and storage medium
US9836799B2 (en) Service provision program
CN112102145A (zh) Image processing method and device
CN107730566A (zh) Method and apparatus for generating expressions, mobile terminal, and storage medium

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20906516

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20906516

Country of ref document: EP

Kind code of ref document: A1