CN118036057B - Method, system, storage medium and program product for protecting user privacy
- Publication number: CN118036057B
- Application number: CN202311859590A
- Authority: CN (China)
- Prior art keywords: information, data, user, model, privacy
- Legal status: Active
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F21/00—Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
- G06F21/60—Protecting data
- G06F21/62—Protecting access to data via a platform, e.g. using keys or access control rules
- G06F21/6218—Protecting access to data via a platform, e.g. using keys or access control rules to a system of files or objects, e.g. local or distributed file system or database
- G06F21/6245—Protecting personal data, e.g. for financial or medical purposes
Abstract
Embodiments of this specification disclose a method, system, storage medium, and program product for protecting user privacy, relating to the technical field of information security. The method comprises the following steps: acquiring personal information related to a user based on first data input by the user; determining privacy information from the personal information; encoding the privacy information to obtain an encoding vector corresponding to the privacy information; and obtaining second data generated based on the encoding vector and at least part of the information in the first data, wherein the second data comprises personalized reply content corresponding to the first data. With this method, personalized reply content can be generated from personal information that includes the user's privacy information while leakage of that privacy information during use is prevented, thereby improving both the accuracy of the personalized interactive service and the security of the user's privacy information.
Description
Technical Field
The present application relates to the field of information security technology, and in particular, to a method, system, storage medium and computer program product for protecting user privacy in personalized interactive services.
Background
With the development of artificial intelligence technology, intelligent interactive services are increasingly applied in various fields, such as role-playing intelligent chat, intelligent question answering, intelligent customer service, and voice assistants. These services often need to obtain a user's personal information in order to provide more accurate, personalized service. How to protect user privacy while providing personalized services is therefore a challenge.
Accordingly, there is a need for a method and system capable of protecting user privacy in personalized interactive services, thereby preventing disclosure of a user's private information while still delivering personalized service.
Disclosure of Invention
To solve the above problems, one aspect of the embodiments of the present specification provides a method for protecting user privacy in a personalized interactive service, the method comprising: acquiring personal information related to a user based on first data input by the user; determining privacy information from the personal information; encoding the privacy information to obtain an encoding vector corresponding to the privacy information; and obtaining second data generated based on the encoding vector and at least part of the information in the first data, wherein the second data comprises personalized reply content corresponding to the first data.
Another aspect of the embodiments of the present specification provides a method for protecting user privacy in a personalized interactive service, the method comprising: receiving an encoding vector and at least part of the information in first data input by a user, wherein the encoding vector is obtained by encoding the user's privacy information; processing the encoding vector and the at least part of the information in the first data through a trained second model to obtain second data; and sending the second data so as to display, to the user, personalized reply content corresponding to the question information contained in the first data.
Another aspect of the embodiments of the present specification provides a system for protecting user privacy in a personalized interactive service, the system comprising: a personal information acquisition module configured to acquire personal information related to a user based on first data input by the user; a privacy information determination module configured to determine privacy information from the personal information; an encoding module configured to encode the privacy information to obtain an encoding vector corresponding to the privacy information; and a reply content acquisition module configured to obtain second data generated based on the encoding vector and at least part of the information in the first data, wherein the second data comprises personalized reply content corresponding to the first data.
Another aspect of the embodiments of the present specification provides a system for protecting user privacy in a personalized interactive service, the system comprising: a receiving module configured to receive an encoding vector and at least part of the information in first data input by a user, wherein the encoding vector is obtained by encoding the user's privacy information; a processing module configured to process the encoding vector and the at least part of the information in the first data through a trained second model to obtain second data; and a sending module configured to send the second data so as to display, to the user, personalized reply content corresponding to the question information contained in the first data.
Another aspect of the embodiments of the present specification provides a system for protecting user privacy in a personalized interactive service, the system comprising: a personal information acquisition module configured to acquire personal information related to a user based on first data input by the user; a privacy information determination module configured to determine privacy information from the personal information; an encoding module configured to encode the privacy information to obtain an encoding vector corresponding to the privacy information; a processing module configured to process the encoding vector and at least part of the information in the first data through a trained second model to obtain second data, wherein the second data comprises personalized reply content corresponding to the first data; and a display module configured to display the personalized reply content included in the second data to the user.
Another aspect of the embodiments of the present specification provides a computer-readable storage medium storing computer instructions which, when executed by a processor, implement a method as described in any of the foregoing aspects.
Another aspect of the embodiments of the present specification provides a computer program product comprising a computer program or instructions which, when executed by a processor, perform a method as described in any of the foregoing aspects.
Additional features will be set forth in part in the description that follows, and in part will become apparent to those skilled in the art upon examination of the following description and drawings, or may be learned by production or operation of the examples. The features of the present specification may be realized and attained by practicing or using the various aspects of the methods, tools, and combinations set forth in the detailed examples below.
Drawings
The present specification will be further described by way of exemplary embodiments, which are described in detail with reference to the accompanying drawings. These embodiments are non-limiting, and like reference numerals in the embodiments indicate like steps or structures.
Fig. 1 is a schematic illustration of an exemplary application scenario of a system for protecting user privacy in a personalized interactive service, according to some embodiments of the present description.
Fig. 2 is an exemplary block diagram of a system for protecting user privacy in a personalized interactive service, according to some embodiments of the present description.
Fig. 3 is an exemplary block diagram of a system for protecting user privacy in personalized interactive services according to further embodiments of the present specification.
Fig. 4 is an exemplary block diagram of a system for protecting user privacy in a personalized interactive service according to further embodiments of the present specification.
Fig. 5 is an exemplary flow chart of a method for protecting user privacy in a personalized interactive service according to some embodiments of the present description.
Fig. 6 is an exemplary flow chart of a method for protecting user privacy in a personalized interactive service according to further embodiments of the present description.
Detailed Description
In order to more clearly illustrate the technical solutions of the embodiments of the present specification, the drawings required in the description of the embodiments are briefly introduced below. It is apparent that the drawings in the following description are only some examples or embodiments of the present specification, and those of ordinary skill in the art can apply the present specification to other similar situations according to these drawings without inventive effort. Unless otherwise apparent from the context or otherwise specified, like reference numerals in the figures refer to like structures or operations.
It should be appreciated that, as used in this specification, "system," "apparatus," "unit," and/or "module" are ways of distinguishing between different components, elements, parts, portions, or assemblies at different levels. However, these words may be replaced by other expressions that achieve the same purpose.
As used in this specification and the claims, the terms "a," "an," and/or "the" do not refer specifically to the singular and may include the plural, unless the context clearly dictates otherwise. In general, the terms "comprise" and "include" merely indicate inclusion of the explicitly identified steps and elements, which do not constitute an exclusive list; a method or apparatus may also include other steps or elements.
A flowchart is used in this specification to describe the operations performed by a system according to embodiments of the present specification. It should be appreciated that the operations are not necessarily performed precisely in order. Rather, the steps may be processed in reverse order or concurrently, and other operations may be added to, or removed from, these processes.
Taking role-playing intelligent chat (also called personalized chat) as an example, text data and/or voice data input by a user can be processed through a data processing model preconfigured at the user terminal 110 and/or the service terminal 120, and the model can interact with the user in different roles according to identities set by the user. For example, the model can chat with the user in the identity of a doctor, lawyer, or teacher, in the persona of a historical figure, or as a fictional character (which may be a character created by the user or a character from an animation), and so on. It can be understood that if the interactive product (such as a large question-answering model) knows certain real identity or character information (which may be the user's real information or information the user has invented), it can provide more accurate and more personalized interactive service, making the chat feel more real and immersing the user in the interaction, thereby improving the user experience.
Illustratively, in some embodiments, the user terminal 110 and/or the service terminal 120 may obtain information about the user's age, name, preferences, education, occupation, relationship history, medical history, and the like, and then input that information into the model. The model may then analyze the information and conduct a question-answer dialogue with the user based on the analysis.
However, from the user's perspective, it may be undesirable to pass private information on to a server (e.g., the service terminal 120). Therefore, how to protect user privacy while using the user's real information for personalized interaction is important.
In view of the above problems, the embodiments of the present disclosure provide a method and a system for protecting user privacy in personalized interactive services. The privacy information of a user is encoded to obtain a corresponding encoding vector, and the corresponding personalized reply content is then obtained based on that encoding vector. In this way, while personalized interaction is achieved, the user privacy information received by the service terminal 120 is no longer in plaintext, preventing leakage of the user's privacy information at the service terminal 120.
The following describes in detail a method and a system for protecting user privacy in personalized interactive services provided in the embodiments of the present specification with reference to the accompanying drawings.
Fig. 1 is a schematic illustration of an exemplary application scenario of a system for protecting user privacy in a personalized interactive service according to some embodiments of the present description.
Referring to fig. 1, in some embodiments, an application scenario 100 of a system for protecting user privacy in a personalized interactive service may include a user terminal 110, a service terminal 120, a storage device 130, and a network 140. The various components in the application scenario 100 may be connected in a variety of ways. For example, the user terminal 110 may be connected to the service terminal 120 and/or the storage device 130 through the network 140, or may be directly connected to the service terminal 120 and/or the storage device 130. As another example, service terminal 120 may be connected to storage device 130 via network 140 or directly.
User terminal 110 may receive, transmit, input, and/or output data. The data input by the user terminal 110 may include first data and/or operation instructions input by a user during the interaction process. In this embodiment of the present disclosure, the aforementioned first data may refer to interaction data input by a user during an interaction process, which may be in the form of text data, voice data, action data (e.g. gesture data, nodding or waving, blinking), and so on. For example, the user may ask a question by inputting text or voice, and during the interaction, the user terminal 110 may ask the user about the relevant personal information (i.e., output the question), at which time the user may respond to the query by inputting text, voice, or performing a specified action. It should be noted that, in the embodiment of the present specification, the user terminal 110 may output the foregoing query in text, voice, or any other feasible form.
The data transmitted by the user terminal 110 may include the aforementioned first data, an operation instruction, or data processed based on the first data and/or operation instruction. For example, in some embodiments, the user terminal 110 may obtain personal information related to the user based on the first data input by the user, determine privacy information from it, encode the privacy information to obtain a corresponding encoding vector, and finally send the encoding vector and at least part of the information in the first data to the service terminal 120 for processing. For another example, in some embodiments, the user terminal 110 may perform question information extraction on the first data to obtain target question information, and then send a data acquisition instruction related to the target question information to the service terminal 120.
The data received by the user terminal 110 may include data issued by the service terminal 120 and/or data acquired from the storage device 130. For example, in some embodiments, the service terminal 120 may generate second data based on the aforementioned encoding vector and at least part of the information in the first data, and issue the second data to the user terminal 110. For another example, in some embodiments, the service terminal 120 may, based on a data acquisition instruction transmitted from the user terminal 110, screen personal information related to the question information contained in the first data from preset information input by the user, and transmit that personal information to the user terminal 110. For another example, in some embodiments, the user terminal 110 may directly acquire personal information related to the target question information from preset information stored in the storage device 130. For another example, in some embodiments, the user terminal 110 may obtain pre-stored computer instructions from the storage device 130 and execute those instructions to implement the methods described herein for protecting user privacy in personalized interactive services.
The data output by the user terminal 110 may include the aforementioned query and second data generated based on the aforementioned encoding vector and at least some of the information in the first data. It should be noted that, in the embodiment of the present specification, the second data may include personalized reply content corresponding to the foregoing first data.
In some embodiments, user terminal 110 may include mobile device 111, tablet computer 112, laptop computer 113, or the like, or any combination thereof. For example, mobile device 111 may include a mobile phone, a Personal Digital Assistant (PDA), an in-vehicle terminal, a dedicated mobile terminal, and the like, or any combination thereof. In some embodiments, user terminal 110 may include input devices, which may include a keyboard, a touch screen, a microphone, a camera, etc., and output devices, which may include a display, speakers, etc.
The service terminal 120 may process data and/or information obtained from the user terminal 110, the storage device 130, and/or other components of the application scenario 100. In some embodiments, the service terminal 120 may obtain the encoding vector corresponding to the privacy information and at least part of the information in the first data from the user terminal 110 or the storage device 130, and then obtain the second data by processing them. In this embodiment of the present specification, the second data includes personalized reply content corresponding to the aforementioned first data. By displaying the second data to the user, a personalized reply to the first data input by the user can be realized.
It should be noted that, in the embodiments of the present disclosure, since the second data is obtained based on the encoding vector corresponding to the privacy information, it can provide the user with more accurate and more personalized reply content, making the man-machine interaction feel more real and more immersive, thereby improving the user experience. Meanwhile, because the user's privacy information is encoded and only the encoding vector is sent to the service terminal 120 for processing, the privacy information received by the service terminal 120 is no longer in plaintext, which prevents disclosure of the user's privacy information at the service terminal 120.
In some embodiments, the service terminal 120 may receive a data acquisition instruction related to the question information in the first data, acquire personal information related to the question information based on the instruction, and transmit that personal information to the user terminal 110. In some embodiments, the service terminal 120 may receive the target question information extracted from the first data, screen personal information related to the target question information from preset information input by the user, and detect whether that personal information satisfies a reply condition corresponding to the target question information. If it does, the personal information related to the target question information is sent; if it does not, the personal information is sent together with a query instruction so that the user can be asked to supply the personal information still needed to answer the target question.
In some embodiments, the service terminal 120 may retrieve pre-stored computer instructions from the storage device 130 and execute the computer instructions to implement the methods described herein for protecting user privacy in personalized interactive services.
In some embodiments, the service terminal 120 may be a single server or a server group. The server group may be centralized or distributed. In some embodiments, the service terminal 120 may be local or remote. For example, the service terminal 120 may access information and/or data from the user terminal 110 or the storage device 130 through the network 140. As another example, the service terminal 120 may be directly connected to the user terminal 110 and/or the storage device 130 to access information and/or data. In some embodiments, the service terminal 120 may be implemented on a cloud platform. For example, the cloud platform may include a private cloud, a public cloud, a hybrid cloud, a community cloud, a distributed cloud, an inter-cloud, a multi-cloud, and the like, or any combination thereof.
The network 140 may facilitate the exchange of information and/or data. The network 140 may include any suitable network capable of facilitating the exchange of information and/or data within the application scenario 100. In some embodiments, at least one component of the application scenario 100 (e.g., the user terminal 110, the service terminal 120, the storage device 130) may exchange information and/or data with at least one other component via the network 140. For example, the service terminal 120 may obtain the aforementioned encoding vector and at least part of the information in the first data from the user terminal 110 and/or the storage device 130 through the network 140. For another example, the service terminal 120 may obtain a data acquisition instruction from the user terminal 110 through the network 140 and issue, based on that instruction, the personal information related to the question information in the first data. For another example, the user terminal 110 may receive, through the network 140, second data generated based on the aforementioned encoding vector and at least part of the information in the first data.
In some embodiments, network 140 may be any form of wired or wireless network, or any combination thereof. By way of example only, the network 140 may include a cable network, a wired network, a fiber optic network, a telecommunications network, an intranet, the internet, a Local Area Network (LAN), a Wide Area Network (WAN), a Wireless Local Area Network (WLAN), a Metropolitan Area Network (MAN), a Public Switched Telephone Network (PSTN), a bluetooth network, a ZigBee network, a Near Field Communication (NFC) network, or the like, or any combination thereof. In some embodiments, the network 140 may include at least one network access point through which at least one component of the application scenario 100 may connect to the network 140 to exchange data and/or information.
Storage device 130 may store data, instructions, and/or any other information. In some embodiments, the storage device 130 may store data obtained from the user terminal 110 and/or the service terminal 120. For example, the storage device 130 may store first data acquired by the user terminal 110; the encoding vector obtained by encoding the aforementioned privacy information; or second data generated based on the encoding vector and at least part of the information in the first data. In some embodiments, the storage device 130 may store data and/or instructions that the service terminal 120 executes or uses to perform the exemplary methods described in this specification. In some embodiments, the storage device 130 may include mass storage, removable storage, volatile read-write memory, read-only memory (ROM), and the like, or any combination thereof. Exemplary mass storage may include magnetic disks, optical disks, solid-state disks, and the like. In some embodiments, the storage device 130 may be implemented on a cloud platform. For example only, the cloud platform may include a private cloud, a public cloud, a hybrid cloud, a community cloud, a distributed cloud, an inter-cloud, a multi-cloud, or the like, or any combination thereof.
In some embodiments, the storage device 130 may be connected to the network 140 to communicate with at least one other component (e.g., the user terminal 110, the service terminal 120) in the application scenario 100. At least one component in the application scenario 100 may access data, instructions, or other information stored in the storage device 130 through the network 140. In some embodiments, the storage device 130 may be directly connected or in communication with one or more components (e.g., user terminal 110) in the application scenario 100. In some embodiments, storage device 130 may be part of user terminal 110 and/or service terminal 120.
It should be noted that the above description about the application scenario 100 is only for illustration and description, and does not limit the application scope of the present specification. Various modifications and changes to the application scenario 100 may be made by those skilled in the art under the guidance of the present specification. However, such modifications and variations are still within the scope of the present description. For example, user terminal 110 and service terminal 120 may include more or fewer functional components.
Fig. 2-4 are schematic block diagrams of systems for protecting user privacy in personalized interactive services according to some embodiments of the present disclosure. In some embodiments, the systems shown in fig. 2 to 4 may be applied to the application scenario 100 shown in fig. 1, for example configured in software and/or hardware on the user terminal 110 and/or the service terminal 120, so as to prevent leakage of a user's private information while providing personalized services.
Referring to fig. 2, in some embodiments, a system 210 for protecting user privacy in a personalized interactive service may include a personal information acquisition module 211, a privacy information determination module 212, an encoding module 213, and a reply content acquisition module 214. Each module can be used to perform at least the following functions.
The personal information acquisition module 211 may be configured to acquire personal information related to the user based on first data input by the user.
The privacy information determination module 212 may be used to determine privacy information from the personal information.
The encoding module 213 may be configured to encode the privacy information to obtain an encoding vector corresponding to the privacy information.
The reply content acquisition module 214 may be configured to obtain second data generated based on the encoding vector and at least a portion of the information in the first data, wherein the second data includes personalized reply content corresponding to the first data.
It should be noted that, in some embodiments, the system 210 shown in fig. 2 may be applied to the user terminal 110. In some embodiments, the system 210 shown in fig. 2 may be applied to the service terminal 120.
Referring to fig. 3, in some embodiments, a system 220 for protecting user privacy in a personalized interactive service may include a receiving module 221, a processing module 222, and a transmitting module 223. Each module can be used to perform at least the following functions.
The receiving module 221 may be configured to receive an encoding vector and at least part of information in the first data input by the user, where the encoding vector is obtained by encoding privacy information of the user.
The processing module 222 may be configured to process the encoded vector and the at least part of the information in the first data through a trained second model to obtain second data.
The sending module 223 may be configured to send the second data so as to display, to the user, personalized reply content corresponding to the question information contained in the first data.
In some embodiments, the system 220 shown in fig. 3 may be applied to the user terminal 110. In some embodiments, the system 220 shown in fig. 3 may be applied to the service terminal 120.
Referring to fig. 4, in some embodiments, a system 230 for protecting user privacy in a personalized interactive service may include a personal information acquisition module 211, a privacy information determination module 212, an encoding module 213, a processing module 222, and a display module 231. Each module can be used to perform at least the following functions.
The personal information acquisition module 211 may be configured to acquire personal information related to the user based on first data input by the user.
The privacy information determination module 212 may be used to determine privacy information from the personal information.
The encoding module 213 may be configured to encode the privacy information to obtain an encoding vector corresponding to the privacy information.
The processing module 222 may be configured to process the encoding vector and the at least part of the information in the first data through a trained second model to obtain second data, where the second data includes personalized reply content corresponding to the first data.
The display module 231 may be configured to display the personalized reply content included in the second data to the user.
In some embodiments, the system 230 for protecting user privacy in a personalized interactive service may further include a first transmitting module, a first receiving module, a second transmitting module, and a second receiving module. Wherein the first transmitting module may be configured to transmit the encoded vector and at least part of the information in the first data; the first receiving module may be configured to receive the encoded vector and the at least part of the information in the first data; the second transmitting module may be configured to transmit the second data; the second receiving module may be configured to receive the second data.
In some embodiments, the first sending module and the second receiving module may be part of the reply content acquisition module 214 described previously. In some embodiments, the first receiving module may refer to the aforementioned receiving module 221, or may be considered as part of the aforementioned receiving module 221. Similarly, the second transmitting module may refer to the aforementioned transmitting module 223, or the second transmitting module may be considered as a part of the aforementioned transmitting module 223.
For more details on the above modules, reference may be made to other positions (e.g. fig. 5-6 and related descriptions) in this specification, and details are not repeated here.
It should be appreciated that the systems for protecting user privacy in personalized interactive services and the modules thereof shown in fig. 2-4 may be implemented in a variety of ways. For example, in some embodiments, the system and its modules may be implemented in hardware, software, or a combination of software and hardware. The hardware portion may be implemented using dedicated logic; the software portion may be stored in a memory and executed by a suitable instruction execution system, such as a microprocessor or specially designed hardware. Those skilled in the art will appreciate that the methods and systems described above may be implemented using computer-executable instructions and/or embodied in processor control code, for example provided on a carrier medium such as a magnetic disk, CD, or DVD-ROM, on a programmable memory such as read-only memory (firmware), or on a data carrier such as an optical or electronic signal carrier. The system of the present specification and its modules may be implemented with hardware circuits such as very-large-scale integrated circuits or gate arrays, with semiconductors such as logic chips and transistors, or with programmable hardware devices such as field-programmable gate arrays and programmable logic devices; with software executed by various types of processors; or with a combination of the above hardware circuits and software (e.g., firmware).
It should be noted that the above description of a system for protecting user privacy in a personalized interactive service is provided for illustrative purposes only and is not intended to limit the scope of the present description. It will be appreciated by those skilled in the art that, given the principles of the system, the various modules may be combined arbitrarily, or a subsystem may be constructed in connection with other modules, without departing from those principles. For example, each of the above modules may be a different module in one system, or the functions of two or more modules may be implemented by one module. Such variations are within the scope of the present description. In some embodiments, the foregoing modules may be part of the service terminal 120 and/or the user terminal 110.
Fig. 5 is an exemplary flow chart of a method for protecting user privacy in a personalized interactive service according to some embodiments of the present description. In some embodiments, method 300 may be performed by processing logic that may comprise hardware (e.g., circuitry, dedicated logic, programmable logic, microcode, etc.), software (instructions run on a processing device to perform hardware simulation), or the like, or any combination thereof. In some embodiments, one or more operations in the flowchart of the method 300 for protecting user privacy in personalized interactive services shown in fig. 5 may be implemented by the service terminal 120 and/or the user terminal 110 shown in fig. 1. For example, the method 300 may be stored in the storage device 130 in the form of a computer program or instructions and invoked and/or executed by the service terminal 120 and/or the user terminal 110.
Referring to fig. 5, in some embodiments, a method 300 for protecting user privacy in a personalized interactive service may include the following steps.
Step 310, personal information related to a user is acquired based on first data input by the user. In some embodiments, step 310 may be performed by personal information acquisition module 211.
In some embodiments, the user may input first data at the user terminal 110, where the first data may include text data and/or voice data, and the user terminal 110 may process the first data, or send the first data to the service terminal 120 for processing, so as to obtain reply content corresponding to the first data. In some embodiments, the user terminal 110 and/or the service terminal 120 may generate the reply content corresponding to the foregoing first data based on the personal information of the user, so that the reply content is more accurate and more accords with the actual requirement of the user.
In some embodiments, the personal information acquisition module 211 may screen personal information related to the question information contained in the first data from preset information input by the user. The preset information input by the user may refer to information input by the user before the interaction, for example, user identity information input when the user registers an account or modifies account information, or character information input when the user sets an interactive character before starting the interaction. In some embodiments, the user may set the roles of both parties to the interaction before commencing the interaction, thereby interacting in the form of role-playing.
In some embodiments, the question information contained in the first data may refer to question information extracted from the latest piece of data input by the user. In some embodiments, it may refer to question information extracted after comprehensively analyzing the latest piece of data input by the user together with historical data input by the user. In some embodiments, the personal information related to the question information contained in the first data may refer to the conditions that need to be considered in answering the question, which may include the user's own information and information related to the user (e.g., environmental information about the user's surroundings).
Illustratively, when the question information entered by the user is "I have symptoms of dizziness and nausea; what illness might I have?", the personal information related to the question may include, but is not limited to, past medical history (e.g., heart disease, hypertension), diet and medication (e.g., taking medications that may cause dizziness and nausea, drinking alcohol, eating food that may carry a risk of poisoning, not eating for a long time), environmental factors (e.g., pungent odors, high/low temperature, altitude), recent activities (e.g., a head impact), other symptoms (e.g., vomiting, headache, blurred vision, tinnitus), and the like.
In some embodiments, the personal information acquisition module 211 may also extract personal information related to the question information from the first data itself. Here, the first data may refer to all data entered by the user during the interaction, which may include one or more pieces of data. In other words, in some embodiments, the personal information acquisition module 211 may extract personal information related to the question information from the current interaction data and/or historical interaction data entered by the user.
In some embodiments, the aforementioned preset information may be input through the user terminal 110 and stored locally at the user terminal 110 (e.g., in a storage device 130 configured locally at the user terminal 110), and the personal information acquisition module 211 may perform question information extraction on the first data to obtain the target question information. As described above, in some embodiments, the personal information acquisition module 211 may extract the question information from the latest piece of data input by the user to obtain the target question information. In some embodiments, it may also comprehensively analyze the latest piece of data together with historical data input by the user and then extract the question information to obtain the target question information. Further, the personal information acquisition module 211 may screen personal information related to the target question information from the preset information stored locally at the user terminal 110.
In some embodiments, the personal information acquisition module 211 may screen personal information related to the target question information through a preconfigured relationship table or model. Illustratively, in some embodiments, the personal information related to each question (or question type) may be preconfigured, and the personal information related to the target question information may subsequently be screened by table lookup. In some embodiments, the similarity between the target question information and a preconfigured preset question may be calculated, and when the similarity is greater than or equal to a preset threshold, the relevant personal information corresponding to the preset question may be used as the personal information relevant to the target question information. In some embodiments, the preset threshold may be set according to actual requirements, for example, to 85% or another larger or smaller value.
In some embodiments, the personal information related to the target question information may also be identified by a pre-trained neural network model, where the neural network model may be trained using a number of sample questions and labels corresponding to the sample questions. The label may include the relevant personal information corresponding to the sample question.
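The relationship-table and similarity-threshold screening described above can be sketched as follows. This is a minimal illustration rather than the patent's implementation: the table contents, field names, and the use of a crude string-similarity measure in place of a trained semantic model are all assumptions.

```python
from difflib import SequenceMatcher

# Hypothetical relationship table: preset question -> relevant personal-info fields.
RELATION_TABLE = {
    "what illness might i have": ["medical_history", "medication", "diet",
                                  "environment", "recent_activity", "symptoms"],
}

SIMILARITY_THRESHOLD = 0.85  # the "preset threshold" mentioned above

def similarity(a: str, b: str) -> float:
    # Crude string similarity; a stand-in for a real semantic-similarity model.
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def screen_personal_info(target_question: str, preset_info: dict) -> dict:
    """Return the subset of preset_info relevant to the target question."""
    for preset_q, fields in RELATION_TABLE.items():
        if similarity(target_question, preset_q) >= SIMILARITY_THRESHOLD:
            return {k: v for k, v in preset_info.items() if k in fields}
    return {}
```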
In some embodiments, the aforementioned preset information may be input through the user terminal 110 and then transmitted to the service terminal 120 (e.g., to a storage device 130 configured at the service terminal 120) for storage. The personal information acquisition module 211 may extract the question information from the first data to obtain the target question information, send a data acquisition instruction related to the target question information to the service terminal 120, and receive the personal information related to the target question issued by the service terminal 120 based on that instruction. In the embodiments of the present specification, a data acquisition instruction related to the target question information may be understood as an instruction for acquiring personal information related to the target question; the instruction may identify one or more pieces of personal information needed to answer the target question. In some embodiments, the data acquisition instruction may be generated based on the personal information required to answer the target question information.
In some embodiments, the personal information acquisition module 211 may extract personal information related to the target question from the first data input by the user. For example, the user may input first data describing his or her own situation during the interaction, and the personal information acquisition module 211 may obtain the personal information related to the target question by extraction from that first data.
In some embodiments, the aforementioned first data may include data entered by the user in response to a query from the user terminal 110. Specifically, in some embodiments, the personal information acquisition module 211 may detect whether the personal information related to the target question information screened from the preset information satisfies the reply condition corresponding to the question information. If it does, the reply corresponding to the target question information can be completed based on the currently screened information, and subsequent processing may proceed on that basis. If it does not, the reply cannot be completed based on the currently screened information (for example, a required item is missing); in that case, the user may be queried to obtain the required information, and personal information related to the target question information may then be extracted from the first data the user inputs in response to the query.
In some embodiments, the personal information acquisition module 211 may determine, through a preconfigured detection table, whether the personal information related to the target question information screened from the preset information satisfies the reply condition corresponding to the question information. Specifically, in some embodiments, the detection table may include the necessary information corresponding to each of a plurality of preset questions. The personal information acquisition module 211 may look up the preset question corresponding to the target question information in the detection table and compare the screened personal information with the necessary information corresponding to that preset question. When the screened personal information completely covers the necessary information corresponding to the target question information, the reply condition is satisfied; otherwise, it is not.
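A sketch of the detection-table check just described, under the assumption that both the screened personal information and the table's necessary information are keyed by field name; the table contents here are hypothetical.

```python
# Hypothetical detection table: preset question -> fields that must be present
# before the question can be answered (the "necessary information").
DETECTION_TABLE = {
    "what illness might i have": {"medical_history", "symptoms", "medication"},
}

def meets_reply_condition(preset_question: str, screened_info: dict) -> bool:
    """Reply condition holds only if screened info fully covers the necessary fields."""
    required = DETECTION_TABLE.get(preset_question, set())
    missing = required - set(screened_info)
    if missing:
        # In the described flow, the user terminal would now query the user for
        # each missing item and re-extract it from the user's reply.
        return False
    return True
```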
In some embodiments, the personal information acquisition module 211 may acquire the personal information related to the question information contained in the first data in any of the above ways. In some embodiments, it may acquire that personal information both from the preset information and from the first data.
Step 320, determining privacy information from the personal information. In some embodiments, step 320 may be performed by the privacy information determination module 212.
The personal information required to answer the aforementioned target question information may be obtained through step 310, but it may include private details that some users do not wish to disclose. For this reason, in some embodiments of the present description, in order to protect the user's personal privacy, the privacy information determination module 212 may determine the privacy information to be protected from the personal information acquired in step 310.
In some embodiments, the privacy information determination module 212 may determine the privacy information to be protected from the foregoing personal information by means of sensitive-word recognition. In some embodiments, the foregoing personal information may be presented to the user through the user terminal 110, and the privacy information determination module 212 may determine the privacy information to be protected in response to the user's operation. Illustratively, in some embodiments, the user terminal 110 may present the personal information obtained in step 310 to the user through a user interface, and the user may select or deselect the privacy information to be protected through the touch screen, key operations, and/or voice. In some embodiments, the user terminal 110 may output the personal information acquired in step 310 to the user through voice, and the user may designate the privacy information to be protected through the touch screen, key operations, and/or voice.
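A minimal sketch of the sensitive-word recognition route, assuming the personal information arrives as field/value pairs. The sensitive lexicon below is hypothetical; a production system would likely use a maintained lexicon or a trained classifier instead.

```python
# Hypothetical sensitive lexicon; matching on field names for simplicity.
SENSITIVE_FIELDS = {"medical_history", "medication", "income",
                    "home_address", "relationship_history"}

def determine_privacy_info(personal_info: dict) -> dict:
    """Return the subset of personal information to be treated as privacy information."""
    return {k: v for k, v in personal_info.items() if k in SENSITIVE_FIELDS}
```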
In some embodiments, out of concern about privacy disclosure, the user may have held back when filling in the preset information and during earlier interaction (for example, by not entering all information, or by not entering real information). When the user operates the step of determining the privacy information to be protected, his or her trust may increase to some degree, so there may be a need to supplement personal information at this point. Accordingly, to satisfy the user's need to supplement personal information, in some embodiments the privacy information determination module 212 may, in response to the user's operation, add additional information not included in the aforementioned personal information (i.e., the personal information determined in step 310) and treat the additional information as privacy information to be protected.
In some embodiments, the additional information that the user supplements may conflict with part of the aforementioned personal information (e.g., the user entered false information before supplementing the additional information). In that case, the privacy information determination module 212 may use the additional information to replace the corresponding portion of the personal information determined in step 310.
In some embodiments, the additional information entered by the user may be expressed informally. In this case, to ensure that the additional information can be used normally in subsequent processing, semantic extraction may be performed on the additional information to obtain a corresponding semantic text, and the semantic text may then be used as the privacy information to be protected.
Step 330, encoding the privacy information to obtain an encoding vector corresponding to the privacy information. In some embodiments, step 330 may be performed by the encoding module 213.
In some embodiments, in order to protect the foregoing privacy information from disclosure during use, the encoding module 213 may encode the privacy information determined in step 320 to obtain the encoding vector corresponding to the privacy information. In this specification, the encoding vector has a mapping relationship with the privacy information; in other words, the encoding vector carries the user's privacy information. By converting the user's privacy information into an encoding vector and then using that vector, corresponding personalized reply content can be generated from the user's personal information (including the privacy information) in subsequent processing while leakage of the user's privacy during use is prevented, thereby improving both the accuracy and the privacy security of the personalized interactive service.
In some embodiments, the foregoing encoding vector may be obtained by processing the privacy information determined in step 320 through a trained first model. The first model may have an embedding function (i.e., the ability to convert unstructured data such as text, images, or voice into structured vector data), which can convert the user's privacy information into an encoding vector. The encoding vector is a numerical representation of the text or voice data corresponding to the privacy information. In some embodiments, the encoding vector may be a high-dimensional vector containing features corresponding to multiple dimensions; here, a high-dimensional vector may refer to a vector whose number of dimensions is greater than a particular threshold, such as 50, 100, 200, or 1000. In some embodiments, the first model may be configured on the user terminal 110 or on another third-party device recognized by the user. For the training of the first model, refer to other parts of this specification (e.g., step 340 and the related descriptions), which are not repeated here.
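As an illustration of this encoding step, the sketch below implements a "first model" as a tiny PyTorch embedding module. The architecture, vocabulary size, and 200-dimension output are assumptions for illustration; the scheme only requires that the privacy text be mapped to a numerical vector on the user's side.

```python
import torch
import torch.nn as nn

class FirstModel(nn.Module):
    """Client-side encoder: maps tokenized privacy text to one encoding vector."""
    def __init__(self, vocab_size: int = 10000, dim: int = 200):
        super().__init__()
        self.embed = nn.EmbeddingBag(vocab_size, dim)  # the embedding function

    def forward(self, token_ids: torch.Tensor) -> torch.Tensor:
        # token_ids: 1-D tensor of token ids for the privacy text.
        return self.embed(token_ids.unsqueeze(0)).squeeze(0)

# Usage: tokenize the privacy text (tokenizer omitted), encode locally, and send
# only the resulting vector onward -- the plaintext never leaves the device.
first_model = FirstModel()
encoding_vector = first_model(torch.tensor([17, 342, 98]))  # hypothetical token ids
```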
Step 340, obtaining second data generated based on the encoding vector and at least part of the information in the first data, wherein the second data comprises personalized reply content corresponding to the question information in the first data. In some embodiments, step 340 may be performed by the reply content acquisition module 214.
After the encoding vector corresponding to the user's privacy information has been obtained through the above steps, the reply content acquisition module 214 may obtain second data generated based on the encoding vector and at least part of the information in the first data, where the at least part of the information includes the aforementioned question information and the second data includes the personalized reply content corresponding to that question information.
In some embodiments, the second data may be obtained by processing at least part of the information in the first data and the encoding vector through a trained second model. In some embodiments, the second model may be configured at the user terminal 110 or the service terminal 120. Specifically, when the second model is configured at the user terminal 110, the reply content acquisition module 214 may process the encoding vector and the question information contained in the first data locally at the user terminal 110 to obtain the second data. When the second model is configured at the service terminal 120, the reply content acquisition module 214 may transmit the encoding vector and the question information contained in the first data to the service terminal 120, and then receive the second data generated by the service terminal 120 based on the encoding vector and that question information.
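The two deployment options above might be organized as in the following sketch; the `SecondModel` interface, the transport, and all names are assumptions made for illustration rather than part of the disclosed system:

```python
import json
from dataclasses import dataclass
from typing import Optional, Protocol, Sequence

class SecondModel(Protocol):
    def generate(self, encoding_vector: Sequence[float], question: str) -> str: ...

@dataclass
class ReplyContentAcquisition:
    """Dispatches second-model inference locally (user terminal 110) or
    remotely (service terminal 120), mirroring the two deployments above."""
    local_model: Optional[SecondModel] = None
    service_url: Optional[str] = None

    def get_second_data(self, encoding_vector: Sequence[float], question: str) -> str:
        if self.local_model is not None:
            # Second model configured at the user terminal: run locally.
            return self.local_model.generate(encoding_vector, question)
        # Second model at the service terminal: only the encoding vector and
        # the question information leave the device, never the plaintext
        # privacy information.
        payload = json.dumps({"vector": list(encoding_vector), "question": question})
        return self._post(self.service_url, payload)

    def _post(self, url: str, payload: str) -> str:
        raise NotImplementedError("transport layer omitted from this sketch")
```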
In the embodiments of the present specification, the second model is similar to the first model and may also have an Embedding function. In some embodiments, the second model may be trained in two phases. In the first training phase (also referred to as the preliminary training phase), the second model may be trained using a number of pieces of sample personal information (plaintext data) and sample question information; after this phase, the second model acquires the ability to convert sample personal information into an encoding vector. In subsequent processing, the second model may therefore process personal information that has not undergone encoding, together with the question information contained in the first data, to obtain a corresponding encoding vector.
In some embodiments, after the second model completes the first training phase, the part of the second model that provides the Embedding function (i.e., the part that generates the encoding vector) may be split off and used as the trained first model. In other embodiments, after the second model completes the first training phase, the first model may be trained using intermediate layer data of the second model (for example, the sample personal information and the corresponding encoding vectors) as training data, to obtain the trained first model. By way of example only, in some embodiments, the foregoing sample personal information may be input into the first model, with the encoding vector obtained by the preliminarily trained second model processing the same sample personal information used as the label; training continues until a preset number of iterations or a loss threshold is reached, yielding the aforementioned trained first model.
After the first model has been trained, it may be used to encode the user's privacy information, converting the privacy information into a corresponding encoding vector. Further, the encoding vectors output by the trained first model and the corresponding sample question information may be used as training data for the secondary training of the second model, with the sample reply content corresponding to each encoding vector and its sample question information serving as the target output; the second model thereby acquires the ability to generate personalized reply content based on an encoding vector and question information. In some embodiments, the second model may generate the personalized reply content corresponding to the encoding vector and the question information based on a knowledge graph.
It should be noted that, in the embodiments of the present specification, by using the intermediate layer data of the preliminarily trained second model as the training data of the first model, and using the encoding vector output by the trained first model as the input for the secondary training of the second model, the second model can better learn the deviation between the first model and itself, thereby further improving the accuracy of the second model. In some embodiments, by splitting off the part of the second model that provides the Embedding function and using it as the first model, the output of the first model can better match the input of the second model, which reduces information loss and noise while improving the training efficiency and quality of the first model and avoiding additional design and adjustment.
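As a concrete (and deliberately toy) illustration of the two-phase scheme described above, the sketch below shows both ways of obtaining the first model from a preliminarily trained second model: splitting off its Embedding part, or distilling a separate model against its intermediate-layer data. PyTorch is used purely for illustration; the architecture and dimensions are hypothetical, not the patent's:

```python
import torch
import torch.nn as nn

SEQ, VOCAB, DIM = 16, 1000, 128

def make_embedder() -> nn.Sequential:
    # Token embedding -> flatten -> projection down to the encoding vector.
    return nn.Sequential(nn.Embedding(VOCAB, DIM), nn.Flatten(),
                         nn.Linear(SEQ * DIM, DIM))

class SecondModel(nn.Module):
    """Toy stand-in for the second model: an Embedding stage (whose output
    plays the role of the encoding vector) plus a reply-generation head."""
    def __init__(self):
        super().__init__()
        self.embedder = make_embedder()
        self.generator = nn.Linear(DIM + DIM, VOCAB)

    def forward(self, personal_ids, question_vec):
        encoding_vector = self.embedder(personal_ids)  # intermediate-layer data
        return self.generator(torch.cat([encoding_vector, question_vec], dim=-1))

second = SecondModel()  # assume the preliminary training phase is already done

# Option A: split the Embedding part off and use it as the trained first model.
first_model = second.embedder

# Option B: train a separate first model against the second model's
# intermediate-layer data (sample personal info -> encoding-vector label).
first_distilled = make_embedder()
optimizer = torch.optim.Adam(first_distilled.parameters(), lr=1e-3)
for _ in range(100):  # until a preset iteration count or loss threshold
    ids = torch.randint(0, VOCAB, (8, SEQ))  # toy batch of sample personal info
    with torch.no_grad():
        target = second.embedder(ids)        # encoding-vector labels
    loss = nn.functional.mse_loss(first_distilled(ids), target)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
```

The secondary training of the second model would then proceed analogously, feeding `first_distilled` outputs and question vectors forward and supervising against sample reply content.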
Fig. 6 is an exemplary flow chart of a method for protecting user privacy in a personalized interactive service according to further embodiments of the present description. Similar to method 300, method 400 may be performed by processing logic. In some embodiments, one or more operations in the flowchart of the method 400 for protecting user privacy in personalized interactive services shown in fig. 6 may be implemented by the service terminal 120 and/or the user terminal 110 shown in fig. 1.
Referring to fig. 6, in some embodiments, a method 400 for protecting user privacy in a personalized interactive service may include the following steps.
In step 410, an encoding vector and at least part of the information in first data input by a user are received, wherein the encoding vector is obtained by encoding the privacy information of the user. In some embodiments, step 410 may be performed by the receiving module 221.
In some embodiments, step 410 may be implemented by the user terminal 110 or the service terminal 120. Specifically, when step 410 is implemented by the user terminal 110, the receiving module 221 may receive at least part of the information in the first data input by the user and the encoding vector transmitted by other modules of the user terminal 110. When step 410 is implemented by the service terminal 120, the receiving module 221 may receive at least part of the information in the first data input by the user and the encoding vector transmitted by the user terminal 110. As described above, in the embodiments of the present specification, at least part of the information in the first data includes the question information. In some embodiments, the question information may be plaintext data or data processed in some other manner.
It should be noted that, in some embodiments, the data received by the receiving module 221 may include, but is not limited to, the foregoing encoding vector and the question information contained in the first data. For example, in some embodiments, the data received by the receiving module 221 may further include personal information related to the question information in the first data that has not been encoded by the encoding module 213.
In some embodiments, the question information and/or the personal information not yet encoded may also be encoded by the encoding module 213. In other words, the data received by the receiving module 221 may be an encoding vector obtained by encoding the question information contained in the first data together with an encoding vector obtained by encoding all of the personal information obtained in step 310 above.
Step 420, processing the encoding vector and the at least part of the information in the first data through a trained second model to obtain second data. In some embodiments, step 420 may be performed by the processing module 222.
Similar to step 410, step 420 may be implemented by the user terminal 110 or the service terminal 120. When this step is implemented by the user terminal 110, the second model may be configured at the user terminal 110; when it is implemented by the service terminal 120, the second model may be configured at the service terminal 120. Further details regarding the second model can be found elsewhere in this specification (e.g., the relevant discussion of fig. 5) and are not repeated here.
Step 430, sending the second data to display the personalized reply content corresponding to the question information contained in the first data to the user. In some embodiments, step 430 may be performed by the sending module 223.
In some embodiments, step 430 may be implemented by user terminal 110. Specifically, when this step is implemented by the user terminal 110, the sending module 223 may send the second data processed by the processing module 222 to other modules (for example, a display module) of the user terminal 110, so as to display the personalized reply content corresponding to the question information included in the first data to the user.
In some embodiments, step 430 may also be implemented by the service terminal 120. When this step is implemented by the service terminal 120, the sending module 223 may send the second data processed by the processing module 222 to the user terminal 110, which then displays the personalized reply content corresponding to the question information included in the first data to the user.
In some embodiments, steps 310-340 may be implemented by the user terminal 110, and steps 410-430 may be implemented by the service terminal 120. During interaction between the user terminal 110 and the service terminal 120, the user terminal 110 may extract the first data to obtain the target question information, and then send a data acquisition instruction related to the target question information to the service terminal 120. The service terminal 120 may receive the data acquisition instruction, acquire personal information related to the target question information based on the instruction, and then transmit that personal information to the user terminal 110. Finally, the user terminal 110 receives the personal information related to the aforementioned target question information issued by the service terminal 120 based on the data acquisition instruction.
In some embodiments, the user terminal 110 may extract the aforementioned first data to obtain the target question information and then send the target question information to the service terminal 120. The service terminal 120 may receive the target question information and screen personal information related to it from the preset information input by the user. The service terminal 120 may then detect whether the personal information related to the target question information satisfies the corresponding reply condition: if so, it transmits that personal information to the user terminal 110; if not, it transmits an inquiry instruction to inquire about personal information regarding the target question information while transmitting the personal information related to the target question information to the user terminal 110, thereby guiding the user to supplement the information necessary to answer the target question.
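A minimal sketch of this screening-and-inquiry exchange on the service terminal side is given below; the reply conditions, field names, and data shapes are hypothetical and illustrative only:

```python
from dataclasses import dataclass, field

# Hypothetical reply conditions: the fields required to answer a given topic.
REPLY_CONDITIONS = {"medication advice": {"age", "allergies"}}

@dataclass
class ServiceTerminal:
    preset_info: dict = field(default_factory=dict)  # user's preset information

    def handle_target_question(self, target_question: str) -> dict:
        required = REPLY_CONDITIONS.get(target_question, set())
        # Screen the personal information related to the target question.
        related = {k: v for k, v in self.preset_info.items() if k in required}
        missing = required - related.keys()
        response = {"personal_info": related}
        if missing:
            # Reply condition not met: also send an inquiry instruction so the
            # user terminal can guide the user to supplement what is missing.
            response["inquiry"] = sorted(missing)
        return response

service = ServiceTerminal(preset_info={"age": 54})
print(service.handle_target_question("medication advice"))
# -> {'personal_info': {'age': 54}, 'inquiry': ['allergies']}
```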
In the embodiments of the present disclosure, when it is detected that the currently available information does not meet the reply condition corresponding to the target question information, an inquiry instruction is sent to guide the user to supplement the information necessary to answer the target question, so that the reference information is more comprehensive, the personalized reply content obtained in subsequent processing is more accurate, and the user experience is improved.
In summary, the possible benefits of the embodiments of the present disclosure include, but are not limited to, the following: (1) in the method and system for protecting user privacy in a personalized interactive service provided by some embodiments of the present disclosure, by converting the user's privacy information into an encoding vector and then using the encoding vector, corresponding personalized reply content can be generated in subsequent processing based on personal information that includes the user's privacy information, while user privacy disclosure during information use is prevented, improving the security of the user's privacy information as well as the accuracy of the personalized interactive service; (2) by using the intermediate layer data of the preliminarily trained second model as training data for the first model, and using the encoding vector output by the trained first model as input for the secondary training of the second model, the second model can better learn the deviation between the first model and itself, further improving the accuracy of the second model; (3) by splitting off the part of the second model that provides the Embedding function and using it as the first model, the output of the first model better matches the input of the second model, reducing information loss and noise while improving the training efficiency and quality of the first model and avoiding additional design and adjustment; (4) when it is detected that the current information does not meet the reply condition corresponding to the target question information, an inquiry instruction is sent to guide the user to supplement the information necessary to answer the target question, making the reference information more comprehensive, the personalized reply content obtained in subsequent processing more accurate, and the user experience better.
It should be noted that different embodiments may yield different benefits; in any given embodiment, the benefit may be any one or a combination of those described above, or any other benefit that may be obtained.
The present description also provides a computer-readable storage medium that may be used to store computer instructions that, when executed by a processor, may implement the method for protecting user privacy in a personalized interactive service described in any of the embodiments of the present description. For more details on this method, reference is made to the above, and no further description is given here.
The embodiments of the present specification also provide a computer program product comprising a computer program or instructions which, when executed by a processor, implement the method for protecting user privacy in a personalized interactive service described in any of the embodiments of the present specification.
While the basic concepts have been described above, it will be apparent to those skilled in the art that the foregoing detailed disclosure is by way of example only and is not intended to be limiting. Although not explicitly stated herein, various modifications, improvements, and adaptations to the present disclosure may occur to those skilled in the art. Such modifications, improvements, and adaptations are suggested by this specification and are intended to fall within the spirit and scope of its exemplary embodiments.
Meanwhile, the specification uses specific words to describe the embodiments of the specification. Reference to "one embodiment," "an embodiment," and/or "some embodiments" means that a particular feature, structure, or characteristic is associated with at least one embodiment of the present description. Thus, it should be emphasized and should be appreciated that two or more references to "an embodiment" or "one embodiment" or "an alternative embodiment" in various positions in this specification are not necessarily referring to the same embodiment. Furthermore, certain features, structures, or characteristics of one or more embodiments of the present description may be combined as suitable.
Furthermore, those skilled in the art will appreciate that the various aspects of the specification can be illustrated and described in terms of several patentable categories or circumstances, including any novel and useful process, machine, product, or composition of matter, or any novel and useful improvement thereof. Accordingly, aspects of the present description may be performed entirely by hardware, entirely by software (including firmware, resident software, micro-code, etc.), or by a combination of hardware and software. The above hardware or software may be referred to as a "data block," "module," "engine," "unit," "component," or "system." Furthermore, aspects of the specification may take the form of a computer product, comprising computer-readable program code, embodied in one or more computer-readable media.
The computer storage medium may contain a propagated data signal with the computer program code embodied therein, for example, on a baseband or as part of a carrier wave. The propagated signal may take on a variety of forms, including electro-magnetic, optical, etc., or any suitable combination thereof. A computer storage medium may be any computer readable medium that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code located on a computer storage medium may be propagated through any suitable medium, including radio, cable, fiber optic cable, RF, or the like, or a combination of any of the foregoing.
The computer program code necessary for the operation of portions of the present description may be written in any one or more programming languages, including object-oriented programming languages such as Java, Scala, Smalltalk, Eiffel, JADE, Emerald, C++, C#, VB.NET, and Python; conventional procedural programming languages such as C, Visual Basic, Fortran 2003, Perl, COBOL 2002, PHP, and ABAP; dynamic programming languages such as Python, Ruby, and Groovy; or other programming languages. The program code may execute entirely on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or processing device. In the latter scenario, the remote computer may be connected to the user's computer through any form of network, such as a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet), or a service such as software as a service (SaaS) may be used in a cloud computing environment.
Furthermore, the order in which elements and sequences are processed, the use of numbers or letters, or other designations in this specification are not intended to limit the order of the processes and methods of this specification unless explicitly recited in the claims. While certain presently useful inventive embodiments have been discussed in the foregoing disclosure by way of various examples, it is to be understood that such details are merely illustrative and that the appended claims are not limited to the disclosed embodiments; on the contrary, they are intended to cover all modifications and equivalent arrangements that come within the spirit and scope of the embodiments of the present disclosure. For example, while the system components described above may be implemented by hardware devices, they may also be implemented solely by software solutions, such as by installing the described system on an existing processing device or mobile device.
Likewise, it should be noted that, in order to simplify the presentation disclosed in this specification and thereby aid in the understanding of one or more inventive embodiments, various features are sometimes grouped together in a single embodiment, figure, or description thereof. This method of disclosure, however, does not imply that the subject matter of the present description requires more features than are recited in the claims. Indeed, the claimed subject matter may lie in less than all of the features of a single embodiment disclosed above.
Finally, it should be understood that the embodiments described in this specification are merely illustrative of the principles of the embodiments of this specification. Other variations are possible within the scope of this description. Thus, by way of example, and not limitation, alternative configurations of embodiments of the present specification may be considered as consistent with the teachings of the present specification. Accordingly, the embodiments of the present specification are not limited to only the embodiments explicitly described and depicted in the present specification.
Claims (15)
1. A method for protecting user privacy in a personalized interactive service, the method comprising:
acquiring personal information related to a user based on first data input by the user; the first data includes question information;
determining privacy information from the personal information;
performing encoding processing on the privacy information to obtain an encoding vector corresponding to the privacy information;
obtaining second data generated based on the encoding vector and at least part of the information in the first data, wherein the second data comprises personalized reply content corresponding to the question information in the first data;
The encoding vector is obtained by processing the privacy information through a trained first model, and the second data is obtained by processing the encoding vector and the at least part of the information in the first data through a trained second model;
The trained first model is obtained through training based on intermediate layer data of a preliminarily trained second model, or is obtained through splitting from the preliminarily trained second model; and the trained second model is obtained through secondary training based on output data of the trained first model.
2. The method of claim 1, wherein the acquiring of personal information related to the user based on the first data input by the user comprises:
screening personal information related to the question information contained in the first data from preset information input by the user, and/or extracting personal information related to the question information from the first data.
3. The method of claim 2, wherein the preset information is input through a user terminal and stored locally at the user terminal, and the screening of personal information related to the question information contained in the first data from the preset information input by the user comprises:
extracting the question information from the first data to obtain target question information; and screening personal information related to the target question information from the preset information.
4. The method of claim 2, wherein the preset information is input through a user terminal and sent to a service terminal for storage, and the screening of personal information related to the question information contained in the first data from the preset information input by the user comprises:
extracting the first data to obtain target question information; sending a data acquisition instruction related to the target question information to the service terminal; and receiving personal information related to the target question information issued by the service terminal based on the data acquisition instruction.
5. The method of claim 2, wherein the extracting of personal information related to the question information from the first data comprises:
detecting whether the personal information related to the question information contained in the first data, screened from the preset information input by the user, satisfies a reply condition corresponding to the question information;
if not, inquiring of the user, and extracting personal information related to the question information from first data input by the user in response to the inquiry.
6. The method of claim 1, wherein said determining privacy information from said personal information comprises:
displaying the personal information to the user, and determining privacy information from the personal information in response to an operation of the user; the operation of the user at least comprises selecting the privacy information to be protected.
7. The method of claim 6, wherein said determining privacy information from said personal information further comprises:
adding additional information which is not contained in the personal information in response to the operation of the user, and taking the additional information as the privacy information to be protected.
8. A method for protecting user privacy in a personalized interactive service, the method comprising:
receiving an encoding vector and at least part of the information in first data input by a user, wherein the encoding vector is obtained by encoding privacy information of the user; the first data includes question information;
processing the encoding vector and the at least part of the information in the first data through a trained second model to obtain second data; the second data comprises personalized reply content corresponding to the question information in the first data;
transmitting the second data to display the personalized reply content to the user;
The encoding vector is obtained by processing the privacy information through a trained first model; the trained first model is obtained through training based on intermediate layer data of a preliminarily trained second model, or is obtained through splitting from the preliminarily trained second model; and the trained second model is obtained through secondary training based on output data of the trained first model.
9. The method of claim 8, wherein prior to receiving the encoding vector and the at least part of the information in the first data input by the user, the method further comprises:
receiving a data acquisition instruction related to the question information;
and acquiring personal information related to the question information based on the data acquisition instruction, and transmitting the personal information related to the question information.
10. The method of claim 8, wherein prior to receiving the encoding vector and the at least part of the information in the first data input by the user, the method further comprises:
receiving target question information obtained by extracting the first data;
screening personal information related to the target question information from preset information input by the user based on the target question information;
detecting whether the personal information related to the target question information meets a reply condition corresponding to the target question information;
if so, sending the personal information related to the target question information; if not, sending an inquiry instruction to inquire about personal information regarding the target question information while sending the personal information related to the target question information.
11. A system for protecting user privacy in a personalized interactive service, the system comprising:
The personal information acquisition module is used for acquiring personal information related to the user based on first data input by the user; the first data includes question information;
A privacy information determining module for determining privacy information from the personal information;
the encoding module is used for encoding the privacy information to obtain an encoding vector corresponding to the privacy information;
A reply content acquisition module for acquiring second data generated based on the encoding vector and at least part of the information in the first data, the second data including personalized reply content corresponding to the question information in the first data;
The encoding vector is obtained by processing the privacy information through a trained first model, and the second data is obtained by processing the encoding vector and the at least part of the information in the first data through a trained second model;
The trained first model is obtained through training based on intermediate layer data of a preliminarily trained second model, or is obtained through splitting from the preliminarily trained second model; and the trained second model is obtained through secondary training based on output data of the trained first model.
12. A system for protecting user privacy in a personalized interactive service, the system comprising:
the receiving module is used for receiving an encoding vector and at least part of the information in first data input by a user, wherein the encoding vector is obtained by encoding privacy information of the user; the first data includes question information;
The processing module is used for processing the encoding vector and the at least part of the information in the first data through a trained second model to obtain second data; the second data comprises personalized reply content corresponding to the question information in the first data;
The sending module is used for sending the second data so as to display the personalized reply content to the user;
The encoding vector is obtained by processing the privacy information through a trained first model; the trained first model is obtained through training based on intermediate layer data of a preliminarily trained second model, or is obtained through splitting from the preliminarily trained second model; and the trained second model is obtained through secondary training based on output data of the trained first model.
13. A system for protecting user privacy in a personalized interactive service, comprising:
The personal information acquisition module is used for acquiring personal information related to the user based on first data input by the user; the first data includes question information;
A privacy information determining module for determining privacy information from the personal information;
the encoding module is used for encoding the privacy information to obtain an encoding vector corresponding to the privacy information;
The processing module is used for processing the encoding vector and at least part of the information in the first data through a trained second model to obtain second data, wherein the second data comprises personalized reply content corresponding to the question information in the first data;
the display module is used for displaying the personalized reply content included in the second data to the user;
The encoding vector is obtained by processing the privacy information through a trained first model; the trained first model is obtained through training based on intermediate layer data of a preliminarily trained second model, or is obtained through splitting from the preliminarily trained second model; and the trained second model is obtained through secondary training based on output data of the trained first model.
14. A computer readable storage medium storing computer instructions which, when executed by a processor, implement the method of any one of claims 1 to 10.
15. A computer program product comprising a computer program or instructions which, when executed by a processor, carries out the method of any one of claims 1 to 10.