CN112667881A - Method and apparatus for generating information


Info

Publication number: CN112667881A
Application number: CN201910982603.9A
Authority: CN (China)
Prior art keywords: information, user, behavior information, behavior, historical
Legal status: Pending
Other languages: Chinese (zh)
Inventors: 刘海, 陈洋
Current assignee: Individual
Original assignee: Individual
Priority and filing date: 2019-10-16 (CN201910982603.9A)
Publication date: 2021-04-16 (CN112667881A)
Family ID: 75400163

Classifications

  • Information Transfer Between Computers (AREA)

Abstract

Embodiments of the present disclosure disclose a method and apparatus for generating information. One embodiment of the method comprises: acquiring historical user information from at least one terminal device used by a user, the historical user information comprising basic user information, historical scene information, and historical behavior information corresponding to the historical scene information; establishing a behavior information generation model for the user according to the historical user information, the model being used to generate behavior information for the user according to the scene information of the scene the user is in; and generating behavior information based on the scene information of the scene the user is in and the behavior information generation model. This embodiment realizes the generation of user behavior information.

Description

Method and apparatus for generating information
Technical Field
Embodiments of the present disclosure relate to the field of computer technology, and in particular to a method and apparatus for generating information.
Background
The rapid development of internet technology has made people's lives more convenient. For example, the advent of web browsers, instant messaging tools, shopping applications, search applications, map applications, and the like has opened up a wide space of activity for people. People leave behind vast amounts of data as they act on the internet, and how to use these data effectively is a hot research topic in the industry. For example, a user profile may be constructed from the data a user leaves behind through internet activity, and further decisions may be made based on that profile. In addition, various kinds of information may be generated from such data.
Disclosure of Invention
The embodiment of the disclosure provides a method and a device for generating information.
In a first aspect, an embodiment of the present disclosure provides a method for generating information, the method including: acquiring historical user information from at least one terminal device used by a user, where the historical user information includes basic user information, historical scene information, and historical behavior information corresponding to the historical scene information; establishing a behavior information generation model for the user according to the historical user information, where the behavior information generation model is used to generate behavior information for the user according to the scene information of the scene the user is in; and generating behavior information based on the scene information of the scene the user is in and the behavior information generation model.
In some embodiments, the method further includes: sending preset specific scene information to the user, and receiving feedback behavior information input by the user for the specific scene information; acquiring the generated behavior information produced by the behavior information generation model for the specific scene information; and, in response to determining that the feedback behavior information does not match the generated behavior information, training the behavior information generation model based on the specific scene information and the feedback behavior information.
In some embodiments, the method further includes: receiving a viewing request sent by the user, where the viewing request is for viewing the behavior information generated by the behavior information generation model within a preset time period; and presenting to the user, according to the viewing request, the behavior information generated by the behavior information generation model within the preset time period.
In some embodiments, the method further includes: receiving modification information sent by the user for the presented behavior information, where the modification information is used to modify the presented behavior information; and training the behavior information generation model based on the modification information.
In some embodiments, generating behavior information based on the scene information of the scene the user is in and the behavior information generation model includes: receiving associated behavior information generated by at least one associated behavior information generation model, where the at least one associated behavior information generation model is established based on user information of at least one associated user having an association with the user; and generating behavior information based on the behavior information generation model, using the received associated behavior information as scene information.
In some embodiments, the behavior information generation model includes at least one sub-model, where each sub-model of the at least one sub-model is used to generate behavior information of a predetermined category; and establishing a behavior information generation model for the user according to the historical user information includes: training each sub-model of the at least one sub-model, based on a machine learning algorithm, using the basic user information, the historical scene information, and the historical behavior information corresponding to the historical scene information, to obtain a behavior information generation model for the user.
In some embodiments, establishing a behavior information generation model for the user according to the historical user information includes: receiving adjustment data input by the user for the historical user information, where the adjustment data is used to adjust the historical user information; and establishing a behavior information generation model for the user using the adjusted historical user information.
In some embodiments, the method further includes: acquiring behavior information generated by a plurality of models including the behavior information generation model; performing statistical analysis on the acquired behavior information; and displaying the result of the statistical analysis.
In a second aspect, an embodiment of the present disclosure provides an apparatus for generating information, the apparatus including: an acquisition unit configured to acquire historical user information from at least one terminal device used by a user, where the historical user information includes basic user information, historical scene information, and historical behavior information corresponding to the historical scene information; an establishing unit configured to establish a behavior information generation model for the user according to the historical user information, where the behavior information generation model is used to generate behavior information for the user according to the scene information of the scene the user is in; and a generating unit configured to generate behavior information based on the scene information of the scene the user is in and the behavior information generation model.
In some embodiments, the apparatus further includes: a receiving unit configured to send preset specific scene information to the user and receive feedback behavior information input by the user for the specific scene information; a feedback unit configured to acquire the generated behavior information produced by the behavior information generation model for the specific scene information; and a first training unit configured to train the behavior information generation model based on the specific scene information and the feedback behavior information in response to determining that the feedback behavior information does not match the generated behavior information.
In some embodiments, the apparatus further includes: a request receiving unit configured to receive a viewing request sent by the user, where the viewing request is for viewing the behavior information generated by the behavior information generation model within a preset time period; and a presentation unit configured to present to the user, according to the viewing request, the behavior information generated by the behavior information generation model within the preset time period.
In some embodiments, the apparatus further includes: an information receiving unit configured to receive modification information sent by the user for the presented behavior information, where the modification information is used to modify the presented behavior information; and a second training unit configured to train the behavior information generation model based on the modification information.
In some embodiments, the generating unit is further configured to: receive associated behavior information generated by at least one associated behavior information generation model, where the at least one associated behavior information generation model is established based on user information of at least one associated user having an association with the user; and generate behavior information based on the behavior information generation model, using the received associated behavior information as scene information.
In some embodiments, the behavior information generation model includes at least one sub-model, where each sub-model of the at least one sub-model is used to generate behavior information of a predetermined category; and the establishing unit is further configured to: train each sub-model of the at least one sub-model, based on a machine learning algorithm, using the basic user information, the historical scene information, and the historical behavior information corresponding to the historical scene information, to obtain a behavior information generation model for the user.
In some embodiments, the establishing unit is further configured to: receive adjustment data input by the user for the historical user information, where the adjustment data is used to adjust the historical user information; and establish a behavior information generation model for the user using the adjusted historical user information.
In some embodiments, the apparatus further includes: an information acquisition unit configured to acquire behavior information generated by a plurality of models including the behavior information generation model; and a statistical unit configured to perform statistical analysis on the acquired behavior information and display the result of the statistical analysis.
In a third aspect, an embodiment of the present disclosure provides a server, including: one or more processors; a storage device, on which one or more programs are stored, which, when executed by the one or more processors, cause the one or more processors to implement the method as described in any implementation manner of the first aspect.
In a fourth aspect, the disclosed embodiments provide a computer-readable medium on which a computer program is stored, wherein the computer program, when executed by a processor, implements the method as described in any implementation manner of the first aspect.
According to the method and apparatus for generating information provided by the embodiments of the present disclosure, historical user information is obtained from at least one terminal device used by a user, a behavior information generation model for the user is established according to the historical user information, and behavior information for the user is then generated based on the scene information of the scene the user is in and the behavior information generation model, thereby realizing the generation of user behavior information.
Drawings
Other features, objects and advantages of the disclosure will become more apparent upon reading of the following detailed description of non-limiting embodiments thereof, made with reference to the accompanying drawings in which:
FIG. 1 is an exemplary system architecture diagram in which one embodiment of the present disclosure may be applied;
FIG. 2 is a flow diagram for one embodiment of a method for generating information, according to the present disclosure;
FIG. 3 is a schematic diagram of one application scenario of a method for generating information according to the present disclosure;
FIG. 4 is a schematic block diagram illustrating one embodiment of an apparatus for generating information according to the present disclosure;
FIG. 5 is a schematic block diagram of a computer system suitable for use in implementing a server according to an embodiment of the present disclosure.
Detailed Description
The present disclosure is described in further detail below with reference to the accompanying drawings and embodiments. It is to be understood that the specific embodiments described herein are merely illustrative of the relevant invention and are not restrictive of it. It should also be noted that, for convenience of description, only the portions related to the relevant invention are shown in the drawings.
It should be noted that, in the present disclosure, the embodiments and features of the embodiments may be combined with each other without conflict. The present disclosure will be described in detail below with reference to the accompanying drawings in conjunction with embodiments.
Fig. 1 illustrates an exemplary system architecture 100 of a method for generating information or an apparatus for generating information to which embodiments of the present disclosure may be applied.
As shown in fig. 1, the system architecture 100 may include terminal devices 101, 102, 103, a network 104, and a server 105. The network 104 serves as a medium for providing communication links between the terminal devices 101, 102, 103 and the server 105. Network 104 may include various connection types, such as wired, wireless communication links, or fiber optic cables, to name a few.
The user may use the terminal devices 101, 102, 103 to interact with the server 105 via the network 104 to receive or send messages or the like. The terminal devices 101, 102, 103 may have various communication client applications installed thereon, such as a web browser application, a shopping application, a search application, an instant messaging tool, a mailbox client, social platform software, and the like.
The terminal devices 101, 102, and 103 may be hardware or software. When they are hardware, they may be various electronic devices having a display screen and supporting information interaction, including but not limited to smartphones, tablet computers, e-book readers, MP3 players (Moving Picture Experts Group Audio Layer III), MP4 players (Moving Picture Experts Group Audio Layer IV), laptop computers, desktop computers, and the like. When the terminal devices 101, 102, and 103 are software, they may be installed in the electronic devices listed above, and may be implemented as multiple pieces of software or software modules (e.g., to provide distributed services) or as a single piece of software or software module. No specific limitation is imposed here.
The server 105 may be a server that provides various services, such as a background server that processes user information generated by the terminal devices 101, 102, 103. The backend server may analyze and perform other processing on the received data such as the user information, and feed back a processing result (e.g., behavior information) to the terminal devices 101, 102, and 103.
The server 105 may be hardware or software. When the server 105 is hardware, it may be implemented as a distributed cluster of multiple servers or as a single server. When the server 105 is software, it may be implemented as multiple pieces of software or software modules (e.g., to provide distributed services) or as a single piece of software or software module. No specific limitation is imposed here.
It should be understood that the number of terminal devices, networks, and servers in fig. 1 is merely illustrative. There may be any number of terminal devices, networks, and servers, as desired for implementation.
It should be noted that the method for generating information provided by the embodiment of the present disclosure is generally performed by the server 105, and accordingly, the apparatus for generating information is generally disposed in the server 105.
With continued reference to FIG. 2, a flow 200 of one embodiment of a method for generating information in accordance with the present disclosure is shown. The method for generating information comprises the following steps:
step 201, obtaining historical user information from at least one terminal device used by a user.
In the present embodiment, the execution body of the method for generating information (e.g., the server 105 shown in fig. 1) may acquire the historical user information from at least one terminal device used by the user through a wired or wireless connection. Here, the historical user information may include basic user information, historical scene information, and historical behavior information corresponding to the historical scene information. The basic user information may refer to basic attribute information of the user and may include, but is not limited to, age, gender, occupation, residence, native place, educational background, marital status, income, hobbies, height, weight, and the like. Historical scene information may be used to describe historical scenes, which may include, but are not limited to, time, place, weather, related people, and so on. The historical behavior information corresponding to the historical scene information may be information on the behavior the user produced in the historical scene.
Generally, a user produces a large amount of behavior information while using terminal devices. For example, a user may produce shopping-related behavior information while shopping with a smartphone or tablet, and may produce a large amount of behavior information while using smart wearable devices (e.g., smart bands, smart watches, smart glasses); for instance, exercise information can be generated through a smart band. In practice, the execution body may acquire the historical user information from at least one terminal device used by the user in various ways. For example, after the user grants authorization, the historical behavior information generated while the user uses an application may be acquired by accessing the application's API (Application Programming Interface). As another example, for some social software, the execution body may obtain the information the user publishes through the social software by adding the user as a friend, thereby obtaining user behavior information; here, the user behavior information may include the time, place, and content of the published information. In some scenarios, the user may also send his or her own information directly to the execution body through a terminal device as historical user information. For example, the user may send behavior information about clothing, food, housing, transportation, entertainment, and so on to the execution body: when the user eats eggs, milk, and bread for breakfast on a certain day, the user may send the time of breakfast, the place of breakfast, and the kinds of food eaten to the execution body through the terminal device in use (e.g., a mobile phone).
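As a concrete illustration of the data this step works with, the following Python sketch shows one possible in-memory representation of historical user information. The class and field names (UserProfile, HistoricalRecord, and so on) are illustrative assumptions, not structures defined by this disclosure.

```python
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class UserProfile:
    """Basic user information: a subset of the attributes listed above."""
    age: int
    gender: str
    occupation: str
    residence: str

@dataclass
class HistoricalRecord:
    """One historical scene and the behavior the user produced in it."""
    scene: Dict[str, str]     # e.g. {"time": "08:00", "place": "home", "weather": "sunny"}
    behavior: Dict[str, str]  # e.g. {"category": "dining", "content": "eggs, milk, bread"}

@dataclass
class HistoricalUserInfo:
    """Historical user information acquired from the user's terminal devices."""
    profile: UserProfile
    records: List[HistoricalRecord] = field(default_factory=list)
```

A breakfast report like the one above would then become one HistoricalRecord whose scene holds the time and place and whose behavior holds the foods eaten.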
Step 202, establishing a behavior information generation model aiming at the user according to the historical user information.
In this embodiment, the execution body may establish a behavior information generation model for the user according to the historical user information acquired in step 201. The behavior information generation model may be used to generate behavior information for the user according to the scene information of the scene the user is in; that is, it may be used to characterize the correspondence between scene information and behavior information. The model may be obtained in various ways, for example, by training based on a machine learning algorithm. Here, the behavior information generation model may include an information input side for inputting scene information and an information output side for outputting behavior information.
In some optional implementations of the present embodiment, the behavior information generation model may include at least one sub-model, where each sub-model of the at least one sub-model may be used to generate behavior information of a predetermined category. In practice, the user's behaviors may be divided into a number of predetermined categories, such as learning, work, entertainment, sports, communication, dining, and so on, and a corresponding sub-model may be established for each predetermined category. Step 202 may then be specifically performed as follows:
training each sub-model of the at least one sub-model, based on a machine learning algorithm, using the basic user information, the historical scene information, and the historical behavior information corresponding to the historical scene information, to obtain a behavior information generation model for the user.
In this implementation, the execution body may train each sub-model of the at least one sub-model using the basic user information, the historical scene information, and the historical behavior information corresponding to the historical scene information, based on a machine learning algorithm, thereby obtaining a behavior information generation model for the user.
In practice, the content included in the scene information used may differ between sub-models, and the machine learning algorithms used to train different sub-models may be the same or different. For a given sub-model, for example, the content included in the scene information used for its training may be determined according to the predetermined category of behavior whose information the sub-model generates.
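The per-category training described here can be sketched as follows, assuming scene information and basic user information have been flattened into feature dictionaries and behaviors into discrete labels. DictVectorizer plus DecisionTreeClassifier merely stand in for whichever machine learning algorithm an implementation actually chooses; nothing in the disclosure mandates them.

```python
from sklearn.feature_extraction import DictVectorizer
from sklearn.pipeline import make_pipeline
from sklearn.tree import DecisionTreeClassifier

def train_submodels(profile, records):
    """Train one sub-model per predetermined behavior category.

    profile: flat dict of basic user information (strings/numbers).
    records: iterable of (scene_dict, behavior_dict) pairs, where each
    behavior_dict carries a "category" key and a "content" label.
    """
    by_category = {}
    for scene, behavior in records:
        by_category.setdefault(behavior["category"], []).append((scene, behavior))

    submodels = {}
    for category, samples in by_category.items():
        # Fold the basic user information into every feature vector, so that
        # each sub-model conditions on both the user and the scene.
        X = [{**profile, **scene} for scene, _ in samples]
        y = [behavior["content"] for _, behavior in samples]
        model = make_pipeline(DictVectorizer(sparse=False), DecisionTreeClassifier())
        model.fit(X, y)
        submodels[category] = model
    return submodels
```

Note how the category split falls out of the data itself: each sub-model only ever sees the samples of its own predetermined category.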
In some optional implementations of this embodiment, the step 202 may specifically be performed as follows:
First, adjustment data input by the user for the historical user information is received.
In this implementation, the execution body may receive adjustment data input by the user for the historical user information, where the adjustment data may be used to adjust the historical user information. The user may adjust all or part of the historical user information as needed. For example, assume the historical user information includes the entry "failed the postgraduate entrance examination in January 2012", and the user wants to know how things would have gone had the examination been passed; the user can then adjust the entry to "passed the postgraduate entrance examination in January 2012 and was admitted to XX university, XX major".
Then, a behavior information generation model for the user is established using the adjusted historical user information.
In this implementation, the execution body may use the adjusted historical user information to build a behavior information generation model for the user. By inputting adjustment data, the user can adjust the historical user information used to establish the model, so that the resulting behavior information generation model better meets the user's needs. A sketch of applying such adjustments follows.
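A minimal sketch of applying adjustment data before rebuilding the model; the index-based addressing of records is an assumption, since the disclosure does not fix a format for adjustment data.

```python
def apply_adjustments(records, adjustments):
    """Return a copy of the historical records with the user's adjustments applied.

    adjustments: mapping from record index to the replacement (scene, behavior)
    pair, e.g. {3: ({"time": "2012-01"}, {"category": "learning",
                "content": "passed the entrance exam, admitted to XX university"})}.
    """
    adjusted = list(records)
    for index, replacement in adjustments.items():
        adjusted[index] = replacement
    return adjusted

# The model is then rebuilt from the adjusted history, not the original one:
# submodels = train_submodels(profile, apply_adjustments(records, adjustments))
```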
Step 203, generating a model based on the scene information and the behavior information of the scene where the user is located, and generating the behavior information.
In this embodiment, the execution body may generate behavior information for the user based on the behavior information generation model and the scene information of the scene the user is in at the current moment. As an example, the execution body may input the scene information of the current scene at the information input side of the behavior information generation model and obtain behavior information at its information output side. In some application scenarios, the execution body may generate behavior information from the scene information of the current scene (e.g., time, place, weather) even if the user is incapacitated (e.g., deceased or comatose). In such a scenario, part of the scene information may be determined from the user's historical information; for example, the place may be the user's last located position, or the place where the user previously stayed the longest (e.g., the residence).
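Continuing the training sketch above, generation reduces to feeding the current scene's features into the trained sub-model. fill_missing_scene_fields is a hypothetical helper for the case just described, in which part of the scene information must be completed from the user's history.

```python
def generate_behavior(submodels, profile, scene, category):
    """Generate behavior information of one category for the current scene."""
    features = {**profile, **scene}  # input side: scene info plus basic info
    return submodels[category].predict([features])[0]  # output side: behavior

def fill_missing_scene_fields(scene, records):
    """Complete missing scene fields from history, e.g. the last known place."""
    completed = dict(scene)
    if "place" not in completed and records:
        completed["place"] = records[-1][0].get("place")
    return completed
```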
In some optional implementations of this embodiment, step 203 may specifically include the following:
First, associated behavior information generated by at least one associated behavior information generation model is received.
In this implementation, the execution body may receive the associated behavior information generated by at least one associated behavior information generation model, where each such model is established based on the user information of an associated user having an association with the user. Here, an associated user may refer to a user associated with the user in the real world; for example, an associated user may be the user's parent, spouse, child, sibling, colleague, friend, or neighbor. In practice, one behavior information generation model may be established for each associated user, based on that associated user's historical user information, to serve as an associated behavior information generation model; in turn, associated behavior information generation models may be established for each of those models as well. In this way a large number of behavior information generation models can be obtained, which together can form a virtual community in which each model corresponds to a person in the real world. The models in the virtual community can receive information from and send information to one another, just as two people communicate in the real world.
Then, behavior information is generated based on the behavior information generation model, using the received associated behavior information as scene information.
In this implementation, the execution body may generate behavior information based on the behavior information generation model, using the received associated behavior information as scene information. It should be understood that the scene information here may include other information, such as time, place, and weather, in addition to the associated behavior information. A sketch of merging the two follows.
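A sketch of folding associated behavior information into the scene information, as this implementation describes. The assoc_ key prefix is only an illustrative convention for keeping associate-derived features apart from time/place/weather fields.

```python
def scene_with_associates(scene, associated_behaviors):
    """Merge behavior information received from associated users' models
    into the scene information used for generation.

    associated_behaviors: mapping from associate role to that associate's
    generated behavior, e.g. {"spouse": "ordered hotpot"}.
    """
    merged = dict(scene)
    for associate, behavior in associated_behaviors.items():
        merged[f"assoc_{associate}"] = behavior
    return merged
```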
In some optional implementations of this embodiment, the method for generating information may further include the following steps not shown in fig. 2:
Step S1, sending preset specific scene information to the user, and receiving feedback behavior information input by the user for the specific scene information.
In this implementation, the execution body may send preset specific scene information to the terminal device used by the user. The specific scene information may be set according to actual needs; for example, it may describe a dining scene and include the time, the place, the selectable dishes, the dining companions, and the like. After receiving the specific scene information, the user may input feedback behavior information for it, for example, which dishes were ordered and whether an order was placed.
Step S2, acquiring the generated behavior information produced by the behavior information generation model for the specific scene information.
In this implementation, the execution body may further input the specific scene information at the information input side of the behavior information generation model, thereby acquiring the generated behavior information the model produces for the specific scene information. Since this is behavior information produced by the model, it may differ from the feedback behavior information.
Step S3, in response to determining that the feedback behavior information does not match the generated behavior information, training the behavior information generation model based on the specific scene information and the feedback behavior information.
In this implementation, the execution body may match the feedback behavior information received in step S1 against the generated behavior information acquired in step S2 to determine whether the two match. As an example, matching may mean that the feedback behavior information is the same as the generated behavior information. If the two match, the generated behavior information produced by the model for the specific scene information is correct; if they do not match, it is incorrect, and the behavior information generation model then needs to be trained based on the specific scene information and the feedback behavior information. As an example, the execution body may use the specific scene information and the feedback behavior information as sample data to train the model. Through this implementation, the execution body can acquire sample data for training the behavior information generation model, making the behavior information it generates more accurate. The loop is sketched below.
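One round of steps S1-S3 can be sketched as below, with exact string equality standing in for the matching criterion (the disclosure allows "match" to mean "the same"). ask_user and retrain are placeholders for the messaging channel to the terminal device and for whatever retraining procedure an implementation uses.

```python
def feedback_round(model, specific_scene, ask_user, retrain):
    """One probe of the user: compare feedback with the model, retrain on mismatch."""
    feedback = ask_user(specific_scene)             # S1: the user's actual behavior
    generated = model.predict([specific_scene])[0]  # S2: the model's prediction
    if feedback != generated:                       # S3: mismatch -> new sample
        retrain(model, specific_scene, feedback)
    return feedback, generated
```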
In some optional implementations of this embodiment, the method for generating information may further include the following steps not shown in fig. 2:
Step one, receiving a viewing request sent by the user.
In this implementation, the execution body may receive a viewing request sent by the user through the terminal device in use. The viewing request may be used to view the behavior information generated by the behavior information generation model within a preset time period. In practice, the behavior information generation model can generate behavior information in real time according to the scene information of the scene the user is in, so the user can send a viewing request to the execution body at any time. As an example, the preset time period may be a time period selected by the user; for example, a user who wants to view the behavior information the model generated on the previous day may select the previous day as the preset time period.
Step two, presenting to the user, according to the viewing request, the behavior information generated by the behavior information generation model within the preset time period.
In this implementation, the execution body may, according to the viewing request, send the behavior information generated by the behavior information generation model within the preset time period to the terminal device used by the user, so that the terminal device presents it to the user.
In some optional implementations, the method for generating information may further include the following steps not shown in fig. 2:
and step three, receiving modification information sent by the user aiming at the presented behavior information.
In this implementation manner, after the behavior information generated by the behavior information generation model in the preset time period is presented to the user in the step, the user can check the presented behavior information, and if the user considers that the presented behavior information does not conform to the real behavior of the user, the user can send modification information to the execution main body through the terminal device. The modification information may be used to modify the presented behavior information. In this way, the execution subject may receive modification information sent by the user for the presented behavior information. The modification information may be, for example, text information, voice information, picture information, or the like. When the modification information is voice information or picture information, the execution main body can recognize the voice information or the picture information, so that a text corresponding to the voice information or the picture information is obtained.
And step four, training the behavior information generation model based on the modification information.
In this implementation, the execution subject may train the behavior information generation model based on the modification information received in step three. Specifically, the execution subject may use scene information and modification information corresponding to the presented behavior information as sample data to train the behavior information generation model. Through the implementation mode, the execution main body can update the behavior information generation model by using the modification information sent by the user, so that the behavior information generated by the behavior information generation model is more accurate.
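Steps three and four amount to turning each (scene, modified behavior) pair into a fresh training sample; a minimal sketch, assuming the modification has already been recognized into text:

```python
def apply_modification(training_samples, presented_scene, modification):
    """Record the user's correction as a training sample for the next retraining.

    presented_scene: the scene information that produced the presented behavior.
    modification: the corrected behavior text supplied by the user.
    """
    training_samples.append((presented_scene, modification))
    return training_samples
```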
In some optional implementations of this embodiment, the method for generating information may further include the following not shown in fig. 2:
First, behavior information generated by a plurality of models including the behavior information generation model is acquired.
In this implementation, the execution body may acquire behavior information generated by a plurality of models including the behavior information generation model described above. As an example, the plurality of models may be established based on the historical user information of a plurality of people in the real world, who may or may not be related to one another; such a plurality of people may be regarded as a group. For instance, all people on earth could be taken as a group, and the behavior information generated by each person's corresponding model could be acquired. The models may also exchange information with one another; for example, behavior information generated by one model may be sent to another model as that model's scene information. The virtual space formed among the models can be regarded as a space parallel to the real world.
Then, statistical analysis is performed on the acquired behavior information, and the result is displayed.
In this implementation, statistical analysis may be performed on the acquired behavior information and the result displayed. The statistical analysis may take various forms, for example, an analysis of a given behavior, such as counting the number of models that generated the behavior or the time distribution of its generation. A counting sketch follows.
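A sketch of the kind of statistics described: how many models generated a given behavior, and how those generations distribute over time. collections.Counter does the aggregation; the (model_id, behavior, hour) tuple format is an assumption.

```python
from collections import Counter

def analyze(behavior_events):
    """behavior_events: iterable of (model_id, behavior, hour) tuples
    collected from the many models in the virtual community."""
    models_per_behavior = {}
    time_distribution = Counter()
    for model_id, behavior, hour in behavior_events:
        models_per_behavior.setdefault(behavior, set()).add(model_id)
        time_distribution[hour] += 1
    # Number of distinct models that generated each behavior, and when.
    return ({b: len(ids) for b, ids in models_per_behavior.items()},
            dict(time_distribution))
```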
With continued reference to fig. 3, fig. 3 is a schematic diagram of an application scenario of the method for generating information according to the present embodiment. In the application scenario of fig. 3, the server 301 may obtain historical user information from at least one terminal device 302 used by a user, where the historical user information may include basic user information, historical scene information, and historical behavior information corresponding to the historical scene information. The server 301 may then establish a behavior information generation model for the user according to the historical user information, the model being used to generate behavior information for the user according to the scene information of the scene the user is in. Finally, the server 301 may generate behavior information based on the scene information of the scene the user is in and the behavior information generation model.
The method provided by the embodiments of the present disclosure establishes a behavior information generation model based on the user's historical user information and generates behavior information for the user based on the scene information of the scene the user is in and that model, thereby realizing the generation of user behavior information.
With further reference to fig. 4, as an implementation of the methods shown in the above figures, the present disclosure provides an embodiment of an apparatus for generating information, which corresponds to the method embodiment shown in fig. 2, and which is particularly applicable in various electronic devices.
As shown in fig. 4, the apparatus 400 for generating information of the present embodiment includes: an acquisition unit 401, a creation unit 402 and a generation unit 403. The obtaining unit 401 is configured to obtain historical user information from at least one terminal device used by a user, where the historical user information includes user basic information, historical scene information, and historical behavior information corresponding to the historical scene information; the establishing unit 402 is configured to establish a behavior information generation model for the user according to the historical user information, wherein the behavior information generation model is used for generating behavior information of the user according to scene information of a scene where the user is located; the generating unit 403 is configured to generate behavior information based on scene information of a scene in which the user is located and the behavior information generation model.
In this embodiment, for the specific processing of the obtaining unit 401, the establishing unit 402, and the generating unit 403 of the apparatus 400 for generating information, and the technical effects they bring, reference may be made to the descriptions of step 201, step 202, and step 203 in the embodiment corresponding to fig. 2; details are not repeated here.
In some optional implementations of this embodiment, the apparatus 400 further includes: a receiving unit (not shown in the figures) configured to send preset specific scene information to the user and receive feedback behavior information input by the user for the specific scene information; a feedback unit (not shown in the figures) configured to acquire the generated behavior information produced by the behavior information generation model for the specific scene information; and a first training unit (not shown in the figures) configured to train the behavior information generation model based on the specific scene information and the feedback behavior information in response to determining that the feedback behavior information does not match the generated behavior information.
In some optional implementations of this embodiment, the apparatus 400 further includes: a request receiving unit (not shown in the figures) configured to receive a viewing request sent by the user, where the viewing request is for viewing the behavior information generated by the behavior information generation model within a preset time period; and a presentation unit (not shown in the figures) configured to present to the user, according to the viewing request, the behavior information generated by the behavior information generation model within the preset time period.
In some optional implementations of this embodiment, the apparatus 400 further includes: an information receiving unit (not shown in the figures) configured to receive modification information sent by the user for the presented behavior information, where the modification information is used to modify the presented behavior information; and a second training unit (not shown in the figures) configured to train the behavior information generation model based on the modification information.
In some optional implementations of the present embodiment, the generating unit 403 is further configured to: receive associated behavior information generated by at least one associated behavior information generation model, where the at least one associated behavior information generation model is established based on user information of at least one associated user having an association with the user; and generate behavior information based on the behavior information generation model, using the received associated behavior information as scene information.
In some optional implementations of this embodiment, the behavior information generation model includes at least one sub-model, where each sub-model of the at least one sub-model is used to generate behavior information of a predetermined category; and the establishing unit 402 is further configured to: train each sub-model of the at least one sub-model, based on a machine learning algorithm, using the basic user information, the historical scene information, and the historical behavior information corresponding to the historical scene information, to obtain a behavior information generation model for the user.
In some optional implementations of this embodiment, the establishing unit 402 is further configured to: receive adjustment data input by the user for the historical user information, where the adjustment data is used to adjust the historical user information; and establish a behavior information generation model for the user using the adjusted historical user information.
In some optional implementations of this embodiment, the apparatus 400 further includes: an information acquisition unit (not shown in the figures) configured to acquire behavior information generated by a plurality of models including the behavior information generation model; and a statistical unit (not shown in the figures) configured to perform statistical analysis on the acquired behavior information and display the result of the statistical analysis.
Referring now to FIG. 5, shown is a block diagram of a computer system 500 suitable for use in implementing a server of an embodiment of the present disclosure. The server shown in fig. 5 is only an example, and should not bring any limitation to the functions and the scope of use of the embodiments of the present disclosure.
As shown in fig. 5, the computer system 500 includes a Central Processing Unit (CPU)501 that can perform various appropriate actions and processes according to a program stored in a Read Only Memory (ROM)502 or a program loaded from a storage section 508 into a Random Access Memory (RAM) 503. In the RAM 503, various programs and data necessary for the operation of the system 500 are also stored. The CPU 501, ROM 502, and RAM 503 are connected to each other via a bus 504. An input/output (I/O) interface 505 is also connected to bus 504.
The following components are connected to the I/O interface 505: an input section 506 including a keyboard, a mouse, and the like; an output section 507 including a display such as a cathode ray tube (CRT) or a liquid crystal display (LCD), and a speaker; a storage section 508 including a hard disk and the like; and a communication section 509 including a network interface card such as a LAN card or a modem. The communication section 509 performs communication processing via a network such as the internet. A drive 510 is also connected to the I/O interface 505 as necessary. A removable medium 511, such as a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory, is mounted on the drive 510 as necessary, so that a computer program read from it can be installed into the storage section 508 as needed.
In particular, according to an embodiment of the present disclosure, the processes described above with reference to the flowcharts may be implemented as computer software programs. For example, the disclosed embodiments include a computer program product comprising a computer program embodied on a computer readable medium, the computer program comprising program code for performing the method illustrated in the flow chart. In such an embodiment, the computer program may be downloaded and installed from a network through the communication section 509, and/or installed from the removable medium 511. The above-described functions defined in the method of the present disclosure are performed when the computer program is executed by a Central Processing Unit (CPU) 501.
It should be noted that the computer readable medium in the present disclosure may be a computer readable signal medium or a computer readable storage medium or any combination of the two. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the computer readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the present disclosure, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In contrast, in the present disclosure, a computer-readable signal medium may include a propagated data signal with computer-readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: wireless, wire, fiber optic cable, RF, etc., or any suitable combination of the foregoing.
Computer program code for carrying out operations of the present disclosure may be written in one or more programming languages or a combination thereof, including object-oriented programming languages such as Java, Smalltalk, and C++, as well as conventional procedural programming languages such as the "C" language or similar languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on a remote computer or server. In the case involving a remote computer, the remote computer may be connected to the user's computer through any kind of network, including a local area network (LAN) or a wide area network (WAN), or may be connected to an external computer (for example, through the internet using an internet service provider).
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units described in the embodiments of the present disclosure may be implemented by software or hardware. The described units may also be provided in a processor, and may be described as: a processor includes an acquisition unit, a creation unit, and a generation unit. The names of these units do not in some cases constitute a limitation on the unit itself, and for example, the acquisition unit may also be described as a "unit that acquires historical user information from at least one terminal device used by the user".
As another aspect, the present disclosure also provides a computer-readable medium, which may be included in the apparatus described in the above embodiments, or may exist separately without being assembled into the apparatus. The computer-readable medium carries one or more programs which, when executed by the apparatus, cause the apparatus to: acquire historical user information from at least one terminal device used by a user, wherein the historical user information comprises basic user information, historical scene information, and historical behavior information corresponding to the historical scene information; establish a behavior information generation model for the user according to the historical user information, wherein the behavior information generation model is used for generating behavior information for the user according to the scene information of the scene where the user is located; and generate behavior information based on the scene information of the scene where the user is located and the behavior information generation model.
The foregoing description is only of the preferred embodiments of the present disclosure and an illustration of the technical principles employed. Those skilled in the art should understand that the scope of the invention involved in the present disclosure is not limited to technical solutions formed by the specific combination of the above technical features, and also covers other technical solutions formed by any combination of the above technical features or their equivalents without departing from the inventive concept, for example, technical solutions formed by substituting the above features with (but not limited to) technical features with similar functions disclosed in the present disclosure.

Claims (10)

1. A method for generating information, comprising:
acquiring historical user information from at least one terminal device used by a user, wherein the historical user information comprises user basic information, historical scene information and historical behavior information corresponding to the historical scene information;
establishing a behavior information generation model for the user according to the historical user information, wherein the behavior information generation model is used for generating behavior information for the user according to the scene information of the scene where the user is located;
and generating behavior information based on the scene information of the scene where the user is located and the behavior information generation model.
2. The method of claim 1, wherein the method further comprises:
sending preset specific scene information to the user, and receiving feedback behavior information input by the user for the specific scene information;
acquiring the generated behavior information generated by the behavior information generation model for the specific scene information;
in response to determining that the feedback behavior information does not match the generated behavior information, training the behavior information generation model based on the specific scene information and the feedback behavior information.
3. The method of claim 1, wherein the method further comprises:
receiving a viewing request sent by the user, wherein the viewing request is used for viewing the behavior information generated by the behavior information generation model within a preset time period;
and presenting to the user, according to the viewing request, the behavior information generated by the behavior information generation model within the preset time period.
4. The method of claim 3, wherein the method further comprises:
receiving modification information sent by the user for the presented behavior information, wherein the modification information is used for modifying the presented behavior information;
and training the behavior information generation model based on the modification information.
5. The method of claim 1, wherein generating behavior information based on context information of a context in which the user is located and the behavior information generation model comprises:
receiving associated behavior information generated by at least one associated behavior information generation model, wherein the at least one associated behavior information generation model is established based on user information of at least one associated user having an association relationship with the user;
and generating behavior information based on the behavior information generation model by taking the received associated behavior information as scene information.
6. The method of claim 1, wherein the behavior information generation model comprises at least one submodel, wherein a submodel of the at least one submodel is used to generate a predetermined category of behavior information; and
the establishing of the behavior information generation model for the user according to the historical user information comprises:
and training the sub-model in the at least one sub-model by using the basic information of the user, the historical scene information and the historical behavior information corresponding to the historical scene information based on a machine learning algorithm to obtain a behavior information generation model for the user.
7. The method of claim 1, wherein the establishing a behavior information generation model for the user according to the historical user information comprises:
receiving adjustment data input by the user for the historical user information, wherein the adjustment data is used for adjusting the historical user information;
and establishing a behavior information generation model for the user by using the adjusted historical user information.
8. The method of claim 1, wherein the method further comprises:
acquiring behavior information generated by a plurality of models including the behavior information generation model;
and performing statistical analysis on the acquired behavior information, and displaying a statistical analysis result.
9. An apparatus for generating information, comprising:
an acquisition unit configured to acquire historical user information from at least one terminal device used by a user, wherein the historical user information includes user basic information, historical scene information, and historical behavior information corresponding to the historical scene information;
an establishing unit configured to establish a behavior information generation model for the user according to the historical user information, wherein the behavior information generation model is used for generating behavior information for the user according to the scene information of the scene where the user is located;
and the generating unit is configured to generate behavior information based on scene information of a scene where the user is located and the behavior information generating model.
10. The apparatus of claim 9, wherein the apparatus further comprises:
a receiving unit configured to send preset specific scene information to the user and receive feedback behavior information input by the user for the specific scene information;
a feedback unit configured to acquire generated behavior information generated by the behavior information generation model for the specific scene information;
a first training unit configured to train the behavior information generation model based on the specific scene information and the feedback behavior information in response to determining that the feedback behavior information does not match the generated behavior information.


Legal Events

  • PB01: Publication
  • SE01: Entry into force of request for substantive examination