CN110544108B - Social user classification method and device, electronic equipment and medium - Google Patents


Info

Publication number
CN110544108B
Authority
CN
China
Prior art keywords
user
attention
information
group
interest
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201910315093.XA
Other languages
Chinese (zh)
Other versions
CN110544108A (en)
Inventor
刘伟
陈训逊
郑礼雄
黄亮
张良
党向磊
李明哲
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
National Computer Network and Information Security Management Center
Original Assignee
National Computer Network and Information Security Management Center
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by National Computer Network and Information Security Management Center filed Critical National Computer Network and Information Security Management Center
Priority to CN201910315093.XA priority Critical patent/CN110544108B/en
Publication of CN110544108A publication Critical patent/CN110544108A/en
Application granted granted Critical
Publication of CN110544108B publication Critical patent/CN110544108B/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q30/00Commerce
    • G06Q30/02Marketing; Price estimation or determination; Fundraising
    • G06Q30/0201Market modelling; Market analysis; Collecting market data


Abstract

The application discloses a social user classification method and apparatus, an electronic device, and a medium. After first attention object group information of a first user and second attention object group information of a second user are obtained, a corresponding relationship between the first user and the second user may be established based on a matching relationship between the two groups of information. With this technical scheme, correspondences between users can be established from the users' pre-collected attention objects, so that when data information about an attention object is subsequently acquired, it can be pushed in a targeted manner to the users that share a corresponding relationship.

Description

Social user classification method and device, electronic equipment and medium
Technical Field
The present application relates to data processing technologies, and in particular, to a method and an apparatus for classifying social users, an electronic device, and a medium.
Background
With the rise of the communications era, the internet has developed continuously and is used by more and more people.
With the rapid development of the internet, it has become commonplace for people to acquire all kinds of data information through microblogs. While using a microblog, a user can follow objects of interest and obtain corresponding information from them in real time. For example, by following an education-class object, a user can obtain education-related information from that object; likewise, by following a food-class object, a user can obtain information about food culture.
However, since each user is interested in different information, how to provide each user with the data information he or she needs when pushing data information has become a problem to be solved by those skilled in the art.
Disclosure of Invention
The embodiment of the invention provides a social user classification method and device, electronic equipment and a medium.
According to an aspect of the embodiments of the present application, a method for classifying social users is provided, including:
acquiring attention object information of each user in a first user group, wherein an attention object is an object followed by a user, and the attention object information is used to identify the identity information of the attention object;
when it is detected that identical attention object information exists among the pieces of attention object information, screening the identical attention objects to generate a first object group;
generating a second user group based on the first object group, wherein the second user group is the set of users corresponding to the attention objects in the first object group;
and establishing corresponding relationships among the users in the second user group, and adding each corresponding relationship to a user data table.
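The four steps above can be sketched in Python. The data layout (a dict from user id to the set of objects that user follows) and all function and variable names are illustrative assumptions, not part of the patent text:

```python
from collections import defaultdict
from itertools import combinations

def classify_users(user_follows):
    """user_follows: dict mapping user id -> set of followed attention-object ids."""
    # Steps 1-2: invert the mapping and keep objects followed by more than
    # one user -- these shared objects form the "first object group".
    followers = defaultdict(set)
    for user, objects in user_follows.items():
        for obj in objects:
            followers[obj].add(user)
    first_object_group = {obj for obj, users in followers.items() if len(users) > 1}

    # Step 3: the second user group is the set of users who follow any
    # attention object in the first object group.
    second_user_group = set()
    for obj in first_object_group:
        second_user_group |= followers[obj]

    # Step 4: record pairwise correspondences between those users; the list
    # of pairs stands in for the user data table.
    user_data_table = [tuple(pair) for pair in combinations(sorted(second_user_group), 2)]
    return first_object_group, second_user_group, user_data_table
```

For instance, if users A and B both follow object X while C follows only Q, the sketch groups A and B together and records the (A, B) correspondence.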
Optionally, according to another aspect of the embodiments of the present application, the attention object information further includes identity parameter information and a category label;
the screening of identical attention objects among the pieces of attention object information to generate a second user group includes:
screening attention objects with identical identity parameter information among the pieces of attention object information to generate the second user group; or, alternatively,
screening attention objects with identical identity parameter information and identical category labels among the pieces of attention object information to generate the second user group.
Optionally, according to another aspect of the embodiments of the present application, the screening of attention objects with identical identity parameter information and category labels among the pieces of attention object information to generate the second user group includes:
detecting whether attention objects with identical category labels exist among the attention objects;
if they exist, screening the attention objects with identical identity parameter information and category labels among the pieces of attention object information to generate the second user group;
and if they do not exist, generating a stop instruction, wherein the stop instruction is used to stop generating the second user group.
Optionally, according to another aspect of the embodiments of the present application, after establishing the corresponding relationships among the users in the second user group and adding each corresponding relationship to the user data table, the method further includes:
when a query instruction for a first attention object is received, obtaining the user information corresponding to the first attention object from the user data table.
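The query step can be sketched as a lookup over rows that pair users with attention objects; the row layout and the function name are assumptions for illustration, since the patent does not specify the table schema:

```python
def query_users_for_object(follow_rows, object_id):
    # follow_rows: iterable of (user_id, attention_object_id) pairs, as they
    # might be stored alongside the user data table (layout assumed here).
    return sorted({user for user, obj in follow_rows if obj == object_id})
```

A query for object X over rows [("A", "X"), ("B", "X"), ("C", "Y")] would then return the users A and B, who can be targeted when new information about X arrives.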
Optionally, according to another aspect of the embodiments of the present application, the generating of a second user group based on the first object group includes:
detecting the number of following users corresponding to each attention object in the first object group;
screening out, from the first object group, the attention objects whose number of following users is lower than a preset number, to obtain a second object group;
and obtaining, based on the second object group, the following users corresponding to each attention object in the second object group, to generate the second user group.
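A minimal sketch of the follower-count screening. It assumes the claim means that the second object group keeps only objects whose follower count is below the preset number (objects followed by very many users say little about similarity between individual users); that reading, and all names, are illustrative:

```python
def build_second_object_group(first_object_group, follower_counts, preset_number):
    # Keep only attention objects whose follower count is below the preset
    # number. The threshold direction is our reading of the claim, not a
    # statement from the patent text.
    return {obj for obj in first_object_group if follower_counts[obj] < preset_number}
```

With counts {"X": 3, "Y": 50} and a preset number of 10, only X survives the screening.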
Optionally, according to another aspect of the embodiment of the present application, the category label includes at least any one of:
education labels, entertainment labels, organization labels, and information labels.
According to another aspect of the embodiments of the present application, there is provided a social user classification apparatus, including:
a first obtaining module configured to obtain first attention object group information of a first user, the first attention object group being the set of objects followed by the first user;
a second obtaining module configured to obtain second attention object group information of a second user, where the second attention object group is the set of objects followed by the second user, and the second user is a user different from the first user;
an establishing module configured to establish a corresponding relationship between the first user and the second user based on the matching relationship between the first attention object group and the second attention object group.
According to another aspect of the embodiments of the present application, there is provided an electronic device including:
a memory for storing executable instructions; and
a processor for communicating with the memory to execute the executable instructions so as to perform the operations of any of the social user classification methods described above.
According to a further aspect of the embodiments of the present application, there is provided a computer-readable storage medium for storing computer-readable instructions, which when executed, perform the operations of any one of the above classification methods for social users.
In the application, after the first attention object group information of the first user and the second attention object group information of the second user are acquired, the corresponding relationship between the first user and the second user may be established based on the matching relationship between the two groups of information. With this technical scheme, correspondences between users can be established from the users' pre-collected attention objects, so that when data information about an attention object is subsequently acquired, it can be pushed in a targeted manner to the users that share a corresponding relationship.
The technical solution of the present application is further described in detail by the accompanying drawings and examples.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments of the application and together with the description, serve to explain the principles of the application.
The present application may be more clearly understood from the following detailed description with reference to the accompanying drawings, in which:
fig. 1 is a flowchart of an embodiment of a social user classification method according to the present application.
FIG. 2 is a flowchart of another embodiment of the social user classification method of the present application.
FIG. 3 is a flowchart illustrating a social user classification method according to another embodiment of the present application.
Fig. 4 is a schematic structural diagram of a classification device for social users according to the present application.
Fig. 5 is a schematic view of an electronic device according to the present application.
Detailed Description
Various exemplary embodiments of the present application will now be described in detail with reference to the accompanying drawings. It should be noted that: the relative arrangement of the components and steps, the numerical expressions, and numerical values set forth in these embodiments do not limit the scope of the present application unless specifically stated otherwise.
Meanwhile, it should be understood that the sizes of the respective portions shown in the drawings are not drawn in an actual proportional relationship for the convenience of description.
The following description of at least one exemplary embodiment is merely illustrative in nature and is in no way intended to limit the application, its application, or uses.
Techniques, methods, and apparatus known to those of ordinary skill in the relevant art may not be discussed in detail but are intended to be part of the specification where appropriate.
It should be noted that: like reference numbers and letters refer to like items in the following figures, and thus, once an item is defined in one figure, it need not be discussed further in subsequent figures.
It should be noted that all directional indicators (such as up, down, left, right, front, back, etc.) in the embodiments of the present application are only used to explain the relative positional relationship, motion, and the like of the components in a specific posture (as shown in the drawings); if the specific posture changes, the directional indicators change accordingly.
In addition, descriptions in this application as to "first", "second", etc. are for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicit to the number of technical features indicated. Thus, a feature defined as "first" or "second" may explicitly or implicitly include at least one such feature. In the description of the present application, "plurality" means at least two, e.g., two, three, etc., unless specifically limited otherwise.
In this application, unless expressly stated or limited otherwise, the terms "connected", "secured", and the like are to be construed broadly; for example, "secured" may be a fixed connection, a removable connection, or an integral part; the connection may be mechanical or electrical; elements may be directly connected or indirectly connected through an intermediate medium, or may be in internal communication or in an interaction relationship with each other. The specific meaning of the above terms in the present application can be understood by those of ordinary skill in the art as appropriate.
In addition, technical solutions in the embodiments of the present application may be combined with each other, but it is necessary to be based on the realization of the technical solutions by a person skilled in the art, and when the technical solutions are contradictory to each other or cannot be realized, such a combination of technical solutions should not be considered to exist, and is not within the protection scope claimed in the present application.
A classification method for social users according to an exemplary embodiment of the present application is described below with reference to fig. 1 to 5. It should be noted that the following application scenarios are merely illustrated for the convenience of understanding the spirit and principles of the present application, and the embodiments of the present application are not limited in this respect. Rather, embodiments of the present application may be applied to any scenario where applicable.
Fig. 1 shows a schematic diagram of an exemplary system architecture 100 to which the social user classification method or classification apparatus of the embodiments of the present application may be applied.
As shown in fig. 1, the system architecture 100 may include one or more of terminal devices 101, 102, 103, a network 104, and a server 105. The network 104 serves as a medium for providing communication links between the terminal devices 101, 102, 103 and the server 105. Network 104 may include various connection types, such as wired, wireless communication links, or fiber optic cables, to name a few.
It should be understood that the number of terminal devices, networks, and servers in fig. 1 is merely illustrative. There may be any number of terminal devices, networks, and servers, as desired for an implementation. For example, server 105 may be a server cluster comprised of multiple servers, or the like.
The user may use the terminal devices 101, 102, 103 to interact with the server 105 via the network 104 to receive or send messages or the like. The terminal devices 101, 102, 103 may be various electronic devices having a display screen, including but not limited to smart phones, tablet computers, portable computers, desktop computers, and the like.
The terminal devices 101, 102, 103 in the present application may be terminal devices that provide various services. For example, a user acquires, through the terminal device 103 (or the terminal device 101 or 102), the attention object information of each user in a first user group, where an attention object is a microblog object followed by a user, and the attention object information is used to identify the identity information of the attention object; screens the identical attention objects among the pieces of attention object information to generate a first object group; generates a second user group based on the first object group, the second user group being the set of users corresponding to the attention objects in the first object group; and establishes corresponding relationships among the users in the second user group and adds each corresponding relationship to a user data table, the user data table recording the corresponding relationships between users and attention objects.
It should be noted that the social user classification method provided in the embodiments of the present application may be performed by one or more of the terminal devices 101, 102, and 103 and/or the server 105; accordingly, the social user classification apparatus provided in the embodiments of the present application is generally disposed in the corresponding terminal device and/or the server 105, but the present application is not limited thereto.
The application also provides a social user classification method, a social user classification device, a target terminal and a medium.
Fig. 2 schematically shows a flowchart of a classification method for social users according to an embodiment of the present application. As shown in fig. 2, the method includes:
s101, first concerned object group information of a first user is obtained, and the first concerned object group is a set of objects concerned by the first user.
First, it should be noted that the social user classification method provided by the present application can be applied to microblogs. A microblog is a broadcast-style social network platform, based on user relationships, through which short real-time information can be shared, propagated, and acquired via a follow mechanism. After logging in to the microblog, a user can follow other microblog objects according to his or her interests and obtain the information they publish.
Optionally, in this application, the first attention object group of the first user may be obtained first. The number of attention objects in the first attention object group is not specifically limited in the present application; in a preferred embodiment it may be 10. Further, an attention object in this application is an object followed by the first user on the microblog. It should also be noted that the first user is not specifically limited, i.e., the first user may be any microblog user.
It should also be noted that the present application does not limit the category of the attention objects; an attention object may belong to any category, for example the food category or the education category.
In addition, it should be noted that, in the present application, a device for acquiring the first object-of-interest group information of the first user is not specifically limited, for example, in the present application, the first object-of-interest group information of the first user may be acquired by an intelligent device of the user, or the first object-of-interest group information of the first user may be acquired by the server. It is understood that, in the present application, the smart device is not limited specifically, that is, the smart device may be any smart device, such as a mobile phone, an electronic notebook, a PDA, and the like.
And S102, acquiring second attention object group information of a second user, wherein the second attention object group is a set of objects which are concerned by the second user, and the second user is a user different from the first user.
Optionally, after acquiring the first interested object group information of the first user, the application may further acquire second interested object group information of a second user, where the second user is a user different from the first user.
It should be noted that the attention objects in the second attention object group of the second user are not specifically limited either; each may be of any category, for example the food category or the education category. Likewise, the number of attention objects in the second attention object group is not specifically limited.
S103, establishing a corresponding relation between the first user and the second user based on the matching relation between the first concerned object group information and the second concerned object group information.
Optionally, in the present application, after obtaining the first attention object group information of the first user and the second attention object group information of the second user, the corresponding relationship between the first user and the second user may be further established based on the matching relationship between the first attention object group information and the second attention object group information.
For example, when the first user is user A, and user A follows attention object X, attention object Y, and attention object Z, the first attention object group information is the information group for X, Y, and Z. The second user is user B, who follows attention object M, attention object N, and attention object Z, so the second attention object group information is the information group for M, N, and Z. The corresponding relationship between the first user and the second user can then be established based on the matching relationship between the two groups of information: here, the two groups share attention object Z.
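The overlap in this example can be computed directly with set intersection; the function and variable names are illustrative:

```python
def shared_attention_objects(first_group, second_group):
    # Two users are candidates for a corresponding relationship when their
    # attention object groups overlap.
    return first_group & second_group

group_a = {"X", "Y", "Z"}  # user A's attention objects
group_b = {"M", "N", "Z"}  # user B's attention objects
```

For the groups above, the intersection contains only Z, the shared attention object from which the correspondence between A and B is established.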
In the application, after the first attention object group information of the first user and the second attention object group information of the second user are obtained, the corresponding relationship between the first user and the second user may be established based on the matching relationship between the two groups of information. With this technical scheme, correspondences between users can be established from the users' pre-collected attention objects, so that when data information about an attention object is subsequently acquired, it can be pushed in a targeted manner to the users that share a corresponding relationship.
Further optionally, in an embodiment of the application, in S103 (establishing a corresponding relationship between the first user and the second user based on the matching relationship between the first object-of-interest group information and the second object-of-interest group information), a specific embodiment is included, as shown in fig. 3, including:
s201, first attention object group information of a first user is obtained.
S202, second attention object group information of a second user is obtained.
Optionally, in the present application, after the first attention object group of the first user and the second attention object group of the second user are acquired, the number of the attention objects in the first attention object group and the number of the attention objects in the second attention object group may be further detected.
Further optionally, when the number of the attention objects in the first attention object group and the number of the attention objects in the second attention object group are both greater than a preset number, the corresponding relationship between the first user and the second user is established based on the matching relationship between the first attention object group and the second attention object group.
It should be noted that, in the present application, the preset number is not specifically limited, that is, the preset number may be 10, and the preset number may also be 20.
S203, calculating a matching degree value between the first object of interest group and the second object of interest group.
In one possible embodiment of the present application, first, each piece of first attention object information in the first attention object group information and each piece of second attention object information in the second attention object group information may be acquired respectively.
The attention object information comprises identity parameter information and a category label.
Optionally, in the present application, the attention object information in each piece of first attention object information and each piece of second attention object information may be the identity parameter information and category label corresponding to each attention object. The identity parameter information is used to identify the identity information of the corresponding attention object. The identity parameter information of the attention object is not specifically limited in the application; for example, it may be the microblog user name of the attention object, or the IP address of the attention object. The specific form of the identity parameter information does not affect the protection scope of the present application.
In addition, the category label in the application is used for identifying the category of the information issued by the corresponding attention object on the microblog. The present application also does not specifically limit the category label of the attention object. For example, the category label may be a label for marking the attention object as issuing entertainment information, and the category label may be a label for marking the attention object as issuing education information. In another possible implementation, the category tag may also be a tag that identifies the frequency or the number of times that the object of interest issues information on the microblog. The specific change of the category label does not affect the protection scope of the present application.
Further, the first attention object information of each first attention object and the second attention object information of each second attention object are detected.
optionally, calculating a matching degree value between the first attention object group and the second attention object group based on each piece of first attention object information and each piece of second attention object information, includes:
based on each of the first attention object information, each of the second attention object information, and the weight coefficient, a matching degree value of the first attention object group and the second attention object group is calculated.
Optionally, after the information of each first attention object is obtained, according to a preset weight coefficient, a corresponding attention value may be generated for each first attention object. Similarly, after the information of each second attention object is acquired, according to the preset weight coefficient, a corresponding attention value is generated for each second attention object.
Further, the present application does not specifically limit the weight coefficient. In a preferred embodiment, the attention value corresponding to each first attention object and each second attention object may be calculated by the following formula:
P1 = a*W + c*(1-W);
where P1 is the attention value of any attention object, a is the identity parameter information corresponding to P1, W is a preset weight coefficient, and c is the category label corresponding to P1.
Optionally, after the attention value P1 is generated in the above manner, corresponding attention values may be generated in turn for each first attention object and each second attention object. Further, after the first attention values are generated for the first attention objects, they may be matched one by one with the second attention values generated for the second attention objects. It will be appreciated that the closer the two attention values, the higher the match between them. A corresponding matching degree value can then be generated from the one-to-one matching results, and the corresponding relationship between the first user and the second user subsequently established.
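A hedged sketch of the attention-value formula and the one-to-one comparison. The patent does not specify how the identity parameter information a and the category label c are turned into numbers, nor how the per-object comparisons are aggregated into a single matching degree value, so the numeric encoding and the mean-closeness aggregation below are illustrative assumptions:

```python
def attention_value(identity_score, category_score, weight):
    # P1 = a*W + c*(1-W): the patent's formula, with a and c assumed to be
    # numeric encodings of the identity parameter information and the
    # category label (the encoding itself is not specified in the text).
    return identity_score * weight + category_score * (1 - weight)

def matching_degree(values_a, values_b):
    # One-to-one comparison of attention values: the closer two values are,
    # the higher the match. Averaging the closeness of sorted pairs is an
    # illustrative aggregation choice, not one stated in the patent.
    pairs = zip(sorted(values_a), sorted(values_b))
    return sum(1 - abs(x - y) for x, y in pairs) / max(len(values_a), 1)
```

With identity_score 1.0, category_score 0.0 and weight 0.7, the attention value is 0.7; two identical lists of attention values yield a matching degree of 1.0.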
And S204, when the matching degree value is larger than a preset threshold value, establishing a corresponding relation between the first user and the second user.
Optionally, in order to determine whether the first user and the second user are of the same type, after the matching degree value of the first attention object group and the second attention object group is calculated, it may be compared with a preset threshold. When the matching degree value is greater than the preset threshold, the attention similarity between the first user and the second user is judged to be close, the two users are successfully matched, and the corresponding relationship between them is established.
It should be noted that the preset threshold is not specifically limited in the present application; for example, it may be 1 or 10. The specific value of the preset threshold does not affect the protection scope of the present application.
S205, when the matching degree value is not larger than the preset threshold value, establishing a corresponding relation of the matching failure of the first user and the second user.
Further optionally, when the detected matching degree value is smaller than or equal to the preset threshold, it is determined that the attention similarity between the first user and the second user is not close enough, and a corresponding relationship indicating that matching between the first user and the second user has failed is established.
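The threshold decision of steps S204 and S205 can be sketched as follows; the threshold value used here is only an assumed example, since the application leaves the preset threshold unrestricted:

```python
# Sketch of steps S204/S205: compare the matching degree value against the
# preset threshold and record which correspondence is established.
# The threshold value is an assumption (the application allows 1, 10, etc.).

PRESET_THRESHOLD = 1

def establish_correspondence(matching_degree_value: float,
                             threshold: float = PRESET_THRESHOLD) -> str:
    """Return which correspondence is established between the two users."""
    if matching_degree_value > threshold:
        return "matched"        # S204: attention similarity is close
    return "match_failed"       # S205: correspondence recording the failure

print(establish_correspondence(2))  # matched
print(establish_correspondence(1))  # match_failed (not larger than threshold)
```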
In the application, after the first attention object group information of the first user and the second attention object group information of the second user are obtained, the matching degree value of the first attention object group information and the second attention object group information can be calculated, and whether the corresponding relationship between the first user and the second user is established or not is selected according to the comparison result of the matching degree value and the preset threshold value. By applying the technical scheme of the application, the corresponding relation between the users can be established through the respective attention objects of the users collected in advance. And then when data information for the attention object is acquired subsequently, the data information can be pushed to users with corresponding relations in a targeted manner.
In another embodiment of the present application, as shown in fig. 4, the present application further provides a social user classifying device, which includes a first obtaining module 301, a second obtaining module 302, and a building module 303, wherein,
a first obtaining module 301 configured to obtain first attention object group information of a first user, where the first attention object group is a set of objects focused by the first user;
a second obtaining module 302, configured to obtain second attention object group information of a second user, where the second attention object group is a set of objects that are attended by the second user, and the second user is a different user from the first user;
an establishing module 303, configured to establish a corresponding relationship between the first user and the second user based on a matching relationship between the first attention object group information and the second attention object group information.
In the application, after the first attention object group information of the first user and the second attention object group information of the second user are acquired, the corresponding relationship between the first user and the second user may be established based on the matching relationship between the first attention object group information and the second attention object group information. By applying the technical scheme of the application, the corresponding relation between the users can be established through the respective attention objects of the users collected in advance. And then when data information for the attention object is acquired subsequently, the data information can be pushed to users with corresponding relations in a targeted manner.
Optionally, in another embodiment of the present application, the establishing module 303 further includes a calculating unit and a matching unit, where:
a calculation unit configured to calculate a matching degree value of the first object of interest group and the second object of interest group;
the matching unit is configured to establish a corresponding relation between the first user and the second user when the matching degree value is larger than a preset threshold value.
The matching unit is further configured to establish a corresponding relation of matching failure of the first user and the second user when the matching degree value is not larger than a preset threshold value.
In another embodiment of the present application, the establishing module 303 further includes an obtaining unit and a detecting unit, where:
an acquisition unit configured to acquire each of first attention object information in first attention object group information and each of second attention object information in the second attention object group information, respectively;
a detection unit configured to detect first attention object information of each first attention object, and second attention object information of each second attention object;
a calculation unit configured to calculate a matching degree value of the first object of interest group and the second object of interest group based on the respective first object of interest information and the respective second object of interest information.
In another embodiment of the present application, the establishing module 303 is further configured to calculate the matching degree value of the first object of interest group and the second object of interest group based on the first object of interest information, the second object of interest information, and the weight coefficient.
In another embodiment of the present application, the device further comprises a detection module 304, wherein:
a detection module 304 configured to detect a number of objects of interest in the first object of interest group and to detect a number of objects of interest in the second object of interest group;
the detection module 304 is further configured to, when the number of the objects of interest in the first object of interest group and the number of the objects of interest in the second object of interest group are both greater than a preset number, establish a corresponding relationship between the first user and the second user based on a matching relationship between the first object of interest group and the second object of interest group.
FIG. 5 is a block diagram illustrating a logical configuration of an electronic device in accordance with an exemplary embodiment. For example, the electronic device 400 may be a mobile phone, a computer, a digital broadcast terminal, a messaging device, a game console, a tablet device, a medical device, an exercise device, a personal digital assistant, and the like.
Referring to fig. 5, electronic device 400 may include one or more of the following components: a processor 401 and a memory 402.
Processor 401 may include one or more processing cores, such as a 4-core processor, an 8-core processor, or the like. The processor 401 may be implemented in at least one hardware form of a DSP (Digital Signal Processing), an FPGA (Field-Programmable Gate Array), and a PLA (Programmable Logic Array). The processor 401 may also include a main processor and a coprocessor, where the main processor is a processor for Processing data in an awake state, and is also called a Central Processing Unit (CPU); a coprocessor is a low power processor for processing data in a standby state. In some embodiments, the processor 401 may be integrated with a GPU (Graphics Processing Unit), which is responsible for rendering and drawing the content required to be displayed by the display screen. In some embodiments, the processor 401 may further include an AI (Artificial Intelligence) processor for processing a calculation operation related to machine learning.
Memory 402 may include one or more computer-readable storage media, which may be non-transitory. Memory 402 may also include high-speed random access memory, as well as non-volatile memory, such as one or more magnetic disk storage devices or flash memory storage devices. In some embodiments, a non-transitory computer-readable storage medium in the memory 402 is used to store at least one instruction for execution by the processor 401 to implement the social user classification method provided by the method embodiments of the present application.
In some embodiments, the electronic device 400 may further optionally include: a peripheral interface 403 and at least one peripheral. The processor 401, memory 402 and peripheral interface 403 may be connected by buses or signal lines. Each peripheral may be connected to the peripheral interface 403 via a bus, signal line, or circuit board. Specifically, the peripheral device includes: at least one of radio frequency circuitry 404, touch screen display 405, camera 406, audio circuitry 407, positioning components 408, and power source 409.
The peripheral interface 403 may be used to connect at least one peripheral related to I/O (Input/Output) to the processor 401 and the memory 402. In some embodiments, processor 401, memory 402, and peripheral interface 403 are integrated on the same chip or circuit board; in some other embodiments, any one or two of the processor 401, the memory 402 and the peripheral interface 403 may be implemented on a separate chip or circuit board, which is not limited by this embodiment.
The Radio Frequency circuit 404 is used for receiving and transmitting RF (Radio Frequency) signals, also called electromagnetic signals. The radio frequency circuitry 404 communicates with a communication network and other communication devices via electromagnetic signals. The rf circuit 404 converts an electrical signal into an electromagnetic signal to transmit, or converts a received electromagnetic signal into an electrical signal. Optionally, the radio frequency circuit 404 includes: an antenna system, an RF transceiver, one or more amplifiers, a tuner, an oscillator, a digital signal processor, a codec chipset, a subscriber identity module card, and so forth. The radio frequency circuitry 404 may communicate with other terminals via at least one wireless communication protocol. The wireless communication protocols include, but are not limited to: metropolitan area networks, various generation mobile communication networks (2G, 3G, 4G, and 5G), wireless local area networks, and/or WiFi (Wireless Fidelity) networks. In some embodiments, the rf circuit 404 may further include NFC (Near Field Communication) related circuits, which are not limited in this application.
The display screen 405 is used to display a UI (user interface). The UI may include graphics, text, icons, video, and any combination thereof. When the display screen 405 is a touch display screen, the display screen 405 also has the ability to capture touch signals on or above the surface of the display screen 405. The touch signal may be input to the processor 401 as a control signal for processing. At this point, the display screen 405 may also be used to provide virtual buttons and/or a virtual keyboard, also referred to as soft buttons and/or a soft keyboard. In some embodiments, the display screen 405 may be one, providing the front panel of the electronic device 400; in other embodiments, the display screen 405 may be at least two, respectively disposed on different surfaces of the electronic device 400 or in a folded design; in still other embodiments, the display screen 405 may be a flexible display screen disposed on a curved surface or a folded surface of the electronic device 400. Even further, the display screen 405 may be arranged in a non-rectangular irregular pattern, i.e. a shaped screen. The Display screen 405 may be made of LCD (Liquid Crystal Display), OLED (Organic Light-Emitting Diode), and other materials.
The camera assembly 406 is used to capture images or video. Optionally, camera assembly 406 includes a front camera and a rear camera. Generally, the front camera is disposed on the front panel of the terminal, and the rear camera is disposed on the rear surface of the terminal. In some embodiments, there are at least two rear cameras, each being any one of a main camera, a depth-of-field camera, a wide-angle camera, and a telephoto camera, so that the main camera and the depth-of-field camera can be fused to realize a background blurring function, and the main camera and the wide-angle camera can be fused to realize panoramic shooting, VR (Virtual Reality) shooting, or other fusion shooting functions. In some embodiments, camera assembly 406 may also include a flash. The flash may be a single-color-temperature flash or a dual-color-temperature flash. A dual-color-temperature flash is a combination of a warm-light flash and a cold-light flash, and can be used for light compensation at different color temperatures.
The audio circuit 407 may include a microphone and a speaker. The microphone is used for collecting sound waves of a user and the environment, converting the sound waves into electric signals, and inputting the electric signals to the processor 401 for processing, or inputting the electric signals to the radio frequency circuit 404 for realizing voice communication. The microphones may be provided in a plurality, respectively, at different portions of the electronic device 400 for the purpose of stereo sound acquisition or noise reduction. The microphone may also be an array microphone or an omni-directional acquisition microphone. The speaker is used to convert electrical signals from the processor 401 or the radio frequency circuit 404 into sound waves. The loudspeaker can be a traditional film loudspeaker or a piezoelectric ceramic loudspeaker. When the speaker is a piezoelectric ceramic speaker, the speaker can be used for purposes such as converting an electric signal into a sound wave audible to a human being, or converting an electric signal into a sound wave inaudible to a human being to measure a distance. In some embodiments, audio circuitry 407 may also include a headphone jack.
The positioning component 408 is used to locate the current geographic location of the electronic device 400 for navigation or LBS (Location Based Service). The positioning component 408 may be a positioning component based on the GPS (Global Positioning System) of the United States, the BeiDou system of China, the GLONASS system of Russia, or the Galileo system of the European Union.
The power supply 409 is used to supply power to the various components in the electronic device 400. The power source 409 may be alternating current, direct current, disposable or rechargeable. When power source 409 comprises a rechargeable battery, the rechargeable battery may support wired or wireless charging. The rechargeable battery may also be used to support fast charge technology.
In some embodiments, the electronic device 400 also includes one or more sensors 410. The one or more sensors 410 include, but are not limited to: acceleration sensor 411, gyro sensor 412, pressure sensor 413, fingerprint sensor 414, optical sensor 415, and proximity sensor 416.
The acceleration sensor 411 may detect the magnitude of acceleration in three coordinate axes of a coordinate system established with the electronic apparatus 400. For example, the acceleration sensor 411 may be used to detect components of the gravitational acceleration in three coordinate axes. The processor 401 may control the touch display screen 405 to display the user interface in a landscape view or a portrait view according to the gravitational acceleration signal collected by the acceleration sensor 411. The acceleration sensor 411 may also be used for acquisition of motion data of a game or a user.
The gyro sensor 412 may detect a body direction and a rotation angle of the electronic device 400, and the gyro sensor 412 may cooperate with the acceleration sensor 411 to acquire a 3D motion of the user on the electronic device 400. From the data collected by the gyro sensor 412, the processor 401 may implement the following functions: motion sensing (such as changing the UI according to a user's tilting operation), image stabilization at the time of photographing, game control, and inertial navigation.
The pressure sensors 413 may be disposed on a side bezel of the electronic device 400 and/or on a lower layer of the touch display screen 405. When the pressure sensor 413 is arranged on the side frame of the electronic device 400, the holding signal of the user to the electronic device 400 can be detected, and the processor 401 performs left-right hand identification or shortcut operation according to the holding signal collected by the pressure sensor 413. When the pressure sensor 413 is disposed at the lower layer of the touch display screen 405, the processor 401 controls the operability control on the UI interface according to the pressure operation of the user on the touch display screen 405. The operability control comprises at least one of a button control, a scroll bar control, an icon control and a menu control.
The fingerprint sensor 414 is used for collecting a fingerprint of the user, and the processor 401 identifies the identity of the user according to the fingerprint collected by the fingerprint sensor 414, or the fingerprint sensor 414 identifies the identity of the user according to the collected fingerprint. Upon recognizing that the user's identity is a trusted identity, processor 401 authorizes the user to perform relevant sensitive operations including unlocking the screen, viewing encrypted information, downloading software, paying, and changing settings, etc. The fingerprint sensor 414 may be disposed on the front, back, or side of the electronic device 400. When a physical button or vendor Logo is provided on the electronic device 400, the fingerprint sensor 414 may be integrated with the physical button or vendor Logo.
The optical sensor 415 is used to collect the ambient light intensity. In one embodiment, the processor 401 may control the display brightness of the touch display screen 405 based on the ambient light intensity collected by the optical sensor 415. Specifically, when the ambient light intensity is high, the display brightness of the touch display screen 405 is increased; when the ambient light intensity is low, the display brightness of the touch display screen 405 is turned down. In another embodiment, the processor 401 may also dynamically adjust the shooting parameters of the camera assembly 406 according to the ambient light intensity collected by the optical sensor 415.
Proximity sensor 416, also known as a distance sensor, is typically disposed on the front panel of electronic device 400. The proximity sensor 416 is used to capture the distance between the user and the front of the electronic device 400. In one embodiment, when the proximity sensor 416 detects that the distance between the user and the front surface of the electronic device 400 gradually decreases, the processor 401 controls the touch display screen 405 to switch from the bright screen state to the dark screen state; when the proximity sensor 416 detects that the distance between the user and the front of the electronic device 400 gradually increases, the processor 401 controls the touch display screen 405 to switch from the dark screen state to the bright screen state.
Those skilled in the art will appreciate that the configuration shown in fig. 5 does not constitute a limitation of the electronic device 400, and may include more or fewer components than those shown, or combine certain components, or employ a different arrangement of components.
In an exemplary embodiment, there is also provided a non-transitory computer-readable storage medium, such as the memory 402, including instructions executable by the processor 401 of the electronic device 400 to perform the method for social user classification described above, the method including: obtaining first object-of-interest group information of a first user, the first object-of-interest group being a collection of objects of interest to the first user; acquiring second attention object group information of a second user, wherein the second attention object group is a set of objects which are concerned by the second user, and the second user is a user different from the first user; and establishing a corresponding relation between the first user and the second user based on the matching relation between the first object-of-interest group information and the second object-of-interest group information. Optionally, the instructions may also be executed by the processor 401 of the electronic device 400 to perform other steps involved in the exemplary embodiments described above. For example, the non-transitory computer-readable storage medium may be a ROM, a Random Access Memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, and the like.
In an exemplary embodiment, there is also provided an application/computer program product including one or more instructions executable by the processor 401 of the electronic device 400 to perform the above method of social user classification, the method comprising: acquiring first attention object group information of a first user, wherein the first attention object group is a set of objects which are concerned by the first user; acquiring second attention object group information of a second user, wherein the second attention object group is a set of objects which are concerned by the second user, and the second user is a user different from the first user; and establishing a corresponding relation between the first user and the second user based on the matching relation between the first concerned object group information and the second concerned object group information. Optionally, the instructions may also be executable by the processor 401 of the electronic device 400 to perform other steps involved in the exemplary embodiments described above.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the invention disclosed herein. This application is intended to cover any variations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice in the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.
It will be understood that the present disclosure is not limited to the precise arrangements described above and shown in the drawings and that various modifications and changes may be made without departing from the scope thereof. The scope of the present disclosure is limited only by the appended claims.

Claims (6)

1. A method for classifying social users, comprising:
acquiring first attention object group information of a first user, wherein the first attention object group is a set of objects which are focused by the first user;
acquiring second attention object group information of a second user, wherein the second attention object group is a set of objects which are concerned by the second user, and the second user is a user different from the first user;
establishing a corresponding relation between the first user and the second user based on the matching relation between the first concerned object group information and the second concerned object group information;
establishing a corresponding relationship between the first user and the second user based on the matching relationship between the first attention object group information and the second attention object group information, including:
calculating a matching degree value of the first object of interest group and the second object of interest group;
the calculating the matching degree value of the first object of interest group and the second object of interest group comprises:
respectively acquiring each piece of first attention object information in the first attention object group information and each piece of second attention object information in the second attention object group information;
detecting first object-of-interest information of a first object-of-interest group and second object-of-interest information of a second object-of-interest group;
calculating a matching degree value of the first object of interest group and the second object of interest group based on the first object of interest information and the second object of interest information;
calculating a matching degree value of the first object of interest group and the second object of interest group based on the first object of interest information and the second object of interest information, including:
matching the attention values of the attention objects in the first attention object group with the attention values of the attention objects in the second attention object group one by one, and generating corresponding matching degree values according to the result of one-to-one matching;
when the matching degree value is larger than a preset threshold value, establishing a corresponding relation between the first user and the second user, wherein the first user and the second user are successfully matched;
when the matching degree value is not larger than a preset threshold value, establishing a corresponding relation of matching failure of the first user and the second user;
wherein the interested object information in the first interested object information and the second interested object information is identity parameter information and category labels corresponding to the interested objects;
the identity parameter information is used for identifying identity information corresponding to the concerned object; the category label is used for identifying a label corresponding to the release information of the attention object;
the attention value of the attention object in the first attention object group is determined based on the identity parameter information and the class label corresponding to the attention object, and the attention value of the attention object in the second attention object group is determined based on the identity parameter information and the class label corresponding to the attention object.
2. The method of claim 1, wherein the calculating a matching degree value of the first object of interest group and the second object of interest group based on the respective first object of interest information and the respective second object of interest information comprises:
calculating the matching degree value of the first object of interest group and the second object of interest group based on the first object of interest information, the second object of interest information, and a weight coefficient.
3. The method of claim 1, further comprising, after said obtaining a second set of objects of interest for a second user:
detecting the number of the objects of interest in the first object of interest group and the number of the objects of interest in the second object of interest group;
and when the number of the objects of interest in the first object group of interest and the number of the objects of interest in the second object group of interest are both greater than a preset number, establishing a corresponding relationship between the first user and the second user based on a matching relationship between the first object group of interest and the second object group of interest.
4. A social user classification device, comprising:
a first obtaining module configured to obtain a first object of interest group of a first user, the first object of interest group being a set of objects of interest to the first user;
a second obtaining module configured to obtain a second attention object group of a second user, where the second attention object group is a set of objects that are attended by the second user, and the second user is a different user from the first user;
the establishing module is configured to establish a corresponding relation between the first user and the second user based on a matching relation between the first concerned object group and the second concerned object group;
the building module further comprises a computing unit and a matching unit, wherein:
a calculation unit configured to calculate a matching degree value of the first object of interest group and the second object of interest group;
the calculating a matching degree value of the first object of interest group and the second object of interest group comprises:
respectively acquiring each piece of first attention object information in the first attention object group information and each piece of second attention object information in the second attention object group information;
detecting first object-of-interest information of a first object-of-interest group and second object-of-interest information of a second object-of-interest group;
calculating a matching degree value of the first object of interest group and the second object of interest group based on the first object of interest information and the second object of interest information;
calculating a matching degree value of the first object of interest group and the second object of interest group based on the first object of interest information and the second object of interest information, including:
matching the attention values of the attention objects in the first attention object group with the attention values of the attention objects in the second attention object group one by one, and generating corresponding matching degree values according to the result of one-to-one matching;
the matching unit is configured to establish a corresponding relation between the first user and the second user when the matching degree value is larger than a preset threshold value;
the matching unit is further configured to establish a corresponding relation of matching failure of the first user and the second user when the matching degree value is not larger than a preset threshold value;
the concerned object information in the first concerned object information and the second concerned object information is identity parameter information and a category label corresponding to each concerned object; the identity parameter information is used for identifying identity information corresponding to the concerned object; the category label is used for identifying a label corresponding to the release information of the attention object;
the attention value of the attention object in the first attention object group is determined based on the identity parameter information and the class label corresponding to the attention object, and the attention value of the attention object in the second attention object group is determined based on the identity parameter information and the class label corresponding to the attention object.
5. An electronic device, comprising:
a memory for storing executable instructions; and
a processor configured to communicate with the memory to execute the executable instructions so as to perform the operations of the method of classifying a social user of any of claims 1-3.
6. A computer-readable storage medium storing computer-readable instructions that, when executed, perform the operations of the method of classifying a social user of any of claims 1-3.
CN201910315093.XA 2019-04-18 2019-04-18 Social user classification method and device, electronic equipment and medium Active CN110544108B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910315093.XA CN110544108B (en) 2019-04-18 2019-04-18 Social user classification method and device, electronic equipment and medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910315093.XA CN110544108B (en) 2019-04-18 2019-04-18 Social user classification method and device, electronic equipment and medium

Publications (2)

Publication Number Publication Date
CN110544108A CN110544108A (en) 2019-12-06
CN110544108B true CN110544108B (en) 2022-12-13

Family

ID=68702680

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910315093.XA Active CN110544108B (en) 2019-04-18 2019-04-18 Social user classification method and device, electronic equipment and medium

Country Status (1)

Country Link
CN (1) CN110544108B (en)

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108197330A (en) * 2014-11-10 2018-06-22 北京字节跳动网络技术有限公司 Data digging method and device based on social platform

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140047089A1 (en) * 2012-08-10 2014-02-13 International Business Machines Corporation System and method for supervised network clustering
CN108600076A (en) * 2017-03-07 2018-09-28 中移(杭州)信息技术有限公司 A kind of social networks method for building up and system
CN108228847B (en) * 2018-01-10 2021-10-22 北京奇艺世纪科技有限公司 User matching method and device and electronic equipment

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108197330A (en) * 2014-11-10 2018-06-22 北京字节跳动网络技术有限公司 Data digging method and device based on social platform

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Principle and implementation of the collaborative filtering recommendation algorithm; 猪逻辑公园; 《https://blog.csdn.net/qq_15111861/article/details/80666451》; 2018-06-12; full text *

Also Published As

Publication number Publication date
CN110544108A (en) 2019-12-06

Similar Documents

Publication Publication Date Title
CN110795236B (en) Method, device, electronic equipment and medium for adjusting capacity of server
CN110532170B (en) Method and device for building test environment, electronic equipment and medium
CN111104980B (en) Method, device, equipment and storage medium for determining classification result
WO2020211607A1 (en) Video generation method, apparatus, electronic device, and medium
CN112965683A (en) Volume adjusting method and device, electronic equipment and medium
CN111131392A (en) Method, device, electronic equipment and medium for processing message
CN111327819A (en) Method, device, electronic equipment and medium for selecting image
CN111192072A (en) User grouping method and device and storage medium
CN110659895A (en) Payment method, payment device, electronic equipment and medium
CN111857793A (en) Network model training method, device, equipment and storage medium
CN111064657B (en) Method, device and system for grouping concerned accounts
CN110795660B (en) Data analysis method, data analysis device, electronic device, and medium
CN111563201A (en) Content pushing method, device, server and storage medium
CN112860046A (en) Method, apparatus, electronic device and medium for selecting operation mode
CN110781032A (en) Data transmission method, device, electronic equipment and medium
CN110544108B (en) Social user classification method and device, electronic equipment and medium
CN113836426A (en) Information pushing method and device and electronic equipment
CN113051485A (en) Group searching method, device, terminal and storage medium
CN113051494A (en) Information display method and device, electronic equipment and storage medium
CN110555081B (en) Social interaction user classification method and device, electronic equipment and medium
CN112132472A (en) Resource management method and device, electronic equipment and computer readable storage medium
CN112214115A (en) Input mode identification method and device, electronic equipment and storage medium
CN111897709A (en) Method, device, electronic equipment and medium for monitoring user
CN110795639A (en) Method, device, electronic equipment and medium for receiving information
CN111125095A (en) Data prefix adding method and device, electronic equipment and medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant