CN110795178B - Application sign-in method and device and electronic equipment - Google Patents


Info

Publication number
CN110795178B
Authority
CN
China
Prior art keywords
state
check
sign
emotion
user
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201810881964.XA
Other languages
Chinese (zh)
Other versions
CN110795178A
Inventor
曾文富 (Zeng Wenfu)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Alibaba China Co Ltd
Original Assignee
Alibaba China Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Alibaba China Co Ltd
Priority to CN201810881964.XA
Publication of CN110795178A
Application granted
Publication of CN110795178B


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 9/00 Arrangements for program control, e.g. control units
    • G06F 9/06 Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F 9/44 Arrangements for executing specific programs
    • G06F 9/451 Execution arrangements for user interfaces

Abstract

The invention discloses an application sign-in method, an application sign-in apparatus, and an electronic device. The method comprises: receiving a sign-in operation performed by a user on an application sign-in interface, and acquiring a corresponding sign-in state; analyzing the sign-in state to obtain a corresponding sign-in result; and displaying the sign-in result through the application sign-in interface. The invention displays a personalized sign-in result adapted to the user in response to the user's sign-in operation, provides a distinctive sign-in response mode, and effectively improves the attractiveness of the application to users.

Description

Application sign-in method and device and electronic equipment
Technical Field
The present invention relates to the field of computer application technologies, and in particular, to an application sign-in method, an application sign-in device, and an electronic device.
Background
With the rapid development of Internet technology and the popularization of intelligent terminals, more and more users are accustomed to accessing the network and obtaining application services through applications (apps) installed on terminal devices such as smartphones, handheld computers, and tablet computers.
At present, a large number of applications are available in application markets for users to download and install. Many of them provide a sign-in function: users are given rewards such as points or special privileges according to their sign-in records, encouraging continued use and increasing the application's attractiveness. However, the sign-in functions in current applications basically all reward users according to sign-in records, differing only in the form of the reward. Because the sign-in responses are so homogeneous, the sign-in function becomes a feature of marginal value within the application and cannot improve the application's attractiveness to users.
Disclosure of Invention
It is an object of the present invention to provide a new solution for application check-in.
According to a first aspect of the present invention, there is provided an application sign-in method, comprising:
receiving a sign-in operation performed by a user on an application sign-in interface, and acquiring a corresponding sign-in state;
wherein the sign-in state at least comprises an emotional state of the user when performing the sign-in operation;
analyzing the sign-in state to obtain a corresponding sign-in result;
wherein the sign-in result at least comprises an emotional state prompt corresponding to the emotional state;
and displaying the sign-in result through the application sign-in interface.
Optionally, a sign-in status tag is provided in the application sign-in interface;
the sign-in operation comprises a selection operation or an editing operation performed on the sign-in status tag;
the step of acquiring the corresponding sign-in state comprises:
acquiring, according to the sign-in operation performed by the user on the sign-in status tag, information corresponding to the sign-in status tag as the sign-in state;
and/or,
a shooting control for triggering invocation of a camera is provided in the application sign-in interface;
the sign-in operation comprises an operation performed on the shooting control to trigger invocation of the camera;
the step of acquiring the corresponding sign-in state comprises:
invoking the camera to acquire the sign-in state according to the sign-in operation performed by the user on the shooting control.
Further optionally, the step of analyzing the sign-in state and obtaining the sign-in result comprises:
acquiring a target event tag corresponding to the emotional state;
querying a pre-constructed emotion statement library for a matched emotion statement according to the target event tag and a target emotional state corresponding to the target event tag, and using the emotion statement as the emotional state prompt;
wherein the emotion statement library comprises a plurality of emotion statements, each emotion statement being associated with one event tag and one emotional state.
Further optionally, the step of acquiring the target event tag corresponding to the emotional state comprises:
acquiring the event tags and the emotional states of the user in each time unit within a statistical period that spans a preset statistical duration counted from the current time unit in which the user performs the sign-in operation;
and determining the target event tag according to the event tags and the emotional states of the user in each time unit within the statistical period.
Further optionally, the step of acquiring the event tags in each time unit comprises:
acquiring event information of hot events in each time unit;
and extracting event feature words from the event information of the hot events in each time unit as the event tags of the corresponding time unit.
Further optionally, the step of determining the target event tag according to the event tags and the emotional states of the user in each time unit within the statistical period comprises:
determining a target emotional state from all the emotional states of the user acquired within the statistical period;
selecting, from all the event tags acquired within the statistical period, the event tags corresponding to the target emotional state as candidate event tags;
and performing cluster analysis on the candidate event tags to obtain the candidate event tag with the highest degree of association with the target emotional state as the target event tag.
Optionally, the emotional state prompt comprises an emotional state change indication within a statistical period that spans a preset statistical duration counted from the current time unit in which the user performs the sign-in operation;
the step of obtaining the corresponding sign-in result comprises:
acquiring the emotional state of the user in each time unit within the statistical period, and generating the corresponding emotional state change indication;
the step of displaying the sign-in result comprises:
drawing a corresponding emotional state change curve according to the emotional state change indication, and displaying the emotional state change curve in the application sign-in interface.
Optionally, the step of displaying the sign-in result further comprises:
displaying, together with the sign-in result, a recommendation result corresponding to the sign-in result;
wherein the recommendation result at least comprises one of user information of other users corresponding to the sign-in result and item information corresponding to the sign-in result.
According to a second aspect of the present invention, there is provided an application sign-in apparatus, comprising:
a state acquisition unit, configured to receive a sign-in operation performed by a user on an application sign-in interface and acquire a corresponding sign-in state;
wherein the sign-in state at least comprises an emotional state of the user when performing the sign-in operation;
a state analysis unit, configured to analyze the sign-in state and obtain a corresponding sign-in result;
wherein the sign-in result at least comprises an emotional state prompt corresponding to the emotional state;
and a result display unit, configured to display the sign-in result through the application sign-in interface.
According to a third aspect of the present invention, there is provided an electronic device, comprising:
a display device, configured to display a human-computer interaction interface;
a memory, configured to store executable instructions;
and a processor, configured to run the electronic device, under control of the executable instructions, to perform any one of the application sign-in methods provided in the first aspect of the present invention.
According to the embodiments of the present disclosure, when a sign-in operation performed by a user on the application sign-in interface is received, a sign-in state including the user's emotional state at sign-in time is acquired, and the sign-in state is analyzed to obtain a corresponding sign-in result including an emotional state prompt, which is displayed through the application sign-in interface. A personalized sign-in result adapted to the user is thus displayed in response to the user's sign-in operation, providing a distinctive sign-in response mode and effectively improving the attractiveness of the application to users.
Other features of the present invention and its advantages will become apparent from the following detailed description of exemplary embodiments of the invention, which proceeds with reference to the accompanying drawings.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments of the invention and together with the description, serve to explain the principles of the invention.
Fig. 1 is a block diagram showing an example of a hardware configuration of an electronic device that can be used to implement an embodiment of the present invention.
Fig. 2 shows a flow chart of an application check-in method of an embodiment of the invention.
FIG. 3 shows a flowchart of the steps for obtaining check-in results in an embodiment of the present invention.
FIG. 4 shows a flowchart of the steps of obtaining a check-in target event tag, according to an embodiment of the invention.
FIG. 5 shows a flowchart of the steps of acquiring event tags for each time unit in accordance with an embodiment of the present invention.
FIG. 6 shows a flowchart of the steps of determining a target event tag according to an embodiment of the present invention.
Fig. 7 shows a block diagram of an application check-in device of an embodiment of the invention.
Fig. 8 shows a block diagram of an electronic device of an embodiment of the invention.
Detailed Description
Various exemplary embodiments of the present invention will now be described in detail with reference to the accompanying drawings. It should be noted that: the relative arrangement of the components and steps, numerical expressions and numerical values set forth in these embodiments do not limit the scope of the present invention unless it is specifically stated otherwise.
The following description of at least one exemplary embodiment is merely exemplary in nature and is in no way intended to limit the invention, its application, or uses.
Techniques, methods, and apparatus known to one of ordinary skill in the relevant art may not be discussed in detail, but are intended to be part of the specification where appropriate.
In all examples shown and discussed herein, any specific values should be construed as merely illustrative, and not a limitation. Thus, other examples of exemplary embodiments may have different values.
It should be noted that: like reference numerals and letters denote like items in the following figures, and thus once an item is defined in one figure, no further discussion thereof is necessary in subsequent figures.
< hardware configuration >
Fig. 1 is a block diagram showing a hardware configuration of an electronic device 1000 in which an embodiment of the present invention can be implemented.
The electronic device 1000 may be a laptop, a desktop computer, a mobile phone, a tablet, or the like. As shown in fig. 1, the electronic device 1000 may include a processor 1100, a memory 1200, an interface device 1300, a communication device 1400, a display device 1500, an input device 1600, a speaker 1700, a microphone 1800, and the like. The processor 1100 may be a central processing unit (CPU), a microcontroller (MCU), or the like. The memory 1200 includes, for example, ROM (read-only memory), RAM (random access memory), and nonvolatile memory such as a hard disk. The interface device 1300 includes, for example, a USB interface, a headphone interface, and the like. The communication device 1400 is capable of wired or wireless communication, including, for example, Wi-Fi, Bluetooth, and 2G/3G/4G/5G communication. The display device 1500 is, for example, a liquid crystal display, a touch display, or the like. The input device 1600 may include, for example, a touch screen, a keyboard, and somatosensory input. A user may input and output voice information through the speaker 1700 and the microphone 1800.
The electronic device shown in fig. 1 is merely illustrative and is in no way meant to limit the invention, its application, or uses. In an embodiment of the present invention, the memory 1200 of the electronic device 1000 is configured to store instructions for controlling the processor 1100 to perform any one of the application sign-in methods provided by the embodiments of the present invention. It will be appreciated by those skilled in the art that although a plurality of devices are shown for the electronic device 1000 in fig. 1, the present invention may involve only some of them; for example, the electronic device 1000 may involve only the processor 1100 and the memory 1200. The skilled person can design instructions according to the disclosed solution. How instructions control the processor to operate is well known in the art and will not be described in detail here.
< example >
The general conception of the invention is to provide a new application sign-in scheme: when a sign-in operation performed by a user on an application sign-in interface is received, a sign-in state including the user's emotional state at sign-in time is acquired and analyzed to obtain a corresponding sign-in result including an emotional state prompt, which is displayed through the application sign-in interface. A personalized sign-in result adapted to the user is thus displayed in response to the user's sign-in operation, providing a distinctive sign-in response mode and effectively improving the attractiveness of the application to users.
< method >
In this embodiment, an application sign-in method is provided. The method can be applied to any application that provides a sign-in function. Here, an application refers to any software product or computer program that can be installed or loaded to run on an electronic device such as a mobile phone, tablet computer, workstation, or game console and provide a corresponding application service, for example a mobile phone app.
The application sign-in method, as shown in fig. 2, comprises steps S2100-S2300.
In step S2100, a sign-in operation performed by the user on the application sign-in interface is received, and a corresponding sign-in state is acquired.
The application sign-in interface is the human-computer interaction interface displayed to the user for receiving sign-in operations when the application provides its sign-in function. In this embodiment, the corresponding sign-in state is obtained when the user performs a sign-in operation on the application sign-in interface.
The sign-in state is the state of the user at the moment the sign-in operation is performed. In this embodiment, the sign-in state at least includes the emotional state of the user when performing the sign-in operation. The emotional state may be an emotion type of the user, an emotion tag, or the like; for example, emotion types may include happy, calm, low, angry, and distressed. The sign-in state may also include other states of the user, such as the scene in which the user signs in, the weather, and so on.
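By way of illustration only, the sign-in state described above might be modeled as follows; the class and field names in this Python sketch are assumptions for exposition, not part of the claimed method.

    from dataclasses import dataclass
    from enum import Enum
    from typing import Optional

    class Emotion(Enum):
        """Example emotion types named in the text."""
        HAPPY = "happy"
        CALM = "calm"
        LOW = "low"
        ANGRY = "angry"
        DISTRESSED = "distressed"

    @dataclass
    class CheckInState:
        emotion: Emotion               # required: emotional state at sign-in time
        scene: Optional[str] = None    # optional: scene in which the user signs in
        weather: Optional[str] = None  # optional: weather at sign-in time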
In one example, a sign-in status tag is provided in the application sign-in interface.
The sign-in status tag is a tag showing a corresponding sign-in state for the user to select or edit; for example, it may be a set of sign-in tags showing different emotion types (happy, calm, low, angry, distressed).
The sign-in status tag can receive a selection operation from the user, in which case the information shown in the selected tag is used as the sign-in state. It can also receive editing operations, including deletion, addition, and modification: the user can add tags matching his or her actual state, delete tags that do not match his or her preferences, and edit the information a tag displays. The tags may be provided in a variety of shapes to meet personalized needs, such as bubbles, hearts, or other user-defined shapes.
Correspondingly, the sign-in operation includes a selection operation or an editing operation performed on the sign-in status tag. The selection operation may be a click, a tick, a slide selection, or the like, and the editing operation includes addition, deletion, editing, and the like. The step of acquiring the corresponding sign-in state comprises:
acquiring, according to the sign-in operation performed by the user on the sign-in status tag, the information corresponding to the sign-in status tag as the sign-in state.
The information corresponding to the sign-in status tag may be the tag information displayed by the tag or information pre-associated with it.
In another example, a shooting control that triggers invocation of the camera is provided in the application sign-in interface. The shooting control may be a button, an icon, or the like for receiving the user's operation to invoke the camera. Correspondingly, the sign-in operation includes an operation performed on the shooting control to trigger invoking the camera, such as clicking, ticking, or slide-selecting the control. The step of acquiring the corresponding sign-in state comprises:
invoking the camera to acquire the sign-in state according to the sign-in operation performed by the user on the shooting control.
In this example, after the camera is invoked, the user's facial expression or the scene the user is in (place, weather, surroundings, and so on) can be photographed, and the sign-in state can be determined by comparing the captured image with sample images in a pre-constructed state image library. For example, the camera captures the user's face image, which is compared with face sample images corresponding to various emotional states in the library; an image similarity algorithm such as histogram comparison or structural similarity can determine the most similar sample image, and the user's emotional state at sign-in time is obtained accordingly.
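As a minimal sketch of the histogram-comparison variant just mentioned, using OpenCV (the sample-library layout and function names below are illustrative assumptions):

    import cv2

    def closest_emotion(face_path: str, samples: dict[str, str]) -> str:
        """Compare a captured face image against labeled sample images by
        color-histogram correlation; samples maps an emotion label to a
        sample image path (assumed layout)."""
        def hist(path):
            img = cv2.imread(path)
            h = cv2.calcHist([img], [0, 1, 2], None, [8, 8, 8], [0, 256] * 3)
            return cv2.normalize(h, h).flatten()

        target = hist(face_path)
        # Higher correlation means more similar histograms.
        return max(samples, key=lambda label: cv2.compareHist(
            hist(samples[label]), target, cv2.HISTCMP_CORREL))

A production implementation would first detect and crop the face region; structural similarity is the other measure the text names.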
By acquiring a sign-in state that includes the user's emotional state and analyzing it to obtain a sign-in result that includes an emotional state prompt for display through the application sign-in interface, a personalized sign-in result adapted to the user is displayed in response to the sign-in operation, providing a distinctive sign-in response mode and effectively improving the attractiveness of the application to users.
After step S2100, the method proceeds to:
step S2200, analyzing the sign-in state and obtaining the corresponding sign-in result.
The sign-in result is the analysis result obtained after analyzing the user's sign-in state. In this embodiment, the sign-in result at least includes an emotional state prompt corresponding to the user's emotional state, i.e. personalized prompt information generated by analyzing that emotional state.
The sign-in result may further include results obtained by analyzing other state information included in the sign-in state; for example, if the sign-in state includes the scene and weather at sign-in time, the sign-in result may further include corresponding scene and weather analysis results, which are not enumerated here.
In one example, the step of analyzing the sign-in state to obtain the sign-in result may include, as shown in fig. 3, steps S2210-S2220.
Step S2210, acquiring a target event tag corresponding to the emotional state.
The target event tag is the event tag with the highest degree of association with the emotional state. An event tag is a tag that characterizes an event; the target event represented by the target event tag is the event that has the greatest influence on the corresponding emotional state.
The step of acquiring the target event tag corresponding to the emotional state may, as shown in fig. 4, include steps S2211-S2212.
In step S2211, the event tags and the user's emotional state in each time unit are acquired within a statistical period that spans a preset statistical duration counted from the current time unit in which the user performs the sign-in operation.
A time unit is the longest span, divided according to the specific application scenario or requirements, within which the user is allowed to sign in once. For example, the time unit may be set to one day.
The preset statistical duration is the length of the statistical period, likewise set according to the specific application scenario or requirements. For example, it may be set to one month.
Assuming the time unit is one day and the statistical duration is one month, the month counted from the day on which the user performs the sign-in operation is the corresponding statistical period.
Based on the user's historical sign-in records, the user's emotional state for each time unit in the statistical period can be obtained.
In this embodiment, the event tags of each time unit may be acquired and stored as an event tag history, and the event tags of each time unit in the statistical period may then be retrieved from that history.
The step of acquiring the event tags in each time unit, as shown in fig. 5, includes steps S22111-S22112.
In step S22111, event information of the hot events in each time unit is acquired.
In this example, within each time unit, hot events can be captured from major portal sites, social platforms, and information publishing platforms by web crawling at a preset interval (for example, every 2 hours within a day) or at fixed times (for example, 8:00, 12:00, and 16:00 each day); the event information may be an event title, event keywords, and the like.
In step S22112, event feature words are extracted from the event information of the hot events in each time unit as the event tags of the corresponding time unit.
By applying word segmentation, correlation calculation, and similar processing to the event information of a hot event, the event feature words that best represent the event can be extracted as the corresponding event tags.
For example, if a hot search event captured from a search website on a certain day (with a time unit of one day) is "what gift should be given on Mother's Day", and the extracted event feature word is "Mother's Day", then the event tag "Mother's Day" is obtained for that day.
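The text does not name a segmentation tool; as one possible sketch, jieba's TF-IDF keyword extraction can stand in for the "word segmentation and correlation calculation" step (an assumed choice of library):

    import jieba.analyse  # common Chinese keyword-extraction library (assumed choice)

    def event_tags_for_unit(hot_event_titles: list[str], per_event: int = 1) -> list[str]:
        """Extract the most representative feature word(s) from each hot-event
        title captured in one time unit; they become that unit's event tags."""
        tags: list[str] = []
        for title in hot_event_titles:
            tags.extend(jieba.analyse.extract_tags(title, topK=per_event))
        return tags

    # e.g. event_tags_for_unit(["母亲节应该送什么礼物"]) might yield ["母亲节"] ("Mother's Day")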
It should be appreciated that multiple event tags may be obtained within each time unit, as the sketch above shows.
In practical applications, the event tags of any time unit can be modified, added, or removed according to external configuration instructions from relevant personnel, making the tags of each time unit more accurate.
After step S2211, the method proceeds to:
step S2212, determining the target event tag according to the event tags and the user's emotional state in each time unit within the statistical period.
Step S2212 may include, as shown in fig. 6, steps S22121-S22123.
Step S22121, determining a target emotional state from all the emotional states of the user acquired within the statistical period.
The target emotional state may be the most frequently occurring emotional state in the statistical period.
For example, if "angry" is the emotional state occurring in the largest number of time units within the statistical period, the target emotional state is "angry".
Alternatively, the user's emotional states in the statistical period may first be grouped according to a preset emotion classification. Suppose the preset classification includes "good emotion", "bad emotion", and "general emotion". Correspondingly, emotional states such as "happy" and "fond" are classified as "good emotion"; emotional states such as "angry", "jealous", "disgusted", and "sad" are classified as "bad emotion"; and emotional states such as "calm", "stable", and "nostalgic" are classified as "general emotion". The user's emotional states within the statistical period can thus be grouped into three categories, and the category occurring in the largest number of time units is taken as the target emotional state.
For example, if the statistical period is one month, the time unit is one day, and "good emotion" occurs on 5 days, "general emotion" on 10 days, and "bad emotion" on 15 days within the period, then "bad emotion" is the target emotional state.
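A minimal sketch of this counting step follows; the emotion-to-category map is an illustrative assumption mirroring the example above:

    from collections import Counter

    # Illustrative mapping from raw emotional states to the preset categories.
    EMOTION_CATEGORY = {
        "happy": "good emotion", "fond": "good emotion",
        "angry": "bad emotion", "jealous": "bad emotion", "sad": "bad emotion",
        "calm": "general emotion", "stable": "general emotion",
    }

    def target_emotional_state(daily_emotions: list[str]) -> str:
        """daily_emotions holds one recorded emotional state per time unit
        (day) of the statistical period; the most frequent category wins."""
        counts = Counter(EMOTION_CATEGORY.get(e, "general emotion")
                         for e in daily_emotions)
        return counts.most_common(1)[0][0]

    # 5 "good" days + 10 "general" days + 15 "bad" days -> "bad emotion"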
Step S22122, selecting, from all the event tags acquired within the statistical period, the event tags corresponding to the target emotional state as candidate event tags.
The event tags corresponding to the target emotional state are the event tags of the time units in which the target emotional state occurred within the statistical period. Using these as candidate event tags, and selecting the target event tag from among them, ensures that the target event tag more truly reflects the events affecting the user's emotional state.
For example, suppose the statistical period is one month, the time unit is one day, the target emotional state is "bad emotion", and both date A and date B show "bad emotion"; if the event tags of date A are "World Cup" and "England loses to Belgium", and the event tag of date B is "France wins the title", then the candidate event tags are "World Cup", "England loses to Belgium", and "France wins the title".
Step S22123, performing cluster analysis on the candidate event tags, and obtaining the candidate event tag with the highest degree of association with the target emotional state as the target event tag.
The candidate event tags can be cluster-analyzed according to the degree of association among them, selecting representative candidate event tags; for example, the candidate tags "World Cup", "England loses to Belgium", and "France wins the title" can be clustered into the representative candidate tag "World Cup".
After the cluster analysis, a degree of association with the target emotional state is obtained for each clustered candidate event tag. The degree of association can be the proportion of the number of occurrences of the candidate event tag within the statistical period to the total number of occurrences of all event tags within the period. It should be understood that a clustered candidate event tag represents a class of candidate event tags, and the occurrences of the whole class are counted toward that tag's number of occurrences.
For example, if the target emotional state is "bad emotion" and the candidate event tags obtained after clustering are "World Cup", "flood", and "shopping festival", of which "World Cup" occurs most often and therefore has the highest degree of association with the target emotional state, then the target event tag is "World Cup".
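A sketch of the association-degree computation described above; the cluster analysis itself is represented by a precomputed tag-to-class map, an assumption made for brevity:

    from collections import Counter

    def target_event_tag(candidate_tags: list[str],
                         tag_class: dict[str, str]) -> str:
        """candidate_tags: event tags from the time units showing the target
        emotional state; tag_class maps each tag to its representative class
        (the assumed output of the cluster analysis). The association degree
        is a class's share of all tag occurrences in the period."""
        class_counts = Counter(tag_class.get(t, t) for t in candidate_tags)
        best, count = class_counts.most_common(1)[0]
        print(f"association degree of '{best}':"
              f" {count / sum(class_counts.values()):.2f}")
        return best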
Based on the ways of acquiring the target event tag listed above, other ways may be derived by appropriate variation or adjustment. For example, the event tags occurring in the current time unit and in other time units of the statistical period whose emotional state matches that of the current time unit may be used as candidate event tags, and the target event tag may then be selected from them in a manner similar to that described above.
Step S2210 has been described above with reference to the drawings and examples. After the target event tag is acquired, the method proceeds to:
step S2220, according to the target event label and the target emotion state corresponding to the target event label, inquiring and obtaining the matched emotion statement in the emotion statement library constructed in advance, and using the emotion statement as an emotion state prompt.
The target event tag is the event tag with the highest degree of association with an emotional state.
Based on the above embodiment of step S2210, the target emotional state corresponding to the target event tag may be the emotional state with the highest occurrence frequency in the statistical period, or may be the emotional state in which the user performs the check-in operation in the current time unit.
The emotion statement library is a database comprising a plurality of emotion statements. In this embodiment, feature sentences related to the preset emotional state classification and to event tag content may be extracted from available web text by web crawling, and the most representative sentences selected from them, according to cosine similarity, Manhattan distance, or other measures, as emotion statements; a corresponding emotion statement library is built from the statements thus obtained. Each emotion statement is associated with one event tag and one emotional state.
In practical applications, the emotion statements may also be collected manually by relevant personnel and written into the library, or manually modified and deleted, making the library's content more accurate and rich.
According to the target event tag and its corresponding target emotional state, the library is queried for emotion statements related to both, and the statement with the highest degree of relevance to the target event tag and the target emotional state is selected as the matched emotion statement, to be used as the emotional state prompt.
For example, suppose the library contains an emotion statement associated with the event tag "World Cup" and the emotional state "good emotion": "In such a good mood, you must be a fan of the French team!"; and an emotion statement associated with the event tag "World Cup" and the emotional state "bad emotion": "Feeling that down, you must have bet on the match; does your family own a mine?!". When the target event tag is "World Cup" and the target emotional state is "bad emotion", the emotional state prompt is "Feeling that down, you must have bet on the match; does your family own a mine?!".
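A sketch of the lookup, with the library modeled as a mapping keyed by (event tag, emotional state); the keying scheme, entries, and fallback prompt are assumptions paraphrasing the example above:

    # Each emotion statement is associated with one event tag and one
    # emotional state; entries paraphrase the example in the text.
    EMOTION_STATEMENTS = {
        ("World Cup", "good emotion"):
            "In such a good mood, you must be a fan of the French team!",
        ("World Cup", "bad emotion"):
            "Feeling that down, you must have bet on the match!",
    }

    def emotional_state_prompt(event_tag: str, emotion: str) -> str:
        """Query the pre-built library for the statement matching both the
        target event tag and the target emotional state."""
        return EMOTION_STATEMENTS.get((event_tag, emotion),
                                      "Thanks for signing in today!")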
In this embodiment, the emotional state prompt may further include an emotional state change indication within a statistical period that spans a preset statistical duration counted from the current time unit in which the user performs the sign-in operation.
As above, a time unit is the longest span within which the user is allowed to sign in once, for example one day, and the preset statistical duration is the length of the statistical period, for example one month; the month counted from the day of the sign-in operation is then the corresponding statistical period.
The emotional state change indication indicates the change of the user's emotional state over the statistical period.
Correspondingly, the step of obtaining the sign-in result in step S2200 may include:
acquiring the user's emotional state in each time unit within the statistical period, and generating the corresponding emotional state change indication.
In this embodiment, a value may be preset for each emotion classification; for the user's emotional state in each time unit of the statistical period, the corresponding value is taken, and the values are arranged in chronological order to form an emotion change array for the period, which serves as the emotional state change indication.
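A sketch of this numeric encoding; the per-category values are illustrative assumptions:

    # Illustrative preset value for each emotion category.
    CATEGORY_VALUE = {"bad emotion": -1, "general emotion": 0, "good emotion": 1}

    def emotion_change_indication(daily_categories: list[str]) -> list[int]:
        """daily_categories: the user's emotion category for each time unit of
        the statistical period, in chronological order."""
        return [CATEGORY_VALUE[c] for c in daily_categories]

    # emotion_change_indication(["good emotion", "bad emotion", "general emotion"])
    # -> [1, -1, 0]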
After the user's sign-in state is analyzed to obtain the personalized sign-in result, the result can be displayed through the application sign-in interface in the subsequent steps, so that a personalized sign-in result adapted to the user is displayed in response to the sign-in operation, providing a distinctive sign-in response mode and effectively improving the attractiveness of the application to users.
After step S2200, the method proceeds to:
step S2300, displaying the sign-in result through the application sign-in interface.
In this embodiment, the step of displaying the sign-in result has various embodiments.
For example, when the emotional state prompt included in the sign-in result is an emotion statement, the statement can be displayed directly in the application sign-in interface.
Alternatively, when the emotional state prompt included in the sign-in result includes an emotional state change indication, the step of displaying the sign-in result may include: drawing a corresponding emotional state change curve according to the indication, and displaying the curve in the application sign-in interface. For example, if the indication is an emotion change array holding, in chronological order, the values corresponding to the emotional state of each time unit in the statistical period, a corresponding curve can be drawn from the array's values for display.
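As a sketch, the array can be rendered as a curve with matplotlib (an assumed choice; the text does not name a plotting mechanism):

    import matplotlib.pyplot as plt

    def plot_emotion_curve(change_array: list[int]) -> None:
        """Draw the emotional state change curve for display in the
        application sign-in interface."""
        days = range(1, len(change_array) + 1)
        plt.plot(days, change_array, marker="o")
        plt.xlabel("Day of statistical period")
        plt.ylabel("Emotion value")
        plt.title("Emotional state change")
        plt.show()

    # plot_emotion_curve([1, -1, 0, 1, 1])  # one point per time unit (day)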
In this embodiment, the step of displaying the sign-in result may further include:
displaying, together with the sign-in result, a recommendation result corresponding to it.
The recommendation result at least comprises one of user information of other users corresponding to the sign-in result and item information corresponding to the sign-in result.
The other users corresponding to the user's sign-in result may be users whose sign-in result similarity exceeds a preset similarity threshold, for example users whose emotional state prompt similarity exceeds the threshold. More specifically, the relevance between emotion statements serving as emotional state prompts, or the curve similarity between emotion change curves included in the prompts, may be taken as the prompt similarity; the relevance or curve similarity can be computed with cosine similarity, structural similarity, or other measures, not enumerated here.
The user information at least comprises the user's social platform account, and may also comprise the user's avatar, profile, and the like. By displaying, together with the sign-in result, the user information of other users with similar results, users can establish social contact with one another.
It should be understood that the other users corresponding to the sign-in result may instead be users whose sign-in result matching degree exceeds a preset matching threshold; the matching degree may be predefined in terms of sign-in result complementarity, gender complementarity, and the like, not detailed here.
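A sketch of matching users by the curve similarity of their emotion change arrays, using cosine similarity (one of the measures named above); the 0.8 threshold is an illustrative assumption:

    import math

    def cosine_similarity(a: list[float], b: list[float]) -> float:
        """Cosine similarity of two equal-length emotion change arrays."""
        dot = sum(x * y for x, y in zip(a, b))
        norm = (math.sqrt(sum(x * x for x in a))
                * math.sqrt(sum(y * y for y in b)))
        return dot / norm if norm else 0.0

    def similar_users(me: list[float], others: dict[str, list[float]],
                      threshold: float = 0.8) -> list[str]:
        """Return the IDs of users whose sign-in result similarity exceeds
        the preset threshold."""
        return [uid for uid, arr in others.items()
                if cosine_similarity(me, arr) > threshold]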
The item information corresponding to the sign-in result is information about items recommended to the user based on that result. For example, corresponding item information can be recommended according to the target event tag and target emotional state behind the emotional state prompt: when the target event tag is "rainy day" and the target emotional state is "bad emotion", the recommended item information may be humorous content, links to funny videos, island travel products, or the like. Item information can be written into a corresponding item information library by web crawling or manual configuration, and each entry can be pre-associated with a corresponding emotional state, so that items can be selected for recommendation according to the actual emotional state included in the sign-in result.
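A sketch of selecting item information pre-associated with an emotional state (and optionally an event tag), mirroring the rainy-day example; the library entries are illustrative assumptions:

    # Item information library; each entry is pre-associated with an
    # emotional state and, optionally, an event tag (entries illustrative).
    ITEM_LIBRARY = [
        {"event_tag": "rainy day", "emotion": "bad emotion",
         "item": "link to a funny video"},
        {"event_tag": "rainy day", "emotion": "bad emotion",
         "item": "island travel package"},
        {"event_tag": None, "emotion": "good emotion",
         "item": "outdoor activity gear"},
    ]

    def recommend_items(event_tag: str, emotion: str) -> list[str]:
        """Pick items whose associated emotional state matches the sign-in
        result, preferring entries that also match the target event tag."""
        return [e["item"] for e in ITEM_LIBRARY
                if e["emotion"] == emotion
                and e["event_tag"] in (None, event_tag)]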
Through step S2300, a personalized sign-in result adapted to the user is displayed in response to the sign-in operation, providing a distinctive sign-in response mode and effectively improving the attractiveness of the application to users.
< application sign-in device >
In this embodiment, there is also provided an application sign-in apparatus 3000, as shown in fig. 7, comprising a state acquisition unit 3100, a state analysis unit 3200, and a result display unit 3300, used to implement any one of the application sign-in methods of this embodiment and not described again here.
The application sign-in apparatus 3000 comprises:
the state acquisition unit 3100, configured to receive a sign-in operation performed by a user on an application sign-in interface and acquire a corresponding sign-in state;
wherein the sign-in state at least comprises an emotional state of the user when performing the sign-in operation;
the state analysis unit 3200, configured to analyze the sign-in state and obtain a corresponding sign-in result;
wherein the sign-in result at least comprises an emotional state prompt corresponding to the emotional state;
and the result display unit 3300, configured to display the sign-in result through the application sign-in interface.
Optionally, a sign-in status tag is provided in the application sign-in interface;
the sign-in operation comprises a selection operation or an editing operation performed on the sign-in status tag;
the state acquisition unit 3100 is configured to:
acquire, according to the sign-in operation performed by the user on the sign-in status tag, information corresponding to the sign-in status tag as the sign-in state;
and/or,
a shooting control for triggering invocation of a camera is provided in the application sign-in interface;
the sign-in operation comprises an operation performed on the shooting control to trigger invocation of the camera;
the state acquisition unit 3100 is configured to:
invoke the camera to acquire the sign-in state according to the sign-in operation performed by the user on the shooting control.
Optionally, the state analysis unit 3200 is configured to:
acquire a target event tag corresponding to the emotional state;
query a pre-constructed emotion statement library for a matched emotion statement according to the target event tag and a target emotional state corresponding to the target event tag, and use the emotion statement as the emotional state prompt;
wherein the emotion statement library comprises a plurality of emotion statements, each emotion statement being associated with one event tag and one emotional state.
Optionally, for the step of acquiring the target event tag corresponding to the emotional state, the state analysis unit 3200 is configured to:
acquire the event tags and the emotional states of the user in each time unit within a statistical period that spans a preset statistical duration counted from the current time unit in which the user performs the sign-in operation;
and determine the target event tag according to the event tags and the emotional states of the user in each time unit within the statistical period.
Further optionally, for acquiring the event tags in each time unit, the state analysis unit 3200 is configured to:
acquire event information of hot events in each time unit;
and extract event feature words from the event information of the hot events in each time unit as the event tags of the corresponding time unit.
Further optionally, for determining the target event tag according to the event tags and the emotional states of the user in each time unit within the statistical period, the state analysis unit 3200 is configured to:
determine a target emotional state from all the emotional states of the user acquired within the statistical period;
select, from all the event tags acquired within the statistical period, the event tags corresponding to the target emotional state as candidate event tags;
and perform cluster analysis on the candidate event tags to obtain the candidate event tag with the highest degree of association with the target emotional state as the target event tag.
Optionally, the emotional state prompt comprises an emotional state change indication within a statistical period that spans a preset statistical duration counted from the current time unit in which the user performs the sign-in operation;
the state analysis unit 3200 is further configured to:
acquire the emotional state of the user in each time unit within the statistical period, and generate the corresponding emotional state change indication;
the result display unit 3300 is further configured to:
draw a corresponding emotional state change curve according to the emotional state change indication, and display the emotional state change curve in the application sign-in interface.
Optionally, the result display unit 3300 is further configured to:
display, together with the sign-in result, a recommendation result corresponding to the sign-in result;
wherein the recommendation result at least comprises one of user information of other users corresponding to the sign-in result and item information corresponding to the sign-in result.
It should be appreciated by those skilled in the art that the application sign-in apparatus 3000 may be implemented in a variety of ways. For example, it may be implemented by configuring a processor with instructions: the instructions may be stored in ROM and, when the device starts, read from the ROM into a programmable device to implement the apparatus. As another example, the apparatus may be solidified into a dedicated device (e.g., an ASIC). The apparatus may be divided into mutually independent units, or these units may be combined. It may be implemented by one of the above ways, or by a combination of two or more of them.
In this embodiment, the application sign-in apparatus 3000 may be a functional module provided in any application to implement the application sign-in method of this embodiment, or it may itself be any application that provides a sign-in function.
< electronic device >
In this embodiment, there is also provided an electronic device 4000, as shown in fig. 8, comprising:
a display device 4100, configured to display a human-computer interaction interface;
a memory 4200, configured to store executable instructions;
and a processor 4300, configured to run the electronic device, under control of the executable instructions, to perform any one of the application sign-in methods provided in this embodiment.
In this embodiment, the electronic device 4000 may be any electronic device that supports the application sign-in method of this embodiment, such as a mobile phone, tablet computer, handheld computer, desktop computer, notebook computer, workstation, or game console. In one example, the electronic device 4000 may be a mobile phone on which an application providing the sign-in function is installed.
The electronic device 4000 may also include other means, such as the communication device of the electronic device 1000 shown in fig. 1.
The embodiments of the invention have been described above with reference to the accompanying drawings. According to these embodiments, an application sign-in method, an application sign-in apparatus, and an electronic device are provided: when the user performs a sign-in operation on the application sign-in interface, a sign-in state including the user's emotional state at sign-in time is acquired and analyzed to obtain a corresponding sign-in result including an emotional state prompt, which is displayed through the application sign-in interface. A personalized sign-in result adapted to the user is thus displayed in response to the sign-in operation, providing a distinctive sign-in response mode and effectively improving the attractiveness of the application to users.
The present invention may be a system, method, and/or computer program product. The computer program product may include a computer readable storage medium having computer readable program instructions embodied thereon for causing a processor to implement aspects of the present invention.
The computer readable storage medium may be a tangible device that can hold and store instructions for use by an instruction execution device. The computer readable storage medium may be, for example, but not limited to, an electronic storage device, a magnetic storage device, an optical storage device, an electromagnetic storage device, a semiconductor storage device, or any suitable combination of the foregoing. More specific examples (a non-exhaustive list) of the computer readable storage medium include the following: a portable computer disk, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), a static random access memory (SRAM), a portable compact disc read-only memory (CD-ROM), a digital versatile disc (DVD), a memory stick, a floppy disk, a mechanically encoded device such as a punch card or a raised structure in a groove having instructions recorded thereon, and any suitable combination of the foregoing. A computer readable storage medium, as used herein, is not to be construed as a transitory signal per se, such as a radio wave or other freely propagating electromagnetic wave, an electromagnetic wave propagating through a waveguide or other transmission medium (e.g., a light pulse through a fiber optic cable), or an electrical signal transmitted through a wire.
The computer readable program instructions described herein may be downloaded from a computer readable storage medium to a respective computing/processing device or to an external computer or external storage device over a network, such as the internet, a local area network, a wide area network, and/or a wireless network. The network may include copper transmission cables, fiber optic transmissions, wireless transmissions, routers, firewalls, switches, gateway computers and/or edge servers. The network interface card or network interface in each computing/processing device receives computer readable program instructions from the network and forwards the computer readable program instructions for storage in a computer readable storage medium in the respective computing/processing device.
Computer program instructions for carrying out operations of the present invention may be assembly instructions, instruction set architecture (ISA) instructions, machine instructions, machine-related instructions, microcode, firmware instructions, state-setting data, or source or object code written in any combination of one or more programming languages, including object-oriented programming languages such as Smalltalk and C++, and conventional procedural programming languages such as the "C" language or similar programming languages. The computer readable program instructions may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. Where a remote computer is involved, it may be connected to the user's computer through any kind of network, including a local area network (LAN) or a wide area network (WAN), or may be connected to an external computer (for example, through the Internet using an Internet service provider). In some embodiments, aspects of the present invention are implemented by personalizing electronic circuitry, such as programmable logic circuitry, field-programmable gate arrays (FPGA), or programmable logic arrays (PLA), with state information of the computer readable program instructions, the electronic circuitry executing the computer readable program instructions.
Various aspects of the present invention are described herein with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the invention. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer-readable program instructions.
These computer readable program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks. These computer readable program instructions may also be stored in a computer readable storage medium that can direct a computer, programmable data processing apparatus, and/or other devices to function in a particular manner, such that the computer readable medium having the instructions stored therein includes an article of manufacture including instructions which implement the function/act specified in the flowchart and/or block diagram block or blocks.
The computer readable program instructions may also be loaded onto a computer, other programmable data processing apparatus, or other devices to cause a series of operational steps to be performed on the computer, other programmable apparatus or other devices to produce a computer implemented process such that the instructions which execute on the computer, other programmable apparatus or other devices implement the functions/acts specified in the flowchart and/or block diagram block or blocks.
The flowcharts and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present invention. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of instructions, which comprises one or more executable instructions for implementing the specified logical function(s). In some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions. It is well known to those skilled in the art that implementation by hardware, implementation by software, and implementation by a combination of software and hardware are all equivalent.
The foregoing description of embodiments of the invention has been presented for purposes of illustration and description, and is not intended to be exhaustive or limited to the embodiments disclosed. Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the various embodiments described. The terminology used herein was chosen in order to best explain the principles of the embodiments, the practical application, or the technical improvements in the marketplace, or to enable others of ordinary skill in the art to understand the embodiments disclosed herein. The scope of the invention is defined by the appended claims.

Claims (9)

1. An application sign-in method, comprising:
receiving a sign-in operation performed by a user on an application sign-in interface, and acquiring a corresponding sign-in state;
wherein the sign-in state at least comprises an emotional state of the user when performing the sign-in operation;
analyzing the sign-in state and obtaining a corresponding sign-in result;
wherein the sign-in result at least comprises an emotional state prompt corresponding to the emotional state;
displaying the sign-in result through the application sign-in interface,
wherein the step of analyzing the sign-in state and obtaining the sign-in result comprises:
acquiring a target event tag corresponding to the emotional state;
querying a pre-constructed emotion statement library for a matched emotion statement according to the target event tag and a target emotional state corresponding to the target event tag, and taking the matched emotion statement as the emotional state prompt;
wherein the emotion statement library comprises a plurality of emotion statements, each of which is associated with both an event tag and an emotional state.
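The library lookup in claim 1 can be pictured with a short sketch. The Python below is a minimal, hypothetical illustration, not the patented implementation: the class name, the sample statements, and the fallback prompt are all invented for this example.

```python
from typing import Optional

class EmotionStatementLibrary:
    """Pre-constructed library: each statement is keyed by an
    (event tag, emotional state) pair, mirroring claim 1."""

    def __init__(self) -> None:
        # Invented sample entries for illustration only.
        self._statements: dict[tuple[str, str], str] = {
            ("world_cup", "happy"): "Riding the match-day high? Keep the streak going!",
            ("exam_season", "anxious"): "Exam season is stressful; a short break may help.",
        }

    def query(self, event_tag: str, emotional_state: str) -> Optional[str]:
        """Return the matched emotion statement, or None on a miss."""
        return self._statements.get((event_tag, emotional_state))

def build_emotional_state_prompt(library: EmotionStatementLibrary,
                                 target_event_tag: str,
                                 target_emotional_state: str) -> str:
    """Query the library; fall back to a generic prompt if no statement matches."""
    statement = library.query(target_event_tag, target_emotional_state)
    return statement or f"You seem {target_emotional_state} today."
```

Keying the library on the (tag, state) pair is what lets the same emotional state yield different prompts depending on which hot event is driving it.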
2. The method of claim 1, wherein,
a sign-in state tag is provided in the application sign-in interface;
the sign-in operation comprises a selection operation or an editing operation performed on the sign-in state tag;
the step of acquiring the corresponding sign-in state comprises:
acquiring, according to the sign-in operation performed by the user on the sign-in state tag, information corresponding to the sign-in state tag as the sign-in state;
and/or,
a shooting control for triggering invocation of a camera is provided in the application sign-in interface;
the sign-in operation comprises an operation performed on the shooting control to trigger invocation of the camera;
the step of acquiring the corresponding sign-in state comprises:
invoking the camera to acquire the sign-in state according to the sign-in operation performed by the user on the shooting control.
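Claim 2 offers two alternative sources for the sign-in state: information carried by a state tag in the interface, or a photo captured after the shooting control invokes the camera. A hypothetical dispatch over the two paths might look as follows; the operation kinds and the two platform stubs are assumptions made for the sketch.

```python
from dataclasses import dataclass

def capture_photo() -> bytes:
    """Stand-in for the platform camera call."""
    return b"<jpeg bytes>"

def estimate_emotion(photo: bytes) -> str:
    """Stand-in for a facial-expression classifier."""
    return "happy"

@dataclass
class SignInOperation:
    kind: str          # "tag_select", "tag_edit", or "shoot"
    payload: str = ""  # selected or edited tag text, if any

def acquire_sign_in_state(op: SignInOperation) -> dict:
    if op.kind in ("tag_select", "tag_edit"):
        # The information carried by the sign-in state tag becomes the state.
        return {"source": "tag", "emotional_state": op.payload}
    if op.kind == "shoot":
        # The shooting control triggers the camera; the captured photo is
        # then mapped to an emotional state.
        return {"source": "camera", "emotional_state": estimate_emotion(capture_photo())}
    raise ValueError(f"unsupported sign-in operation: {op.kind}")
```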
3. The method of claim 1, wherein the step of obtaining a target event tag corresponding to the emotional state comprises:
acquiring, within a statistical period of a preset statistical duration counted back from the current time unit in which the user performs the sign-in operation, the event tags and the emotional state of the user in each time unit;
and determining the target event tag according to the event tags in each time unit within the statistical period and the emotional states of the user.
4. A method according to claim 3, wherein the step of obtaining event tags within each of the time units comprises:
acquiring event information of hot events in each time unit;
and extracting event feature words from the event information of the hot events in each time unit as the event tags of the corresponding time units.
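Claim 4 does not fix an extraction algorithm for the feature words, so the sketch below substitutes simple stop-word-filtered frequency counting; a production system would more likely use TF-IDF, TextRank, or a learned keyphrase extractor. The headlines in the example are invented.

```python
import re
from collections import Counter

STOP_WORDS = {"the", "a", "an", "of", "in", "on", "and", "to", "is"}

def extract_event_tags(event_texts: list[str], top_k: int = 3) -> list[str]:
    """Top-k feature words across one time unit's hot-event texts."""
    counts: Counter[str] = Counter()
    for text in event_texts:
        for token in re.findall(r"[a-z]+", text.lower()):
            if token not in STOP_WORDS:
                counts[token] += 1
    return [word for word, _ in counts.most_common(top_k)]

print(extract_event_tags(["World Cup final tonight",
                          "World Cup fever in the city"]))
# -> ['world', 'cup', 'final']
```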
5. A method according to claim 3, wherein the step of determining a target event tag from event tags within each of the time units in the statistical period and the emotional state of the user comprises:
determining a target emotional state from all the emotional states of the user acquired in the statistical period;
selecting the event tags corresponding to the target emotional state from all the event tags acquired in the statistical period as candidate event tags;
and performing cluster analysis on the candidate event tags, and taking the candidate event tag with the highest degree of association with the target emotional state as the target event tag.
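Putting claims 3-5 together: gather (event tags, emotional state) per time unit over the statistical period, take the dominant emotional state as the target, keep only tags from matching units, and pick the most associated tag. The patent specifies cluster analysis for the last step; this sketch approximates "highest degree of association" with a plain co-occurrence count, which is a deliberate simplification.

```python
from collections import Counter

def determine_target_event_tag(
    units: list[tuple[list[str], str]],  # per time unit: (event tags, emotional state)
) -> tuple[str, str]:
    # Target emotional state: the most frequent state in the period.
    state_counts = Counter(state for _, state in units)
    target_state, _ = state_counts.most_common(1)[0]

    # Candidate tags: tags from time units whose state matches the target.
    candidates: Counter[str] = Counter()
    for tags, state in units:
        if state == target_state:
            candidates.update(tags)

    # Simplified stand-in for cluster analysis: most co-occurrent tag wins.
    target_tag, _ = candidates.most_common(1)[0]
    return target_tag, target_state

units = [(["world_cup"], "happy"),
         (["world_cup", "heatwave"], "happy"),
         (["exam"], "anxious")]
print(determine_target_event_tag(units))  # -> ('world_cup', 'happy')
```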
6. The method of claim 1, wherein,
the emotional state prompt comprises an emotional state change indication for a statistical period of a preset statistical duration counted back from the current time unit in which the user performs the sign-in operation;
the step of obtaining the corresponding sign-in result comprises:
acquiring the emotional state of the user in each time unit within the statistical period, and generating a corresponding emotional state change indication;
the step of displaying the sign-in result comprises:
and drawing a corresponding emotional state change curve according to the emotional state change indication, and displaying the emotional state change curve in the application sign-in interface.
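For the change curve of claim 6, one simple approach is to map each time unit's emotional state to a numeric valence score and hand the resulting points to the UI layer as a polyline. The score table below is an invented example; the patent does not prescribe any particular encoding.

```python
VALENCE = {"sad": -2, "anxious": -1, "neutral": 0, "content": 1, "happy": 2}

def emotion_curve_points(states: list[str]) -> list[tuple[int, int]]:
    """One (time unit index, valence) point per unit in the statistical period."""
    return [(i, VALENCE.get(state, 0)) for i, state in enumerate(states)]

print(emotion_curve_points(["anxious", "neutral", "happy", "happy"]))
# -> [(0, -1), (1, 0), (2, 2), (3, 2)]
```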
7. The method of claim 1, wherein the step of displaying the sign-in result further comprises:
displaying, together with the sign-in result, a recommendation result corresponding to the sign-in result;
wherein the recommendation result at least comprises one of user information of other users corresponding to the sign-in result and item information corresponding to the sign-in result.
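Claim 7's recommendation can be read as two lookups keyed by the sign-in result: other users who reported the same emotional state, and items associated with that state. The sketch below assumes both catalogs already exist; all names and data are illustrative.

```python
def recommend(sign_in_result: dict,
              user_states: dict[str, str],
              item_catalog: dict[str, list[str]]) -> dict:
    """Return users sharing the result's emotional state, plus matching items."""
    state = sign_in_result["emotional_state"]
    return {
        "users": [u for u, s in user_states.items() if s == state],
        "items": item_catalog.get(state, []),
    }

print(recommend({"emotional_state": "happy"},
                {"alice": "happy", "bob": "anxious"},
                {"happy": ["celebration sticker pack"]}))
# -> {'users': ['alice'], 'items': ['celebration sticker pack']}
```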
8. An application sign-in device, comprising:
a state acquisition unit, configured to receive a sign-in operation performed by a user on an application sign-in interface and acquire a corresponding sign-in state;
wherein the sign-in state at least comprises an emotional state of the user when performing the sign-in operation;
a state analysis unit, configured to analyze the sign-in state and acquire a corresponding sign-in result;
wherein the sign-in result at least comprises an emotional state prompt corresponding to the emotional state;
a result display unit, configured to display the sign-in result through the application sign-in interface,
wherein the state analysis unit is further configured to:
acquire a target event tag corresponding to the emotional state;
query a pre-constructed emotion statement library for a matched emotion statement according to the target event tag and a target emotional state corresponding to the target event tag, and take the matched emotion statement as the emotional state prompt;
wherein the emotion statement library comprises a plurality of emotion statements, each of which is associated with both an event tag and an emotional state.
9. An electronic device, comprising:
a display device for displaying a human-machine interaction interface;
a memory for storing executable instructions;
and a processor, configured to execute the executable instructions to control the electronic device to perform the application sign-in method according to any one of claims 1-7.
CN201810881964.XA 2018-07-31 2018-07-31 Application sign-in method and device and electronic equipment Active CN110795178B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810881964.XA CN110795178B (en) 2018-07-31 2018-07-31 Application sign-in method and device and electronic equipment

Publications (2)

Publication Number Publication Date
CN110795178A (en) 2020-02-14
CN110795178B (en) 2023-08-22

Family

ID=69425706

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810881964.XA Active CN110795178B (en) 2018-07-31 2018-07-31 Application sign-in method and device and electronic equipment

Country Status (1)

Country Link
CN (1) CN110795178B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111564203A (en) * 2020-04-30 2020-08-21 深圳市镜象科技有限公司 Psychotherapy method based on ACT therapy, psychotherapy terminal, and storage medium

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102930610A (en) * 2012-10-24 2013-02-13 深圳市万凯达科技有限公司 Check-in information processing method and system
CN103714071A (en) * 2012-09-29 2014-04-09 株式会社日立制作所 Label emotional tendency quantifying method and label emotional tendency quantifying system
CN105095286A (en) * 2014-05-14 2015-11-25 腾讯科技(深圳)有限公司 Page recommendation method and device
CN105138222A (en) * 2015-08-26 2015-12-09 美国掌赢信息科技有限公司 Method for selecting expression icon and electronic equipment

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10706367B2 (en) * 2013-09-10 2020-07-07 Facebook, Inc. Sentiment polarity for users of a social networking system
JP6467965B2 (en) * 2015-02-13 2019-02-13 オムロン株式会社 Emotion estimation device and emotion estimation method
CN105843922A (en) * 2016-03-25 2016-08-10 乐视控股(北京)有限公司 Multimedia classification recommendation method, apparatus and system

Also Published As

Publication number Publication date
CN110795178A (en) 2020-02-14

Similar Documents

Publication Publication Date Title
US10834355B2 (en) Barrage message processing
CN109522424B (en) Data processing method and device, electronic equipment and storage medium
JP2021517998A (en) Image clustering methods and devices, electronic devices and storage media
US11394675B2 (en) Method and device for commenting on multimedia resource
CN108227950B (en) Input method and device
CN108038102B (en) Method and device for recommending expression image, terminal and storage medium
CN109325223B (en) Article recommendation method and device and electronic equipment
CN111638832A (en) Information display method, device, system, electronic equipment and storage medium
CN108924381B (en) Image processing method, image processing apparatus, and computer readable medium
CN110222256B (en) Information recommendation method and device and information recommendation device
US11640420B2 (en) System and method for automatic summarization of content with event based analysis
CN112559800A (en) Method, apparatus, electronic device, medium, and product for processing video
CN112784142A (en) Information recommendation method and device
CN108197105B (en) Natural language processing method, device, storage medium and electronic equipment
CN111046927B (en) Method and device for processing annotation data, electronic equipment and storage medium
CN113849723A (en) Search method and search device
CN110795178B (en) Application sign-in method and device and electronic equipment
CN111475664B (en) Object display method and device and electronic equipment
CN110362686B (en) Word stock generation method and device, terminal equipment and server
CN107301188B (en) Method for acquiring user interest and electronic equipment
CN114302231A (en) Video processing method and device, electronic equipment and storage medium
CN109992697B (en) Information processing method and electronic equipment
CN109145151B (en) Video emotion classification acquisition method and device
CN113535940A (en) Event abstract generation method and device and electronic equipment
CN111382367A (en) Search result ordering method and device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20200603

Address after: 310051 room 508, floor 5, building 4, No. 699, Wangshang Road, Changhe street, Binjiang District, Hangzhou City, Zhejiang Province

Applicant after: Alibaba (China) Co.,Ltd.

Address before: 100083, Beijing, Haidian District, Cheng Fu Road, No. 28, A building, block 12

Applicant before: UC MOBILE Co.,Ltd.

GR01 Patent grant