CN110795178A - Application check-in method and apparatus, and electronic device - Google Patents

Application check-in method and apparatus, and electronic device

Info

Publication number
CN110795178A
CN110795178A (application number CN201810881964.XA; granted publication CN110795178B)
Authority
CN
China
Prior art keywords
check
state
user
event
result
Prior art date
Legal status
Granted
Application number
CN201810881964.XA
Other languages
Chinese (zh)
Other versions
CN110795178B (en)
Inventor
曾文富
Current Assignee
Alibaba China Co Ltd
Original Assignee
Ucweb Inc
Priority date
Filing date
Publication date
Application filed by Ucweb Inc
Priority to CN201810881964.XA
Publication of CN110795178A
Application granted
Publication of CN110795178B
Legal status: Active
Anticipated expiration

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00 - Arrangements for program control, e.g. control units
    • G06F9/06 - Arrangements for program control using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44 - Arrangements for executing specific programs
    • G06F9/451 - Execution arrangements for user interfaces

Landscapes

  • Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)

Abstract

The invention discloses an application check-in method and apparatus, and an electronic device. The method comprises the following steps: receiving a check-in operation performed by a user on an application check-in interface and acquiring a corresponding check-in state; analyzing the check-in state and acquiring a corresponding check-in result; and displaying the check-in result through the application check-in interface. By displaying a personalized check-in result adapted to the user in response to the user's check-in operation, the invention provides a distinctive check-in response and effectively increases the application's appeal to users.

Description

Application check-in method and apparatus, and electronic device
Technical Field
The invention relates to the field of computer applications, and in particular to an application check-in method, an application check-in apparatus, and an electronic device.
Background
With the rapid development of Internet technology and the popularization of intelligent terminals, more and more users are accustomed to accessing the network through applications (APPs) installed on terminal devices such as smartphones, handheld computers, and tablet computers to obtain the corresponding application services.
At present, application markets offer a huge number of applications for users to download and install, and many of them provide a check-in function: according to the user's check-in record, the application grants rewards such as points or special privileges to encourage continued use and increase the application's appeal to users. However, the check-in functions currently provided basically just reward the user according to the check-in record, differing only in the form of the reward. Check-in responses have converged on a uniform pattern, turning check-in into a feature of marginal value that does nothing to increase the application's appeal to users.
Disclosure of Invention
It is an object of the present invention to provide a new application check-in solution.
According to a first aspect of the present invention, there is provided an application check-in method, including:
receiving a check-in operation of a user on an application check-in interface, and acquiring a corresponding check-in state;
wherein the check-in state comprises at least an emotional state of the user when performing the check-in operation;
analyzing the check-in state and acquiring a corresponding check-in result;
wherein the check-in result at least comprises an emotional state prompt corresponding to the emotional state;
and displaying the check-in result through the application check-in interface.
Optionally, a check-in state tag is set in the application check-in interface;
the check-in operation comprises a selection operation or an editing operation implemented on the check-in status tag;
the step of obtaining the corresponding check-in status comprises:
acquiring information corresponding to the check-in state label as a check-in state according to check-in operation of a user on the check-in state label;
and/or,
a shooting control for triggering and calling a camera is arranged in the application check-in interface;
the check-in operation comprises an operation performed on the shooting control to trigger invocation of a camera;
the step of obtaining the corresponding check-in status comprises:
and calling a camera to acquire the check-in state according to the check-in operation performed by the user on the shooting control.
Further optionally, the step of analyzing the check-in status and obtaining a check-in result includes:
acquiring a target event label corresponding to the emotional state;
according to the target event label and a target emotional state corresponding to the target event label, querying a pre-constructed emotion sentence library to acquire a matched emotion sentence as the emotional state prompt;
wherein the emotion sentence library comprises a plurality of emotion sentences, each of which is associated with an event label and an emotional state.
Further optionally, the step of obtaining a target event tag corresponding to the emotional state includes:
acquiring the event label in each time unit and the emotional state of the user within a statistical period of a preset statistical duration, counted from the current time unit in which the user performs the check-in operation;
and determining a target event label according to the event label in each time unit in the statistical time period and the emotional state of the user.
Further optionally, the step of obtaining the event label in each of the time units includes:
acquiring event information of the hot event in each time unit;
and extracting event characteristic words from the event information of the hot event in each time unit to serve as the event labels in the corresponding time units.
Further optionally, the step of determining a target event label according to the event label in each time unit in the statistical period and the emotional state of the user includes:
determining a target emotional state from all the emotional states of the user acquired in the statistical time period;
selecting the event label corresponding to the target emotion state from all the event labels acquired in the statistical time period as a candidate event label;
and performing cluster analysis on the candidate event labels to obtain the candidate event label with the highest association degree with the target emotional state as the target event label.
Optionally, the emotional state prompt includes an emotional state change indication within a statistical period of a preset statistical duration, counted from the current time unit in which the user performs the check-in operation;
the step of obtaining the corresponding check-in result comprises:
acquiring the emotional state of the user in each time unit in the statistical time period, and generating a corresponding emotional state change indication;
the step of displaying the check-in result comprises:
and drawing a corresponding emotional state change curve according to the emotional state change indication, and displaying the emotional state change curve in the application check-in interface.
Optionally, the step of displaying the check-in result further includes:
when the check-in result is displayed, a recommendation result corresponding to the check-in result is displayed at the same time;
the recommendation result at least comprises one of user information of other users corresponding to the check-in result and item information corresponding to the check-in result.
According to a second aspect of the present invention, there is provided an application check-in apparatus, comprising:
a state acquisition unit, configured to receive a check-in operation performed by a user on an application check-in interface and acquire a corresponding check-in state;
wherein the check-in state comprises at least an emotional state of the user when performing the check-in operation;
the state analysis unit is used for analyzing the check-in state and acquiring a corresponding check-in result;
wherein the check-in result at least comprises an emotional state prompt corresponding to the emotional state;
and the result display unit is used for displaying the check-in result through the application check-in interface.
According to a third aspect of the present invention, there is provided an electronic apparatus, comprising:
the display device is used for displaying a human-computer interaction interface;
a memory for storing executable instructions;
and a processor, configured to control, according to the executable instructions, the electronic device to execute the application check-in method described above.
According to one embodiment of the disclosure, when a check-in operation performed by a user on the application check-in interface is received, a check-in state including the user's emotional state at check-in is acquired; a check-in result including an emotional state prompt is then obtained by analyzing that check-in state and is displayed through the application check-in interface. A personalized check-in result adapted to the user is thus displayed in response to the user's check-in operation, providing a distinctive check-in response and effectively increasing the application's appeal to users.
Other features of the present invention and advantages thereof will become apparent from the following detailed description of exemplary embodiments thereof, which proceeds with reference to the accompanying drawings.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments of the invention and together with the description, serve to explain the principles of the invention.
Fig. 1 is a block diagram showing an example of a hardware configuration of an electronic apparatus that can be used to implement an embodiment of the present invention.
FIG. 2 shows a flow diagram of an application check-in method of an embodiment of the invention.
FIG. 3 shows a flowchart of the steps of obtaining a check-in result of an embodiment of the present invention.
FIG. 4 shows a flowchart of the steps of obtaining a check-in target event tag, according to an embodiment of the invention.
FIG. 5 shows a flowchart of the steps of obtaining an event tag per time unit of an embodiment of the present invention.
FIG. 6 shows a flowchart of the steps for determining a target event tag, according to an embodiment of the present invention.
FIG. 7 shows a block diagram of an application check-in apparatus of an embodiment of the invention.
FIG. 8 shows a block diagram of an electronic device of an embodiment of the invention.
Detailed Description
Various exemplary embodiments of the present invention will now be described in detail with reference to the accompanying drawings. It should be noted that: the relative arrangement of the components and steps, the numerical expressions and numerical values set forth in these embodiments do not limit the scope of the present invention unless specifically stated otherwise.
The following description of at least one exemplary embodiment is merely illustrative in nature and is in no way intended to limit the invention, its application, or uses.
Techniques, methods, and apparatus known to those of ordinary skill in the relevant art may not be discussed in detail but are intended to be part of the specification where appropriate.
In all examples shown and discussed herein, any particular value should be construed as merely illustrative, and not limiting. Thus, other examples of the exemplary embodiments may have different values.
It should be noted that: like reference numbers and letters refer to like items in the following figures, and thus, once an item is defined in one figure, further discussion thereof is not required in subsequent figures.
< hardware configuration >
Fig. 1 is a block diagram showing a hardware configuration of an electronic apparatus 1000 that can implement an embodiment of the present invention.
The electronic device 1000 may be a laptop, a desktop computer, a mobile phone, a tablet computer, or the like. As shown in fig. 1, the electronic device 1000 may include a processor 1100, a memory 1200, an interface device 1300, a communication device 1400, a display device 1500, an input device 1600, a speaker 1700, a microphone 1800, and the like. The processor 1100 may be a central processing unit (CPU), a microcontroller (MCU), or the like. The memory 1200 includes, for example, a ROM (read-only memory), a RAM (random access memory), and nonvolatile memory such as a hard disk. The interface device 1300 includes, for example, a USB interface, a headphone interface, and the like. The communication device 1400 is capable of wired or wireless communication and may specifically support Wi-Fi, Bluetooth, and 2G/3G/4G/5G communication, among others. The display device 1500 is, for example, a liquid crystal display panel or a touch panel. The input device 1600 may include, for example, a touch screen, a keyboard, or somatosensory input. A user can input and output voice information through the microphone 1800 and the speaker 1700.
The electronic device shown in fig. 1 is merely illustrative and is in no way meant to limit the invention, its application, or uses. In an embodiment of the present invention, the memory 1200 of the electronic device 1000 stores instructions for controlling the processor 1100 to execute any application check-in method provided by the embodiments of the present invention. It will be appreciated by those skilled in the art that, although fig. 1 shows a plurality of components for the electronic device 1000, the present invention may involve only some of them; for example, the electronic device 1000 may involve only the processor 1100 and the memory 1200. The skilled person can design the instructions according to the disclosed solution. How instructions control the operation of the processor is well known in the art and is not described in detail here.
< example >
The general concept of the invention is to provide a new application check-in scheme: when a check-in operation performed by a user on the application check-in interface is received, a check-in state including the user's emotional state at check-in is acquired; a corresponding check-in result including an emotional state prompt is obtained by analyzing that check-in state and is displayed through the application check-in interface. A personalized check-in result adapted to the user is thus displayed in response to the check-in operation, providing a distinctive check-in response and effectively increasing the application's appeal to users.
< method >
In this embodiment, an application check-in method is provided. The method can be applied to any application providing a check-in function. Here, an application refers to any software product or computer program that can be installed and loaded to run on an electronic device such as a mobile phone, tablet computer, workstation, or game console and that provides a corresponding application service, for example a mobile phone APP installed on a mobile phone.
The application check-in method, as shown in fig. 2, includes: steps S2100-S2300.
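Before the step-by-step walkthrough, the three-step flow can be sketched as a minimal handler. This sketch, like the others in this section, is illustrative only: every function, class, and field name is an assumption of this write-up, not something the patent defines.

```python
from dataclasses import dataclass

@dataclass
class CheckInState:
    emotion: str  # e.g. "happy", "calm", "angry"

@dataclass
class CheckInResult:
    emotion_prompt: str  # the personalized emotional state prompt

def acquire_check_in_state(operation: dict) -> CheckInState:
    # S2100: read the emotion tag the user selected on the check-in interface
    return CheckInState(emotion=operation.get("selected_tag", "calm"))

def analyze_check_in_state(state: CheckInState) -> CheckInResult:
    # S2200: placeholder analysis; the full method queries an emotion
    # sentence library, as sketched later in this section
    return CheckInResult(emotion_prompt=f"You checked in feeling {state.emotion}.")

def display_result(result: CheckInResult) -> None:
    # S2300: stands in for rendering on the application check-in interface
    print(result.emotion_prompt)

display_result(analyze_check_in_state(acquire_check_in_state({"selected_tag": "happy"})))
```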
In step S2100, a check-in operation performed by the user on the application check-in interface is received, and a corresponding check-in state is obtained.
The application check-in interface is a human-computer interaction interface that is displayed to the user and receives the user's check-in operation when the check-in function provided by the application is used. In this embodiment, the corresponding check-in state can be obtained from the check-in operation the user performs on the application check-in interface.
The check-in state of the user is the user's own state at the time the check-in operation is performed. In this embodiment, the check-in state includes at least the emotional state of the user when the check-in operation is performed. The emotional state may be an emotion type or an emotion tag of the user. For example, emotional states may include emotion types such as happy, calm, low, angry, and distressed. The check-in state may also include other states of the user, such as the scene in which the user checked in, the weather, and so on.
In one example, a check-in status tag is set in the application check-in interface.
The check-in status label may be a label showing a corresponding check-in status for the user to select or edit, for example a set of check-in labels showing different emotion types (happy, calm, low, angry, distressed).
The check-in status label can receive a selection operation from the user; correspondingly, the information displayed in the selected label can be used as the check-in state. The label can also receive editing operations, including deletion, addition, and modification: the user can add a label matching his or her actual state, delete a label that does not match his or her preferences, and edit the information displayed by a label into information matching his or her actual state. The check-in label can be rendered in various shapes meeting the user's personalized requirements, such as a bubble shape, a heart shape, or other user-defined shapes.
Correspondingly, the check-in operation comprises a selection operation or an editing operation performed on the check-in status label. The selection operation may be a click operation, a tick operation, a slide-to-select operation, and the like; the editing operation includes an addition operation, a deletion operation, a modification operation, and the like. The step of obtaining the corresponding check-in status comprises:
and acquiring information corresponding to the check-in state label as a check-in state according to the check-in operation of the user on the check-in state label.
The information corresponding to the check-in status label may be label information displayed by the check-in status label, or information associated with the check-in status label in advance.
In another example, a shooting control that triggers invocation of a camera is set in the application check-in interface. The shooting control can be a button, an icon, or the like arranged in the application check-in interface for receiving the user's operation and calling the camera. Correspondingly, the check-in operation includes an operation performed on the shooting control to trigger the camera, for example clicking, ticking, or sliding on the shooting control. The step of obtaining the corresponding check-in status comprises:
and calling the camera to acquire the check-in state according to the check-in operation of the shooting control implemented by the user.
In this example, after the camera is called, the user's facial expression or the scene the user is in (including the place, the weather, the surrounding scenery, and the like) can be photographed, and the check-in state at the time of check-in can be determined by comparing the captured image with sample images in a pre-constructed state image library. For example, the camera is called to capture a face image of the user; the face image can then be compared with face sample images corresponding to various emotional states in the state image library, the sample image most similar to the user's face image can be determined by image similarity methods such as histogram comparison and structural similarity, and the user's emotional state at the time of the check-in operation is obtained accordingly.
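As a concrete illustration of this comparison, the minimal sketch below matches a captured face image against per-emotion sample images using histogram comparison, one of the similarity measures mentioned above. OpenCV, the file paths, and all names here are assumed choices, not specified by the patent.

```python
import cv2

def grayscale_hist(path: str):
    # build a normalized grayscale histogram as a crude image signature
    img = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    hist = cv2.calcHist([img], [0], None, [64], [0, 256])
    return cv2.normalize(hist, hist).flatten()

def infer_emotion(captured_path: str, samples: dict) -> str:
    """samples maps each emotional state to the path of its sample face image."""
    captured = grayscale_hist(captured_path)
    # HISTCMP_CORREL: a higher score means more similar histograms
    scores = {
        emotion: cv2.compareHist(grayscale_hist(p), captured, cv2.HISTCMP_CORREL)
        for emotion, p in samples.items()
    }
    return max(scores, key=scores.get)

# e.g. infer_emotion("checkin.jpg", {"happy": "happy.jpg", "angry": "angry.jpg"})
```

Structural similarity (for example `skimage.metrics.structural_similarity`) could be substituted for the histogram comparison with the same overall flow.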
After the check-in state of the user is obtained, the subsequent steps analyze it to acquire a corresponding check-in result including an emotional state prompt and display that result through the application check-in interface. A check-in result adapted to the user is thus displayed in response to the check-in operation, providing a distinctive check-in response and effectively increasing the application's appeal to the user.
After step S2100, the flow proceeds to:
step S2200, the check-in state is analyzed, and the corresponding check-in result is obtained.
The check-in result is an analysis result obtained after analyzing the check-in state of the user. In this embodiment, the check-in result at least includes an emotional state prompt corresponding to the emotional state of the user. The emotional state prompt is personalized prompt information generated after analyzing the emotional state of the user.
The check-in result may further include results obtained by analyzing other state information included in the check-in state. For example, when the check-in state includes the scene and weather at check-in, the check-in result may further include a corresponding scene analysis result, weather analysis result, and the like, which are not listed here one by one.
In one example, the step of analyzing the check-in status to obtain the check-in result may be as shown in fig. 3, and includes: steps S2210-S2220.
In step S2210, a target event tag corresponding to the emotional state is acquired.
The target event label is the event label most highly correlated with a given emotional state. An event label is a label that characterizes an event. The target event represented by the target event label is the event that has had the greatest influence on the emotional state corresponding to that label.
The step of acquiring the target event tag corresponding to the emotional state may include, as shown in fig. 4: steps S2211-S2212.
Step S2211, acquiring the event label and the emotional state of the user in each time unit within a statistical period of a preset statistical duration, counted from the current time unit in which the user performs the check-in operation.
A time unit is the maximum length of time, set according to the specific application scenario or requirements, within which the user may perform one check-in operation. For example, the time unit may be set to one day.
The preset statistical duration is the length of the statistical period, also set according to the specific application scenario or requirements. For example, the statistical duration may be set to one month.
Assuming the time unit is one day and the statistical duration is one month, the statistical period is the month preceding the day on which the user performs the check-in operation.
According to the historical check-in record of the user, the emotional state of the user in each time unit in the statistical time period can be obtained.
In this embodiment, the event labels of each time unit may be obtained and stored as an event label history, and the event label of each time unit within the statistical period may then be read from that history.
The step of acquiring the event tag in each time unit, as shown in fig. 5, includes: steps S22111-S22112.
In step S22111, event information of the hot event in each time unit is obtained.
In this example, within each time unit, hot events are captured from major portal websites, social platforms, and information publishing platforms by means such as web scraping, either at a preset period (for example, every 2 hours within 1 day) or at fixed times (for example, 8:00, 12:00, and 16:00 each day), and event information of the hot events is obtained. The event information may be an event title, event keywords, and the like.
Step S22112, extracting event feature words from the event information of the hot spot events in each time unit, as event labels in the corresponding time units.
By applying processing such as word segmentation and relevance calculation to the event information of a hot event, the event feature words that best represent the event can be extracted and used as the corresponding event labels.
For example, if the hot search event captured from a search site on a certain day (the time unit being 1 day) is "What gift should be given for Mother's Day", the extracted event feature word is "Mother's Day", and the event label "Mother's Day" is obtained for that day.
It should be understood that multiple event labels may be acquired per time unit.
In practical applications, the event labels of any time unit can be modified, added, or deleted through external configuration instructions issued by the relevant personnel, making the event labels of each time unit more accurate.
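The extraction of event feature words can be illustrated with a short sketch: given the hot-event titles already scraped for one time unit, segment them and rank candidate words. Using jieba's TF-IDF keyword extractor is an assumed tooling choice; the patent only names word segmentation and relevance calculation.

```python
import jieba.analyse

def event_labels_for_unit(hot_event_titles: list, top_k: int = 3) -> list:
    """Extract the most representative feature words for one time unit."""
    text = "\n".join(hot_event_titles)
    # extract_tags segments the text and ranks words by TF-IDF weight
    return jieba.analyse.extract_tags(text, topK=top_k)

# e.g. event_labels_for_unit(["母亲节应该送什么礼物"]) might yield ["母亲节"]
```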
After step S2211, the process proceeds to:
and step S2212, determining a target event label according to the event label in each time unit in the statistical time interval and the emotional state of the user.
Step S2212 may be as shown in fig. 6, including: steps S22121-S22123.
Step S22121, determines the target emotional state from the emotional states of all the users acquired in the statistical period.
The target emotional state may be the emotional state that occurs most frequently in the statistical period.
For example, if the emotional state appearing in the greatest number of time units within the statistical period is "angry", the target emotional state is "angry".
Alternatively, the user's emotional states within the statistical period may be classified according to preset emotion categories. Assume the preset emotion categories are "good mood", "bad mood", and "general mood". Correspondingly, emotional states such as "happy" and "joyful" fall into the "good mood" category; emotional states such as "angry", "jealous", "annoyed", "sad", and "distressed" fall into the "bad mood" category; and emotional states such as "calm", "stable", and "nostalgic" fall into the "general mood" category. The user's emotional states within the statistical period can thus be sorted into three categories, and the category covering the largest number of time units is taken as the target emotional state.
For example, if the statistical period is one month, the time unit is 1 day, and within the statistical period "good mood" appears on 5 days, "general mood" on 10 days, and "bad mood" on 15 days, then "bad mood" is the target emotional state.
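A minimal sketch of this counting scheme; the mapping of emotional states to categories is an illustrative assumption.

```python
from collections import Counter

# illustrative mapping of emotional states to preset emotion categories
EMOTION_CATEGORY = {
    "happy": "good mood", "joyful": "good mood",
    "angry": "bad mood", "jealous": "bad mood", "distressed": "bad mood",
    "calm": "general mood", "stable": "general mood", "nostalgic": "general mood",
}

def target_emotional_state(daily_emotions: list) -> str:
    """daily_emotions holds the user's emotional state for each time unit, in order."""
    categories = [EMOTION_CATEGORY.get(e, "general mood") for e in daily_emotions]
    return Counter(categories).most_common(1)[0][0]

# 5 good days, 10 general days, 15 bad days -> "bad mood"
print(target_emotional_state(["happy"] * 5 + ["calm"] * 10 + ["angry"] * 15))
```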
Step S22122, selecting an event label corresponding to the target emotional state from all event labels acquired in the statistical time period as a candidate event label.
An event label corresponding to the target emotional state is an event label from a time unit in which the target emotional state occurred within the statistical period. Using such event labels as candidate event labels, and selecting the target event label from among the candidates, makes the target event label reflect more truly the events that influenced the user's emotional state.
For example, suppose the statistical period is one month, the time unit is 1 day, and the target emotional state is "bad mood". If both date A and date B are "bad mood" days, the event labels of date A are "world cup" and "England lost to Belgium", and the event label of date B is "France won the championship", then the candidate event labels are "world cup", "England lost to Belgium", and "France won the championship".
Step S22123, performs cluster analysis on the candidate event labels, and obtains the candidate event label with the highest association degree with the target emotional state as the target event label.
Cluster analysis is performed on the candidate event labels according to the degree of association between them, and representative candidate event labels are selected. For example, the candidate event labels "world cup", "England lost to Belgium", and "France won the championship" can be clustered into "world cup".
After the cluster analysis, the association degree with the target emotional state is obtained for each clustered candidate event label. The association degree may be the proportion of the label's occurrences within the statistical period to the total number of event label occurrences over the whole period. It should be understood that a clustered candidate event label represents a whole class of candidate event labels, so the occurrences of every label in that class are counted towards its occurrences within the statistical period.
For example, the target emotional state is "bad emotion", the candidate event labels obtained after clustering the candidate event labels are "world cup", "flood", and "shopping festival", wherein the number of the "world cup" is the largest, and the corresponding association degree with the target emotional state is the largest, so the target event label is "world cup".
Other methods of acquiring the target event label can be obtained by suitably modifying or adjusting the method described above. For example, the event labels occurring in the current time unit, together with those from other time units within the statistical period in which the user's emotional state is the same as in the current time unit, may be used as candidate event labels, and the target event label may be obtained from these candidates in a manner similar to the selection described above.
Step S2210 has been described above with reference to the drawings and examples, and after the target event tag is obtained, the process proceeds to:
step S2220, according to the target event label and the target emotion state corresponding to the target event label, matched emotion sentences are inquired and obtained in a pre-constructed emotion sentence library and serve as emotion state prompts.
The target event label is the event label most highly correlated with one of the emotional states.
Based on the implementation of step S2210, the target emotional state corresponding to the target event label may be the emotional state occurring most frequently within the statistical period, or the user's emotional state when performing the check-in operation in the current time unit.
The emotion sentence library is a database containing a plurality of emotion sentences. In this embodiment, feature sentences related to the preset emotional state categories and to event label content can be extracted from available web text by web scraping, and the most representative sentences can then be selected as emotion sentences according to measures such as cosine similarity or Manhattan distance, so that a set of emotion sentences is obtained and the corresponding emotion sentence library is constructed. Correspondingly, each emotion sentence is associated with an event label and an emotional state.
In practical applications, emotion sentences can also be collected manually by the relevant personnel and written into the emotion sentence library, and the sentences in the library can be manually modified or deleted, making the content of the library more accurate and richer.
According to the target event label and its corresponding target emotional state, the emotion sentences related to both can be found by querying the emotion sentence library, and the sentence with the highest association degree with the target event label and target emotional state is selected as the matched emotion sentence and serves as the emotional state prompt.
For example, the emotion sentence library includes an emotion sentence associated with the event label "world cup" and the emotional state "bad mood", such as "In a bad mood? You must be a fan of the French team!", along with other sentences associated with the same event label and emotional state. When the target event label is "world cup" and the target emotional state is "bad mood", a matched emotion sentence of this kind is selected and presented as the emotional state prompt.
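A minimal sketch of the library lookup, indexing sentences by the (event label, emotional state) pair; the stored sentences and the fallback text are placeholders, not the patent's examples.

```python
# sentences indexed by (event label, emotional state); contents are placeholders
SENTENCE_LIBRARY = {
    ("world cup", "bad mood"): [
        "In a bad mood? You must be a fan of the French team!",
        "Cheer up, there is always the next match!",
    ],
}

def emotional_state_prompt(event_label: str, emotional_state: str) -> str:
    sentences = SENTENCE_LIBRARY.get((event_label, emotional_state), [])
    # a full system would rank matches by association degree; the sketch
    # simply takes the first one
    return sentences[0] if sentences else "Checked in. Have a nice day!"

print(emotional_state_prompt("world cup", "bad mood"))
```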
In this embodiment, the emotional state prompt may further include an emotional state change indication within a statistical period of a preset statistical duration, counted from the current time unit in which the user performs the check-in operation.
As above, a time unit is the maximum length of time, set according to the specific application scenario or requirements, within which the user may perform one check-in operation; for example, one day.
The preset statistical duration is the length of the statistical period, also set according to the specific application scenario or requirements; for example, one month.
Assuming the time unit is one day and the statistical duration is one month, the statistical period is the month preceding the day on which the user performs the check-in operation.
The emotional state change indication indicates a change in the emotional state of the user over a statistical period.
Correspondingly, the step S2200 of acquiring the check-in result may include:
and acquiring the emotional state of the user in each time unit in the statistical time period, and generating a corresponding emotional state change indication.
In this embodiment, a numeric value may be preset for each emotion category; the value corresponding to the user's emotional state in each time unit within the statistical period is obtained, and the values are arranged in time order to form an emotion change array for the statistical period, which serves as the corresponding emotional state change indication.
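A sketch of the array generation; the numeric value assigned to each category is an illustrative assumption.

```python
# illustrative numeric value preset for each emotion category
EMOTION_VALUE = {"good mood": 1, "general mood": 0, "bad mood": -1}

def emotion_change_array(daily_categories: list) -> list:
    """daily_categories holds one emotion category per time unit, in time order."""
    return [EMOTION_VALUE[c] for c in daily_categories]

print(emotion_change_array(["good mood", "bad mood", "general mood"]))  # [1, -1, 0]
```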
After the user's check-in state is analyzed to obtain a personalized check-in result, the subsequent steps display that result through the application check-in interface. A personalized check-in result adapted to the user is displayed in response to the check-in operation, providing a distinctive check-in response and effectively increasing the application's appeal to the user.
After step S2200, the process proceeds to:
and step S2300, displaying the check-in result through the application check-in interface.
In this embodiment, the step of displaying the check-in result may be implemented in various ways.
For example, when the emotional state prompt included in the check-in result is a corresponding emotion sentence, the sentence may be presented directly through the application check-in interface.
Alternatively, when the emotional state prompt included in the check-in result includes an emotional state change indication, the step of presenting the check-in result may include: drawing a corresponding emotional state change curve according to the emotional state change indication, and displaying the curve in the application check-in interface. For example, when the emotion change indication is an emotion change array holding, in time order, the values corresponding to the emotional state of each time unit within the statistical period, a corresponding emotional state change curve may be drawn from the values of the array and displayed.
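A sketch of drawing the curve from the emotion change array; matplotlib is an assumed choice here, standing in for whatever rendering the check-in interface actually uses.

```python
import matplotlib.pyplot as plt

def draw_emotion_curve(values: list, out_path: str = "emotion_curve.png") -> None:
    days = range(1, len(values) + 1)
    plt.plot(days, values, marker="o")
    plt.xlabel("time unit (day)")
    plt.ylabel("emotion value")
    plt.title("Emotional state change over the statistical period")
    plt.savefig(out_path)  # the app would render the curve in the check-in interface
    plt.close()

draw_emotion_curve([1, 0, -1, -1, 0, 1])
```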
In this embodiment, the step of displaying the check-in result may further include:
and when the check-in result is displayed, displaying a recommendation result corresponding to the check-in result.
The recommendation result at least comprises one of user information of other users corresponding to the check-in result and item information corresponding to the check-in result.
The other users corresponding to the user's check-in result may be users whose check-in result similarity exceeds a preset similarity threshold, for example users whose emotional state prompt similarity exceeds the threshold. More specifically, the association degree between the emotion sentences included in the emotional state prompts, or the curve similarity between the emotion change curves included in the prompts, may serve as the emotional state prompt similarity. The association degree of emotion sentences or the similarity of emotion change curves may be calculated by means such as cosine similarity or structural similarity, which are not listed here one by one.
The user information includes at least the user's social platform account, and may also include the user's avatar, profile, and the like. By displaying the check-in result together with the user information of other users with similar check-in results, the user can establish social contact with those users.
It should be understood that the other users corresponding to the user's check-in result may also be users whose check-in result matching degree exceeds a preset matching threshold. The matching degree may be predefined in terms such as complementary check-in results or complementary gender, which are not listed here one by one.
The item information corresponding to the check-in result is information about items recommended to the user on the basis of the check-in result. For example, the corresponding item information may be recommended according to the target event label and target emotional state in the emotional state prompt of the check-in result: when the target event label is "rainy day" and the target emotional state is "bad mood", the recommended item information may be jokes, links to funny videos, or island holiday products. Item information can be written into a corresponding item information library by web scraping or manual configuration, and each piece of item information can be associated in advance with a corresponding emotional state, so that item information can be selected for recommendation according to the actual emotional state included in the check-in result.
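Both recommendation paths can be sketched together: matching other users by cosine similarity of their emotion change arrays (one of the measures mentioned above) and looking up items pre-associated with an emotional state. All data, names, and the threshold are invented for illustration.

```python
import math

def cosine_similarity(a: list, b: list) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

def similar_users(my_curve: list, other_curves: dict, threshold: float = 0.8) -> list:
    """other_curves maps a user id to that user's emotion change array."""
    return [uid for uid, curve in other_curves.items()
            if cosine_similarity(my_curve, curve) > threshold]

# items pre-associated with an emotional state, standing in for the item library
ITEM_LIBRARY = {"bad mood": ["joke collection", "funny video link", "island trip"]}

mine = [1.0, -1.0, -1.0, 0.0]
print(similar_users(mine, {"user_b": [1.0, -1.0, 0.0, 0.0]}))  # ['user_b']
print(ITEM_LIBRARY.get("bad mood", []))
```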
Through step S2300, a personalized check-in result adapted to the user is displayed in response to the user's check-in operation, providing a distinctive check-in response and effectively increasing the application's appeal to users.
< application check-in apparatus >
In this embodiment, an application check-in apparatus 3000 is further provided. As shown in fig. 7, it includes a state acquisition unit 3100, a state analysis unit 3200, and a result display unit 3300, and is used to implement any application check-in method of this embodiment; details already described are not repeated here.
The application check-in apparatus 3000 includes:
the state acquisition unit 3100 is configured to receive a check-in operation performed by a user on the application check-in interface, and acquire a corresponding check-in state;
wherein the check-in state comprises at least an emotional state of the user when performing the check-in operation;
a state analysis unit 3200, configured to analyze the check-in state and obtain a corresponding check-in result;
wherein the check-in result at least comprises an emotional state prompt corresponding to the emotional state;
and the result display unit 3300 is used for displaying the check-in result through the application check-in interface.
Optionally, a check-in state tag is set in the application check-in interface;
the check-in operation comprises a selection operation or an editing operation implemented on the check-in status tag;
the state acquisition unit 3100 is configured to:
acquiring information corresponding to the check-in state label as a check-in state according to check-in operation of a user on the check-in state label;
and/or,
a shooting control for triggering and calling a camera is arranged in the application check-in interface;
the check-in operation comprises an operation performed on the shooting control to trigger invocation of a camera;
the state acquisition unit 3100 is configured to:
and calling a camera to acquire the check-in state according to the check-in operation performed by the user on the shooting control.
Optionally, the status analysis unit 3200 is configured to:
acquiring a target event label corresponding to the emotional state;
according to the target event label and a target emotional state corresponding to the target event label, querying a pre-constructed emotion sentence library to acquire a matched emotion sentence as the emotional state prompt;
wherein the emotion sentence library comprises a plurality of emotion sentences, each of which is associated with an event label and an emotional state.
Optionally, in acquiring the target event label corresponding to the emotional state, the state analysis unit 3200 is configured to:
acquire the event label in each time unit and the emotional state of the user within a statistical period of a preset statistical duration, counted from the current time unit in which the user performs the check-in operation;
and determining a target event label according to the event label in each time unit in the statistical time period and the emotional state of the user.
Further optionally, in acquiring the event label in each time unit, the state analysis unit 3200 is configured to:
acquiring event information of the hot event in each time unit;
and extracting event characteristic words from the event information of the hot event in each time unit to serve as the event labels in the corresponding time units.
Further optionally, in determining a target event label according to the event label in each time unit within the statistical period and the emotional state of the user, the state analysis unit 3200 is configured to:
determining a target emotional state from all the emotional states of the user acquired in the statistical time period;
selecting the event label corresponding to the target emotion state from all the event labels acquired in the statistical time period as a candidate event label;
and performing cluster analysis on the candidate event labels to obtain the candidate event label with the highest association degree with the target emotional state as the target event label.
Optionally, the emotional state prompt includes an emotional state change indication within a statistical period of a preset statistical duration, counted from the current time unit in which the user performs the check-in operation;
the status analysis unit 3200 is further adapted to:
acquiring the emotional state of the user in each time unit in the statistical time period, and generating a corresponding emotional state change indication;
the result presentation unit 3300 is also used to:
and drawing a corresponding emotional state change curve according to the emotional state change indication, and displaying the emotional state change curve in the application check-in interface.
Optionally, the result presentation unit 3300 is further configured to:
when the check-in result is displayed, a recommendation result corresponding to the check-in result is displayed at the same time;
the recommendation result at least comprises one of user information of other users corresponding to the check-in result and item information corresponding to the check-in result.
It will be appreciated by those skilled in the art that the application check-in apparatus 3000 may be implemented in various ways. For example, it may be implemented by configuring a processor with instructions: the instructions may be stored in ROM and, when the device starts, read from ROM into a programmable device to implement the apparatus. The apparatus may also be solidified into a dedicated device (for example, an ASIC). It may be divided into mutually independent units, or these units may be combined. The application check-in apparatus 3000 may be implemented by one of the above implementations, or by a combination of two or more of them.
In this embodiment, the application check-in apparatus 3000 may be a functional module provided in any application to implement the application check-in method of this embodiment, or the apparatus 3000 may itself be any application providing a check-in function.
< electronic apparatus >
In this embodiment, an electronic apparatus 4000 is further provided, as shown in fig. 8, including:
a display device 4100 for displaying a human-computer interaction interface;
a memory 4200 for storing executable instructions;
the processor 4300 is configured to control, according to the executable instructions, the electronic device to execute any application check-in method provided in this embodiment.
In this embodiment, the electronic device 4000 may be any electronic device supporting the application check-in method according to this embodiment, such as a mobile phone, a tablet computer, a palmtop computer, a desktop computer, a notebook computer, a workstation, and a game console. In one example, electronic device 4000 may be a cell phone installed with an application that provides an application check-in function.
The electronic device 4000 may further include other components, for example a communication device, like the electronic device 1000 shown in fig. 1.
The embodiments of the present invention have been described above with reference to the accompanying drawings. According to these embodiments, an application check-in method, an apparatus, and an electronic device are provided, in which, when a check-in operation performed by a user on an application check-in interface is received, a check-in state including the user's emotional state at check-in is acquired, and a corresponding check-in result including an emotional state prompt is obtained by analyzing that check-in state and is displayed through the application check-in interface. A personalized check-in result adapted to the user is thus displayed in response to the user's check-in operation, providing a distinctive check-in response and effectively increasing the application's appeal to users.
The present invention may be a system, method and/or computer program product. The computer program product may include a computer-readable storage medium having computer-readable program instructions embodied therewith for causing a processor to implement various aspects of the present invention.
The computer readable storage medium may be a tangible device that can hold and store the instructions for use by the instruction execution device. The computer readable storage medium may be, for example, but not limited to, an electronic memory device, a magnetic memory device, an optical memory device, an electromagnetic memory device, a semiconductor memory device, or any suitable combination of the foregoing. More specific examples (a non-exhaustive list) of the computer readable storage medium would include the following: a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), a Static Random Access Memory (SRAM), a portable compact disc read-only memory (CD-ROM), a Digital Versatile Disc (DVD), a memory stick, a floppy disk, a mechanical coding device, such as punch cards or in-groove projection structures having instructions stored thereon, and any suitable combination of the foregoing. Computer-readable storage media as used herein is not to be construed as transitory signals per se, such as radio waves or other freely propagating electromagnetic waves, electromagnetic waves propagating through a waveguide or other transmission medium (e.g., optical pulses through a fiber optic cable), or electrical signals transmitted through electrical wires.
The computer-readable program instructions described herein may be downloaded from a computer-readable storage medium to a respective computing/processing device, or to an external computer or external storage device via a network, such as the internet, a local area network, a wide area network, and/or a wireless network. The network may include copper transmission cables, fiber optic transmission, wireless transmission, routers, firewalls, switches, gateway computers and/or edge servers. The network adapter card or network interface in each computing/processing device receives computer-readable program instructions from the network and forwards the computer-readable program instructions for storage in a computer-readable storage medium in the respective computing/processing device.
The computer program instructions for carrying out operations of the present invention may be assembler instructions, Instruction Set Architecture (ISA) instructions, machine-related instructions, microcode, firmware instructions, state setting data, or source or object code written in any combination of one or more programming languages, including an object oriented programming language such as Smalltalk, C++ or the like and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The computer-readable program instructions may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider). In some embodiments, aspects of the present invention are implemented by personalizing an electronic circuit, such as a programmable logic circuit, a field-programmable gate array (FPGA), or a programmable logic array (PLA), with state information of computer-readable program instructions that the electronic circuit can execute.
Aspects of the present invention are described herein with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the invention. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer-readable program instructions.
These computer-readable program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks. These computer-readable program instructions may also be stored in a computer-readable storage medium that can direct a computer, programmable data processing apparatus, and/or other devices to function in a particular manner, such that the computer-readable medium storing the instructions comprises an article of manufacture including instructions which implement the function/act specified in the flowchart and/or block diagram block or blocks.
The computer readable program instructions may also be loaded onto a computer, other programmable data processing apparatus, or other devices to cause a series of operational steps to be performed on the computer, other programmable apparatus or other devices to produce a computer implemented process such that the instructions which execute on the computer, other programmable apparatus or other devices implement the functions/acts specified in the flowchart and/or block diagram block or blocks.
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present invention. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of instructions, which comprises one or more executable instructions for implementing the specified logical function(s). In some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions. It is well known to those skilled in the art that implementation by hardware, by software, and by a combination of software and hardware are equivalent.
Having described embodiments of the present invention, the foregoing description is intended to be exemplary, not exhaustive, and is not limited to the embodiments disclosed. Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the described embodiments. The terminology used herein was chosen to best explain the principles of the embodiments, their practical application, or technical improvements over technologies found in the marketplace, or to enable others of ordinary skill in the art to understand the embodiments disclosed herein. The scope of the invention is defined by the appended claims.

Claims (10)

1. An application check-in method, comprising:
receiving a check-in operation of a user on an application check-in interface, and acquiring a corresponding check-in state;
wherein the check-in state comprises at least an emotional state of the user when performing the check-in operation;
analyzing the check-in state and acquiring a corresponding check-in result;
wherein the check-in result at least comprises an emotional state prompt corresponding to the emotional state;
and displaying the check-in result through the application check-in interface.
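Purely as an illustration of the flow recited in claim 1, and not the patented implementation itself, here is a minimal Python sketch of the receive → acquire → analyze → display pipeline; every identifier (CheckInState, analyze_state, on_check_in) and the hard-coded prompt table are hypothetical stand-ins.

```python
# Hypothetical sketch of the claim-1 flow: receive a check-in
# operation, acquire the check-in state (including an emotional
# state), analyze it into a check-in result, and display it.
from dataclasses import dataclass

@dataclass
class CheckInState:
    emotional_state: str  # e.g. "happy", "sad"

@dataclass
class CheckInResult:
    emotional_state_prompt: str  # prompt matching the emotional state

def analyze_state(state: CheckInState) -> CheckInResult:
    # Stand-in for the analysis step; a fuller sketch of the
    # emotion-sentence lookup follows claim 3 below.
    prompts = {
        "happy": "Glad to see you in a good mood today!",
        "sad": "Hope tomorrow treats you better.",
    }
    return CheckInResult(prompts.get(state.emotional_state,
                                     "Thanks for checking in."))

def on_check_in(emotional_state: str) -> None:
    state = CheckInState(emotional_state)   # receive operation, acquire state
    result = analyze_state(state)           # analyze state -> result
    print(result.emotional_state_prompt)    # display on the check-in interface

on_check_in("happy")
```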
2. The method of claim 1, wherein,
a check-in state label is arranged in the application check-in interface;
the check-in operation comprises a selection operation or an editing operation performed on the check-in state label;
the step of acquiring the corresponding check-in state comprises:
acquiring information corresponding to the check-in state label as the check-in state according to the check-in operation performed by the user on the check-in state label;
and/or,
a shooting control for triggering invocation of a camera is arranged in the application check-in interface;
the check-in operation comprises an operation performed on the shooting control to trigger invocation of the camera;
the step of acquiring the corresponding check-in state comprises:
calling the camera to acquire the check-in state according to the check-in operation performed by the user on the shooting control.
3. The method of claim 1, wherein the step of analyzing the check-in state and acquiring a corresponding check-in result comprises:
acquiring a target event label corresponding to the emotional state;
according to the target event label and a target emotional state corresponding to the target event label, querying a pre-constructed emotion sentence library for a matched emotion sentence to serve as the emotional state prompt;
wherein the emotion sentence library comprises a plurality of emotion sentences, each of which is associated with both an event label and an emotional state.
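As one possible reading of claim 3, the emotion sentence library can be modeled as a lookup keyed by an (event label, emotional state) pair; the entries below are invented examples, not data from the disclosure.

```python
# Hypothetical emotion sentence library: each sentence is keyed by
# an (event_label, emotional_state) pair, per claim 3.
EMOTION_SENTENCES = {
    ("world_cup", "happy"): "Enjoying the matches? Keep the streak going!",
    ("world_cup", "sad"): "Tough result for your team; better luck next game.",
    ("exam_week", "anxious"): "Exams will pass. One day at a time.",
}

def emotional_state_prompt(target_event_label: str,
                           target_emotional_state: str) -> str:
    # Query the library with the target event label and the target
    # emotional state corresponding to it.
    return EMOTION_SENTENCES.get(
        (target_event_label, target_emotional_state),
        "Thanks for checking in today.")

print(emotional_state_prompt("world_cup", "happy"))
```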
4. The method of claim 3, wherein the step of acquiring a target event label corresponding to the emotional state comprises:
acquiring an event label and the emotional state of the user in each time unit within a statistical time period of a preset statistical duration, counted from the current time unit in which the user performs the check-in operation;
and determining a target event label according to the event label in each time unit in the statistical time period and the emotional state of the user.
5. The method of claim 4, wherein the step of acquiring the event label for each of the time units comprises:
acquiring event information of a hot event in each time unit;
and extracting event characteristic words from the event information of the hot event in each time unit to serve as the event labels in the corresponding time units.
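Claim 5 leaves the extraction method open; as a hedged sketch, a simple frequency-based keyword pick can stand in for event feature-word extraction (a production system might use TF-IDF or a trained extractor instead).

```python
# Hypothetical feature-word extraction for claim 5: choose the most
# frequent non-trivial word in a time unit's hot-event text as that
# unit's event label.
from collections import Counter

STOPWORDS = {"the", "a", "an", "of", "in", "to", "and", "is"}

def event_label_for(hot_event_text: str) -> str:
    words = [w.lower() for w in hot_event_text.split()
             if w.lower() not in STOPWORDS and len(w) > 2]
    return Counter(words).most_common(1)[0][0]

print(event_label_for("World Cup final draws record global audience"))
```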
6. The method of claim 4, wherein the step of determining a target event label according to the event labels in each of the time units in the statistical time period and the emotional state of the user comprises:
determining a target emotional state from all the emotional states of the user acquired in the statistical time period;
selecting the event label corresponding to the target emotion state from all the event labels acquired in the statistical time period as a candidate event label;
and performing cluster analysis on the candidate event labels to obtain the candidate event label with the highest association degree with the target emotional state as the target event label.
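Not the claimed cluster analysis itself, but a minimal stand-in for claim 6's selection logic: pick the dominant emotional state over the period, then the candidate event label most associated with it (here approximated by co-occurrence counts).

```python
# Hypothetical sketch of claim 6: determine the target emotional
# state over the statistical period, keep the event labels that
# co-occur with it, and pick the most associated one. Simple
# co-occurrence counting stands in for the cluster analysis.
from collections import Counter

# (event_label, emotional_state) observed in each time unit, e.g. per day.
observations = [
    ("world_cup", "happy"), ("world_cup", "happy"),
    ("exam_week", "anxious"), ("world_cup", "sad"),
]

def target_event_label(obs):
    target_state = Counter(state for _, state in obs).most_common(1)[0][0]
    candidates = [label for label, state in obs if state == target_state]
    return Counter(candidates).most_common(1)[0][0], target_state

print(target_event_label(observations))  # ('world_cup', 'happy')
```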
7. The method of claim 1, wherein,
the emotional state prompt comprises an emotional state change indication within a statistical time period of a preset statistical duration, counted from the current time unit in which the user performs the check-in operation;
the step of acquiring the corresponding check-in result comprises:
acquiring the emotional state of the user in each time unit in the statistical time period, and generating a corresponding emotional state change indication;
the step of displaying the check-in result comprises:
and drawing a corresponding emotional state change curve according to the emotional state change indication, and displaying the emotional state change curve in the application check-in interface.
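One way (among many) to render claim 7's emotional state change curve: map each time unit's state to a numeric score and plot the series. The score table is invented, and matplotlib is assumed to be available.

```python
# Hypothetical emotional-state-change curve for claim 7: score each
# day's emotional state and plot the change over the statistical period.
import matplotlib.pyplot as plt

SCORE = {"sad": -1, "neutral": 0, "happy": 1}
daily_states = ["happy", "happy", "neutral", "sad", "happy"]

scores = [SCORE[s] for s in daily_states]
plt.plot(range(1, len(scores) + 1), scores, marker="o")
plt.xlabel("time unit in statistical period")
plt.ylabel("emotional state score")
plt.title("Emotional state change curve")
plt.show()
```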
8. The method of claim 1, wherein the step of displaying the check-in result further comprises:
displaying, at the same time as the check-in result, a recommendation result corresponding to the check-in result;
the recommendation result at least comprises one of user information of other users corresponding to the check-in result and item information corresponding to the check-in result.
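A minimal sketch of claim 8's recommendation step, assuming recommendations are simply indexed by emotional state; the user and item tables are fabricated placeholders.

```python
# Hypothetical recommendation lookup for claim 8: alongside the
# check-in result, surface other users with a matching state and
# items tagged for that state.
USERS_BY_STATE = {"sad": ["user_42", "user_77"]}
ITEMS_BY_STATE = {"sad": ["comfort playlist", "light comedy clips"]}

def recommendations(emotional_state: str) -> dict:
    return {
        "users": USERS_BY_STATE.get(emotional_state, []),
        "items": ITEMS_BY_STATE.get(emotional_state, []),
    }

print(recommendations("sad"))
```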
9. An application check-in apparatus, comprising:
the system comprises a state acquisition unit, a state acquisition unit and a state acquisition unit, wherein the state acquisition unit is used for receiving the check-in operation of a user on an application check-in interface and acquiring a corresponding check-in state;
wherein the check-in state comprises at least an emotional state of the user when performing the check-in operation;
the state analysis unit is used for analyzing the check-in state and acquiring a corresponding check-in result;
wherein the check-in result at least comprises an emotional state prompt corresponding to the emotional state;
and the result display unit is used for displaying the check-in result through the application check-in interface.
10. An electronic device, comprising:
a display device for displaying a human-computer interaction interface;
a memory for storing executable instructions;
a processor for operating the electronic device to perform the application check-in method of any one of claims 1-8 under the control of the executable instructions.
CN201810881964.XA 2018-07-31 2018-07-31 Application sign-in method and device and electronic equipment Active CN110795178B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810881964.XA CN110795178B (en) 2018-07-31 2018-07-31 Application sign-in method and device and electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810881964.XA CN110795178B (en) 2018-07-31 2018-07-31 Application sign-in method and device and electronic equipment

Publications (2)

Publication Number Publication Date
CN110795178A true CN110795178A (en) 2020-02-14
CN110795178B CN110795178B (en) 2023-08-22

Family

ID=69425706

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810881964.XA Active CN110795178B (en) 2018-07-31 2018-07-31 Application sign-in method and device and electronic equipment

Country Status (1)

Country Link
CN (1) CN110795178B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111564203A (en) * 2020-04-30 2020-08-21 深圳市镜象科技有限公司 Psychotherapy method based on ACT therapy, psychotherapy terminal, and storage medium

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103714071A (en) * 2012-09-29 2014-04-09 株式会社日立制作所 Label emotional tendency quantifying method and label emotional tendency quantifying system
CN102930610A (en) * 2012-10-24 2013-02-13 深圳市万凯达科技有限公司 Check-in information processing method and system
US20150074020A1 (en) * 2013-09-10 2015-03-12 Facebook, Inc. Sentiment polarity for users of a social networking system
CN105095286A (en) * 2014-05-14 2015-11-25 腾讯科技(深圳)有限公司 Page recommendation method and device
US20170311863A1 (en) * 2015-02-13 2017-11-02 Omron Corporation Emotion estimation device and emotion estimation method
CN105138222A (en) * 2015-08-26 2015-12-09 美国掌赢信息科技有限公司 Method for selecting expression icon and electronic equipment
CN105843922A (en) * 2016-03-25 2016-08-10 乐视控股(北京)有限公司 Multimedia classification recommendation method, apparatus and system

Also Published As

Publication number Publication date
CN110795178B (en) 2023-08-22

Similar Documents

Publication Publication Date Title
JP6926339B2 (en) Image clustering methods and devices, electronic devices and storage media
US10834355B2 (en) Barrage message processing
CN109325223B (en) Article recommendation method and device and electronic equipment
CN110134806B (en) Contextual user profile photo selection
CN108227950B (en) Input method and device
EP3852044A1 (en) Method and device for commenting on multimedia resource
US11768597B2 (en) Method and system for editing video on basis of context obtained using artificial intelligence
KR20190132360A (en) Method and device for processing multimedia resources
CN105068976B (en) Ticket information display method and device
US11182447B2 (en) Customized display of emotionally filtered social media content
US11341336B2 (en) Recording medium, conversation control method, and information processing apparatus
US11640420B2 (en) System and method for automatic summarization of content with event based analysis
KR20150016786A (en) Device and sever for providing a subject of conversation and method for providing the same
CN110764627B (en) Input method and device and electronic equipment
CN113849723A (en) Search method and search device
CN111428806B (en) Image tag determining method and device, electronic equipment and storage medium
CN110795178B (en) Application sign-in method and device and electronic equipment
CN111475664B (en) Object display method and device and electronic equipment
CN108255917B (en) Image management method and device and electronic device
CN110929122B (en) Data processing method and device for data processing
CN110362686B (en) Word stock generation method and device, terminal equipment and server
CN112115231A (en) Data processing method and device
CN113221030A (en) Recommendation method, device and medium
CN113535940A (en) Event abstract generation method and device and electronic equipment
CN113360738A (en) Content evaluation method, system, and computer-readable recording medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right
Effective date of registration: 20200603
Address after: 310051 room 508, floor 5, building 4, No. 699, Wangshang Road, Changhe street, Binjiang District, Hangzhou City, Zhejiang Province
Applicant after: Alibaba (China) Co.,Ltd.
Address before: 100083, Beijing, Haidian District, Cheng Fu Road, No. 28, A building, block 12
Applicant before: UC MOBILE Co.,Ltd.
GR01 Patent grant