CN115857748A - Interaction method, intelligent terminal and storage medium

Info

Publication number: CN115857748A
Authority: CN (China)
Prior art keywords: shortcut, contact, target, determining, current
Legal status: Pending
Application number: CN202211528900.4A
Other languages: Chinese (zh)
Inventor: 秦振新 (Qin Zhenxin)
Current and original assignee: Shenzhen Transsion Holdings Co Ltd
Application filed by Shenzhen Transsion Holdings Co Ltd
Priority to CN202211528900.4A
Publication of CN115857748A

Abstract

The application provides an interaction method, an intelligent terminal, and a storage medium. The interaction method can be applied to an intelligent terminal and comprises the following steps: responding to an interactive operation of a target application, and determining at least one common contact corresponding to the target application; determining or generating a shortcut corresponding to the common contact; and displaying the shortcut in a shortcut trigger area. According to the method and the device, after the interactive operation is obtained, at least one frequently-used contact is determined and the shortcut of that contact is displayed in the shortcut trigger area, so that the efficiency of finding a contact is improved and the user experience is improved.

Description

Interaction method, intelligent terminal and storage medium
Field of the Invention
The application relates to the technical field of terminals, in particular to an interaction method, an intelligent terminal and a storage medium.
Background
In some implementations, when a user needs to send information to a contact or make a call to a contact, the user needs to enter the corresponding application program to find that contact.
In the course of conceiving and implementing the present application, the inventors found at least the following problems: if the user needs to communicate with different contacts in different software, the user has to switch from one application program to another, which consumes a large amount of time; and even if a piece of software has a common contact, the user still has to open the software and find the corresponding contact, which also consumes a large amount of time.
The foregoing is intended to provide general background information and is not necessarily prior art.
Disclosure of Invention
In order to solve the technical problems, the application provides an interaction method, an intelligent terminal and a storage medium, so that a user can quickly communicate with a contact person, and the operation is simple and convenient.
The application provides an interaction method, which can be applied to an intelligent terminal and comprises the following steps:
S10: responding to the interactive operation of the target application, and determining at least one common contact corresponding to the target application;
S20: determining or generating shortcuts corresponding to the frequently-used contacts;
S30: and displaying the shortcut in a shortcut triggering area.
Optionally, the shortcut trigger area and/or the target interface are displayed on the current display interface in an overlapping manner.
Optionally, there is no masking layer between the shortcut trigger area and/or the target interface and the current display interface.
Optionally, the method further comprises:
acquiring the type of the running application;
and executing the step S20 when the type of the target application is a preset type.
Optionally, step S10 includes at least one of:
determining the contact persons with the contact frequency greater than or equal to a preset frequency threshold value as the frequently-used contact persons according to the contact frequency of the corresponding contact persons in the target application;
determining the contact persons with the operation frequency greater than or equal to a preset frequency threshold value as the frequently-used contact persons according to the operation frequency of the corresponding contact persons in the target application;
and determining the currently dragged object as the common contact.
Optionally, the method further comprises at least one of:
detecting a first trigger operation of the shortcut, determining the shortcut to be deleted of the first trigger operation, and deleting the shortcut to be deleted;
acquiring trigger frequency corresponding to each shortcut in the shortcut trigger area, and deleting the shortcut when the trigger frequency is smaller than a preset frequency threshold;
obtaining the latest communication time corresponding to each shortcut in the shortcut trigger area, determining the time interval between the latest communication time corresponding to each shortcut and the current time, and deleting the shortcut whose time interval exceeds a preset duration; a sketch of the two automatic cleanup rules is given after this list.
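The two automatic cleanup rules above can be illustrated with a minimal Kotlin sketch; the class name, field names, trigger-count threshold, and idle duration below are assumptions introduced purely for illustration, not values from the application:

```kotlin
import java.time.Duration
import java.time.Instant

// Illustrative sketch only: names, threshold, and idle duration are assumptions.
data class ShortcutEntry(
    val contactName: String,
    val triggerCount: Int,          // how often this shortcut has been triggered in the trigger area
    val lastCommunication: Instant  // latest communication time for this contact
)

// Keep a shortcut only if it is triggered often enough and was used recently enough;
// otherwise it is deleted from the shortcut trigger area.
fun pruneShortcuts(
    shortcuts: List<ShortcutEntry>,
    minTriggerCount: Int = 3,
    maxIdle: Duration = Duration.ofDays(7),
    now: Instant = Instant.now()
): List<ShortcutEntry> =
    shortcuts.filter { entry ->
        entry.triggerCount >= minTriggerCount &&
            Duration.between(entry.lastCommunication, now) <= maxIdle
    }
```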
Optionally, after the step of determining at least one frequent contact corresponding to at least one target application, the method further includes:
determining a current operation scene according to the type of the running application;
determining or generating a target frequent contact according to the current operation scene, so as to determine or generate a shortcut corresponding to the target frequent contact; and displaying the shortcut corresponding to the target frequent contact in the shortcut trigger area.
Optionally, the step of determining or generating the target frequent contact according to the current operating scenario includes:
determining or generating a target application associated with a current operation scene according to the current operation scene; and determining or generating the target frequent contact according to the frequent contact associated with the target application.
Optionally, the determining or generating the target frequent contact according to the frequent contact associated with the target application includes:
acquiring a contact tag of a common contact related to the content to be interacted and/or the target application;
and determining the target common contact according to the common contact corresponding to the contact tag matched with the content to be interacted.
Optionally, the step of determining or generating the target frequent contact according to the current operating scenario includes:
determining a common contact associated with the current operation scene according to the current operation scene;
and determining or generating the target frequent contact according to the frequent contact related to the current operation scene.
The present application also provides an intelligent terminal, comprising: a memory and a processor, wherein the memory stores an interaction program, and the interaction program, when executed by the processor, implements the steps of any one of the interaction methods described above.
The present application also provides a storage medium storing a computer program which, when executed by a processor, implements the steps of the interaction method as described in any one of the above.
As described above, the interaction method of the present application, which can be applied to an intelligent terminal, comprises the steps of: S10: responding to the interactive operation of a target application, and determining at least one common contact corresponding to the target application; S20: determining or generating shortcuts corresponding to the frequently-used contacts; S30: displaying the shortcut in a shortcut trigger area. Through this technical solution, the shortcut of a commonly used contact can be displayed in the shortcut trigger area, which solves the problem of low efficiency when a user searches for a contact and thereby improves the user experience.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present application and, together with the description, serve to explain the principles of the application. In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings needed in the description of the embodiments are briefly described below; it is obvious that those skilled in the art can obtain other drawings from these drawings without inventive effort.
Fig. 1 is a schematic diagram of a hardware structure of an intelligent terminal implementing various embodiments of the present application;
fig. 2 is a diagram illustrating a communication network system architecture according to an embodiment of the present application;
fig. 3 is a flow chart diagram illustrating an interaction method according to the first embodiment;
FIG. 4 is a drawing diagram of the interaction method shown according to the first embodiment;
FIG. 5 is a schematic diagram of a shortcut trigger area of the interaction method shown in accordance with the first embodiment;
FIG. 6 is a schematic diagram of a shortcut trigger area of the interaction method shown in accordance with the first embodiment;
FIG. 7 is a schematic diagram of a shortcut trigger area of the interaction method shown in accordance with the first embodiment;
fig. 8 is a flowchart illustrating an interaction method according to a second embodiment;
fig. 9 is a flowchart illustrating an interaction method according to a third embodiment;
fig. 10 is a detailed flow diagram of the step of the interaction method S32 shown according to the fourth embodiment;
fig. 11 is a flowchart illustrating a step of the interaction method S322 according to the fifth embodiment;
fig. 12 is a flow chart illustrating a step of the interaction method S32 according to the sixth embodiment;
fig. 13 is a schematic flow chart diagram illustrating an interaction method according to a seventh embodiment;
the implementation, functional features and advantages of the objectives of the present application will be further explained with reference to the accompanying drawings. With the above figures, there are shown specific embodiments of the present application, which will be described in more detail below. These drawings and written description are not intended to limit the scope of the inventive concepts in any manner, but rather to illustrate the inventive concepts to those skilled in the art by reference to specific embodiments.
Detailed Description
Reference will now be made in detail to the exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, like numbers in different drawings represent the same or similar elements unless otherwise indicated. The embodiments described in the following exemplary embodiments do not represent all embodiments consistent with the present application. Rather, they are merely examples of apparatus and methods consistent with certain aspects of the application, as detailed in the appended claims.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such a process, method, article, or apparatus. Without further limitation, an element preceded by "comprising a ..." does not exclude the presence of additional identical elements in the process, method, article, or apparatus that includes the element. Furthermore, similarly named components, features, and elements in different embodiments of the application may have the same meaning or different meanings, and the specific meaning should be determined by its interpretation in the specific embodiment or by further combination with the context of that embodiment.
It should be understood that although the terms first, second, third, etc. may be used herein to describe various information, such information should not be limited by these terms. These terms are only used to distinguish one type of information from another. For example, first information may also be referred to as second information, and similarly, second information may also be referred to as first information, without departing from the scope herein. The word "if" as used herein may be interpreted as "when" or "upon" or "in response to a determination," depending on the context. Also, as used herein, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context indicates otherwise. It will be further understood that the terms "comprises" and/or "comprising," when used in this specification, specify the presence of stated features, steps, operations, elements, components, items, species, and/or groups, but do not preclude the presence or addition of one or more other features, steps, operations, elements, components, items, species, and/or groups thereof. The terms "or," "and/or," and "including at least one of the following," as used herein, are to be construed as inclusive, meaning any one or any combination. For example, "includes at least one of A, B, and C" means any of the following: A; B; C; A and B; A and C; B and C; A and B and C. Likewise, "A, B or C" or "A, B and/or C" means any of the following: A; B; C; A and B; A and C; B and C; A and B and C. An exception to this definition occurs only when a combination of elements, functions, steps, or operations is inherently mutually exclusive in some way.
It should be understood that, although the steps in the flowcharts in the embodiments of the present application are shown in the order indicated by the arrows, they are not necessarily performed in that order. Unless otherwise indicated herein, the steps are not strictly limited to this order and may be performed in other orders. Moreover, at least some of the steps in the figures may include multiple sub-steps or multiple stages that are not necessarily performed at the same time but may be performed at different times and in different orders, and they may be performed alternately or in turns with other steps or with at least some of the sub-steps or stages of other steps.
The word "if," as used herein, may be interpreted as "when" or "upon" or "in response to a determination" or "in response to a detection," depending on the context. Similarly, the phrases "if determined" or "if (a stated condition or event) is detected" may be interpreted as "when determined" or "in response to a determination" or "when (a stated condition or event) is detected" or "in response to detection of (a stated condition or event)," depending on the context.
It should be noted that step numbers such as S10 and S20 are used herein for the purpose of more clearly and briefly describing the corresponding contents, and do not constitute a substantial limitation on the sequence, and those skilled in the art may perform S20 first and then S10 in the specific implementation, but these should be within the protection scope of the present application.
It should be understood that the specific embodiments described herein are merely illustrative of the present application and are not intended to limit the present application.
In the following description, suffixes such as "module", "component", or "unit" used to denote elements are used only for the convenience of description of the present application and have no specific meaning in themselves. Thus, "module", "component", and "unit" may be used interchangeably.
The smart terminal may be implemented in various forms. For example, the smart terminal described in the present application may include smart terminals such as a mobile phone, a tablet computer, a notebook computer, a palmtop computer, a Personal Digital Assistant (PDA), a Portable Media Player (PMP), a navigation device, a wearable device, a smart band, a pedometer, and the like, and fixed terminals such as a Digital TV, a desktop computer, and the like.
While the following description will be given by way of example of a smart terminal, those skilled in the art will appreciate that the configuration according to the embodiments of the present application can be applied to a fixed type terminal in addition to elements particularly used for mobile purposes.
Referring to fig. 1, which is a schematic diagram of a hardware structure of an intelligent terminal for implementing various embodiments of the present application, the intelligent terminal 100 may include: an RF (Radio Frequency) unit 101, a WiFi module 102, an audio output unit 103, an a/V (audio/video) input unit 104, a sensor 105, a display unit 106, a user input unit 107, an interface unit 108, a memory 109, a processor 110, and a power supply 111. Those skilled in the art will appreciate that the intelligent terminal architecture shown in fig. 1 does not constitute a limitation of the intelligent terminal, and that the intelligent terminal may include more or fewer components than shown, or some components may be combined, or a different arrangement of components.
The following specifically describes each component of the intelligent terminal with reference to fig. 1:
the radio frequency unit 101 may be configured to receive and transmit signals during information transmission and reception or during a call, and specifically, receive downlink information of a base station and then process the downlink information to the processor 110; in addition, the uplink data is transmitted to the base station. Typically, radio frequency unit 101 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like. In addition, the radio frequency unit 101 can also communicate with a network and other devices through wireless communication. The wireless communication may use any communication standard or protocol, including but not limited to GSM (Global System for Mobile communications), GPRS (General Packet Radio Service), CDMA2000 (Code Division Multiple Access 2000 ), WCDMA (Wideband Code Division Multiple Access), TD-SCDMA (Time Division-Synchronous Code Division Multiple Access), FDD-LTE (Frequency Division duplex-Long Term Evolution), TDD-LTE (Time Division duplex-Long Term Evolution, time Division Long Term Evolution), 5G, and so on.
WiFi is a short-range wireless transmission technology. Through the WiFi module 102, the intelligent terminal can help the user to send and receive e-mails, browse web pages, access streaming media, and the like, providing the user with wireless broadband Internet access. Although fig. 1 shows the WiFi module 102, it is understood that it is not an essential part of the smart terminal and may be omitted entirely as needed without changing the essence of the invention.
The audio output unit 103 may convert audio data received by the radio frequency unit 101 or the WiFi module 102 or stored in the memory 109 into an audio signal and output as sound when the smart terminal 100 is in a call signal reception mode, a call mode, a recording mode, a voice recognition mode, a broadcast reception mode, or the like. Also, the audio output unit 103 may also provide audio output related to a specific function performed by the smart terminal 100 (e.g., a call signal reception sound, a message reception sound, etc.). The audio output unit 103 may include a speaker, a buzzer, and the like.
The A/V input unit 104 is used to receive audio or video signals. The A/V input unit 104 may include a graphics processing unit (GPU) 1041 and a microphone 1042. The graphics processor 1041 processes image data of still pictures or video obtained by an image capturing device (e.g., a camera) in a video capturing mode or an image capturing mode, and the processed image frames may be displayed on the display unit 106. The image frames processed by the graphics processor 1041 may be stored in the memory 109 (or other storage medium) or transmitted via the radio frequency unit 101 or the WiFi module 102. The microphone 1042 may receive sounds (audio data) in a phone call mode, a recording mode, a voice recognition mode, or the like, and can process such sounds into audio data. In the phone call mode, the processed audio (voice) data may be converted into a format that can be transmitted to a mobile communication base station via the radio frequency unit 101 for output. The microphone 1042 may implement various types of noise cancellation (or suppression) algorithms to cancel (or suppress) noise or interference generated in the course of receiving and transmitting audio signals.
The smart terminal 100 also includes at least one sensor 105, such as a light sensor, a motion sensor, and other sensors. Optionally, the light sensor includes an ambient light sensor and a proximity sensor. Optionally, the ambient light sensor may adjust the brightness of the display panel 1061 according to the brightness of the ambient light, and the proximity sensor may turn off the display panel 1061 and/or the backlight when the smart terminal 100 moves to the ear. As one type of motion sensor, an accelerometer sensor can detect the magnitude of acceleration in various directions (typically three axes), can detect the magnitude and direction of gravity when stationary, and can be used for applications that recognize the posture of the mobile phone (such as switching between landscape and portrait screens, related games, and magnetometer posture calibration), vibration-recognition related functions (such as a pedometer and tapping), and the like; as for other sensors that can be configured on the mobile phone, such as a fingerprint sensor, a pressure sensor, an iris sensor, a molecular sensor, a gyroscope, a barometer, a hygrometer, a thermometer, and an infrared sensor, details are not described here.
The display unit 106 is used to display information input by the user or information provided to the user. The display unit 106 may include a display panel 1061, and the display panel 1061 may be configured in the form of a liquid crystal display (LCD), an organic light-emitting diode (OLED), or the like.
The user input unit 107 may be used to receive input numeric or character information, and to generate key signal input related to user settings and function control of the intelligent terminal. Optionally, the user input unit 107 may include a touch panel 1071 and other input devices 1072. The touch panel 1071, also referred to as a touch screen, may collect touch operations by the user on or near it (e.g., operations by the user on or near the touch panel 1071 using a finger, a stylus, or any other suitable object or accessory) and handle them according to a predetermined program. The touch panel 1071 may include two parts: a touch detection device and a touch controller. Optionally, the touch detection device detects the touch orientation of the user, detects the signal caused by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into touch point coordinates, sends the coordinates to the processor 110, and can receive and execute commands sent by the processor 110. In addition, the touch panel 1071 may be implemented in various types, such as resistive, capacitive, infrared, and surface acoustic wave. Besides the touch panel 1071, the user input unit 107 may include other input devices 1072. Optionally, the other input devices 1072 may include, but are not limited to, one or more of a physical keyboard, function keys (e.g., volume control keys, switch keys, etc.), a trackball, a mouse, a joystick, and the like, which are not specifically limited herein.
Alternatively, the touch panel 1071 may cover the display panel 1061, and when the touch panel 1071 detects a touch operation thereon or nearby, the touch panel 1071 transmits the touch operation to the processor 110 to determine the type of the touch event, and then the processor 110 provides a corresponding visual output on the display panel 1061 according to the type of the touch event. Although the touch panel 1071 and the display panel 1061 are shown in fig. 1 as two separate components to implement the input and output functions of the smart terminal, in some embodiments, the touch panel 1071 and the display panel 1061 may be integrated to implement the input and output functions of the smart terminal, and is not limited herein.
The interface unit 108 serves as an interface through which at least one external device can be connected to the smart terminal 100. For example, the external device may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like. The interface unit 108 may be used to receive input (e.g., data information, power, etc.) from an external device and transfer the received input to one or more elements within the smart terminal 100, or may be used to transfer data between the smart terminal 100 and an external device.
The memory 109 may be used to store software programs as well as various data. The memory 109 may mainly include a program storage area and a data storage area. Optionally, the program storage area may store an operating system, an application program required by at least one function (such as a sound playing function, an image playing function, and the like), and the like; the data storage area may store data created according to the use of the mobile phone (such as audio data, a phone book, etc.), and the like. In addition, the memory 109 may include a high-speed random access memory, and may also include a non-volatile memory, for example, at least one magnetic disk storage device, a flash memory device, or another non-volatile solid-state storage device.
The processor 110 is the control center of the intelligent terminal; it connects the various parts of the entire intelligent terminal using various interfaces and lines, and performs the various functions of the intelligent terminal and processes data by running or executing software programs and/or modules stored in the memory 109 and calling data stored in the memory 109, thereby monitoring the intelligent terminal as a whole. The processor 110 may include one or more processing units. Preferably, the processor 110 may integrate an application processor and a modem processor; the application processor optionally handles primarily the operating system, the user interface, applications, and the like, while the modem processor handles primarily wireless communications. It will be appreciated that the modem processor described above may also not be integrated into the processor 110.
The smart terminal 100 may further include a power supply 111 (e.g., a battery) for supplying power to the various components. Preferably, the power supply 111 may be logically connected to the processor 110 via a power management system, so that functions such as managing charging, discharging, and power consumption are implemented through the power management system.
Although not shown in fig. 1, the smart terminal 100 may further include a bluetooth module or the like, which is not described herein.
In order to facilitate understanding of the embodiments of the present application, the communication network system on which the smart terminal of the present application is based is described below.
Referring to fig. 2, fig. 2 is an architecture diagram of a communication network system provided in an embodiment of the present application. The communication network system is an LTE system of universal mobile telecommunications technology, and the LTE system includes a UE (User Equipment) 201, an E-UTRAN (Evolved UMTS Terrestrial Radio Access Network) 202, an EPC (Evolved Packet Core) 203, and an operator's IP services 204, which are communicatively connected in sequence.
Optionally, the UE201 may be the terminal 100 described above, and is not described herein again.
The E-UTRAN202 includes eNodeB2021 and other eNodeBs 2022, among others. Alternatively, the eNodeB2021 may be connected with other enodebs 2022 through a backhaul (e.g., X2 interface), the eNodeB2021 is connected to the EPC203, and the eNodeB2021 may provide the UE201 access to the EPC 203.
The EPC203 may include an MME (Mobility Management Entity) 2031, an HSS (Home Subscriber Server) 2032, other MMEs 2033, an SGW (Serving Gateway) 2034, a PGW (PDN Gateway) 2035, a PCRF (Policy and Charging Rules Function) 2036, and the like. Optionally, the MME2031 is a control node that handles signaling between the UE201 and the EPC203 and provides bearer and connection management. The HSS2032 is used to provide some register functions, such as a home location register (not shown), and holds subscriber-specific information about service characteristics, data rates, and so on. All user data may be sent through the SGW2034; the PGW2035 may provide IP address allocation and other functions for the UE201; and the PCRF2036 is the policy and charging control policy decision point for service data flows and IP bearer resources, which selects and provides available policy and charging control decisions for a policy and charging enforcement function (not shown).
The IP services 204 may include the internet, intranets, IMS (IP Multimedia Subsystem), or other IP services, among others.
Although the LTE system is described as an example, it should be understood by those skilled in the art that the present application is not limited to the LTE system, but may also be applied to other wireless communication systems, such as GSM, CDMA2000, WCDMA, TD-SCDMA, 5G, and future new network systems (e.g. 6G), and the like.
Based on the above intelligent terminal hardware structure and communication network system, various embodiments of the present application are provided.
First embodiment
With reference to fig. 3, fig. 3 shows a schematic flow chart of a first embodiment of an interaction method, said method comprising the steps of:
S10: responding to the interactive operation of a target application, and determining at least one common contact corresponding to the target application;
S20: determining or generating shortcuts corresponding to the frequently-used contacts;
S30: and displaying the shortcut in a shortcut triggering area.
The method of the embodiment of the application can be applied to an intelligent terminal on which at least one target application is installed, and the target application includes at least one contact. The interactive operation includes a current interactive operation and/or a historical interactive operation, and may be, for example, a conversation operation between the user and a contact, an operation of the user sharing the contact information of a contact, an operation of adding a contact in the target application, an operation of deleting a contact in the target application, a collection operation on a contact, a tag-adding operation, or the like. The interactive operation is used to indicate and determine the common contacts. The common contacts may be at least one common contact in the same target application, or common contacts in different target applications. The target applications include communication applications such as a WeChat application, a telephone application, and a microblog application.
Optionally, step S10 includes at least one of:
determining the contact persons with the contact frequency greater than or equal to a preset frequency threshold value as the frequently-used contact persons according to the contact frequency of the corresponding contact persons in the target application;
determining the contact persons with the operation frequency larger than or equal to a preset frequency threshold value as the frequently-used contact persons according to the operation frequency of the corresponding contact persons in the target application;
and determining the currently dragged object as the common contact.
Optionally, the contact frequency of the corresponding contact in the target application is acquired, the contact with the contact frequency greater than or equal to the preset frequency threshold is taken as the common contact, and the contact with the contact frequency greater than or equal to the preset frequency threshold in each target application can be taken as the common contact.
Optionally, the operation frequency is the frequency of operations on the contact, where the operations include a sharing operation on the contact, a collecting operation on the contact, a pinning-to-top operation on the contact, and the like. For example, if "WeChat contact B" of the WeChat application is shared once by the user, collected once by the user, and pinned to the top once by the user, the operation frequency is 3. Optionally, the operation frequency of the contact corresponding to each target application is obtained, and a contact whose operation frequency is greater than or equal to a preset frequency threshold is taken as a common contact.
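A minimal plain-Kotlin sketch of this frequency-based screening follows; the class, field, and threshold names are illustrative assumptions, not terms from the application:

```kotlin
// Illustrative sketch only: class, field, and threshold names are assumptions.
data class Contact(
    val name: String,
    val contactFrequency: Int,   // e.g. number of conversations with this contact in the target application
    val operationFrequency: Int  // e.g. share + collect + pin-to-top operations on this contact
)

// A contact qualifies as frequently used when either frequency reaches its preset threshold.
fun selectFrequentContacts(
    contacts: List<Contact>,
    contactThreshold: Int = 5,
    operationThreshold: Int = 3
): List<Contact> =
    contacts.filter {
        it.contactFrequency >= contactThreshold || it.operationFrequency >= operationThreshold
    }

fun main() {
    val contacts = listOf(
        Contact("WeChat contact A", contactFrequency = 8, operationFrequency = 0),
        Contact("WeChat contact B", contactFrequency = 1, operationFrequency = 3),
        Contact("WeChat contact C", contactFrequency = 2, operationFrequency = 1)
    )
    println(selectFrequentContacts(contacts).map { it.name })  // [WeChat contact A, WeChat contact B]
}
```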
Optionally, a commonly used contact may be determined based on a drag operation of the user. Optionally, the currently dragged object is determined according to the drag operation, and the currently dragged object is determined as the commonly used contact. In actual operation, when a long-press operation on a contact is detected, the shortcut trigger area is called up, and the user drags the currently dragged object into the shortcut trigger area without releasing the hand; when the user drags the contact into the shortcut trigger area, the contact dragged into the shortcut trigger area is taken as the currently dragged object. The shortcut trigger area may be a sidebar, a status bar, the upper-right corner area of the currently displayed interface, or the lower-right corner area of the currently displayed interface, which is not limited here. When the shortcut trigger area is the sidebar area, the user long-presses a contact dialog window in the target application and drags the contact icon of that contact into the sidebar without releasing the hand; once the contact icon is displayed in the sidebar, the operation is regarded as complete. Illustratively, referring to fig. 4, fig. 4 shows an exemplary diagram of determining a common contact according to a drag operation.
Optionally, the method for determining the frequently-used contact may also be that a long-press operation of the user on a chat dialog box of the contact in the target application is received, and the contact corresponding to the chat dialog box corresponding to the long-press operation is used as the frequently-used contact.
Optionally, the method for determining the frequently-used contact may be to acquire the contact corresponding to unread information in the target application and use that contact as the frequently-used contact, so that the user can directly enter the target interface of the contact from the shortcut trigger area to view and reply to the unread information; or to acquire the contacts whose number of unread messages in the target application is greater than or equal to a preset number and use those contacts as the frequently-used contacts.
Optionally, after the frequent contacts are determined, a shortcut corresponding to each frequent contact is determined or generated. The shortcut may be a contact photo, a contact name, or the like, and the shortcut can start the target interface of the contact corresponding to it: when the user initiates a touch operation on the shortcut, the target interface of the contact corresponding to the shortcut is output in response to the touch operation. Optionally, the target interface is displayed as a small window overlaid on the current display interface, and there is no masking layer between the target interface and the current display interface, so that the user can perform corresponding operations on the current display interface while communicating based on the target interface. Optionally, a preset control is displayed on the target interface, and the preset control is used to exit the target interface.
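As a rough illustration of step S20, the sketch below models a shortcut as a small record carrying a display label and an action that opens the contact's target interface; the record shape, names, and callback are assumptions, not the application's actual data structures:

```kotlin
// Illustrative sketch only: one possible shape of a shortcut generated in step S20.
data class ContactShortcut(
    val contactName: String,             // shown as a photo and/or name in the trigger area
    val openTargetInterface: () -> Unit  // e.g. shows the conversation as a small window overlaid on the current interface
)

fun generateShortcuts(
    frequentContacts: List<String>,
    openConversation: (String) -> Unit
): List<ContactShortcut> =
    frequentContacts.map { name ->
        ContactShortcut(contactName = name, openTargetInterface = { openConversation(name) })
    }

fun main() {
    val shortcuts = generateShortcuts(listOf("WeChat contact A", "Colleague B")) { name ->
        println("Opening target interface of $name as a small overlaid window")
    }
    shortcuts.first().openTargetInterface()
}
```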
Optionally, the manner of outputting the target interface of the contact corresponding to the shortcut may also be jumping to the target interface of the frequently used contact corresponding to the shortcut. Optionally, after jumping to the target interface of the common contact corresponding to the shortcut, the current display interface and the target interface may be displayed in a split screen manner, or the current display interface may be displayed in the target interface in a small window manner in an overlapping manner.
Optionally, after the shortcut is determined or generated, the shortcut is displayed in the shortcut trigger area so that the target interface corresponding to the common contact can be triggered through the shortcut. Optionally, the shortcut trigger area may be a preset area of the current display interface, such as the upper-right corner area or the lower-left corner area, and the manner of displaying the shortcut in the preset area of the current display interface may be to display the shortcut corresponding to the common contact in the form of a floating window. For example, referring to fig. 5, fig. 5 shows a schematic diagram of displaying the shortcut in a floating window.
Optionally, the shortcut triggering area may be a preset area in a sidebar, after the user calls the sidebar, the shortcut of the frequently-used contact may be viewed based on the preset area of the sidebar, and after the shortcut is touch-controlled, a target interface corresponding to the frequently-used contact may be triggered, for example, referring to fig. 6, fig. 6 shows a schematic diagram of displaying the shortcut in the preset area in the sidebar.
Optionally, the shortcut triggering area may also be a contact display panel of the status bar, and when the user pulls down to call out the status bar, the shortcut of the frequently-used contact can be viewed based on the contact display panel of the status bar, for example, referring to fig. 7, fig. 7 shows a schematic diagram of the contact display panel display shortcut in the status bar.
Optionally, the shortcut trigger area may also be the negative-one-screen area of the intelligent terminal, or the area where a preset desktop component of the intelligent terminal is located, where the preset desktop component is used to display shortcuts of the frequently-used contacts; it may also be the unlocking area of the intelligent terminal, and after the user unlocks the intelligent terminal based on the unlocking area, the user can directly perform a touch operation on a frequently-used contact in that area, so as to trigger the target interface corresponding to that contact.
Optionally, the shortcut trigger area is displayed overlaid on the current display interface, with no masking layer between the shortcut trigger area and the current display interface. A preset control is displayed in the shortcut trigger area and is used to stop displaying the shortcut. In this way, even if the user performs touch operations on areas other than the shortcut trigger area, the shortcut remains fixedly displayed in the shortcut trigger area until a touch operation on the preset control in the shortcut trigger area is received.
In the embodiment of the application, in response to a historical interactive operation and/or a current interactive operation on the target application, at least one frequently-used contact corresponding to the target application is screened out based on the interactive operation, a shortcut corresponding to the frequently-used contact is determined or generated, and the shortcut is displayed in the shortcut trigger area, so that the target interface corresponding to the frequently-used contact can be triggered by the shortcut. The user only needs the shortcut displayed in the shortcut trigger area to trigger the target interface of the frequently-used contact, without starting the target application where that contact is located, finding the contact in the started target application, and then triggering the contact's target interface, which improves the convenience of communication.
Second embodiment
Based on the first embodiment, referring to fig. 8, fig. 8 shows a schematic flow chart of a second embodiment of the interaction method of the present application, where the method includes the following steps:
S10: responding to the interactive operation of a target application, and determining at least one common contact corresponding to the target application;
S21: acquiring the type of the running application, and determining or generating a shortcut corresponding to the common contact when the type of the application is determined to be a preset type;
S30: and displaying the shortcut in a shortcut triggering area.
Optionally, after the shortcut corresponding to the common contact is determined or generated, the type of the running application is obtained and it is determined whether the type is a preset type. When the type is a preset type, the shortcut is displayed in the shortcut trigger area so that the target interface corresponding to the common contact can be triggered through the shortcut. The preset type includes, but is not limited to, a game type, a video type, and the like. When the user is running a game or watching a video, displaying the shortcut in the shortcut trigger area allows the user to trigger the target interface of a common contact while continuing to play the game or watch the video, without interrupting it, which improves the user experience. Conversely, when the type is not a preset type, the step of displaying the shortcut in the shortcut trigger area is not executed, and no extra display space needs to be occupied as the shortcut trigger area.
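A minimal sketch of this type gating, assuming (for illustration only) that the preset types are the game and video types and that the running application's type is already known:

```kotlin
// Illustrative assumptions: the set of "preset types" and the type names themselves.
enum class AppType { GAME, VIDEO, WORK, OTHER }

val presetTypes = setOf(AppType.GAME, AppType.VIDEO)

// The shortcut trigger area is shown only while an application of a preset type is running.
fun shouldShowShortcutArea(runningAppType: AppType): Boolean = runningAppType in presetTypes

fun main() {
    println(shouldShowShortcutArea(AppType.GAME))  // true: show shortcuts without interrupting the game
    println(shouldShowShortcutArea(AppType.WORK))  // false: do not occupy extra display space
}
```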
In the embodiment of the application, after the shortcut corresponding to the commonly used contact is determined or generated, the type of the running application is acquired; when the type is a preset type, the shortcut is displayed in the shortcut trigger area so that the target interface corresponding to the commonly used contact can be triggered through the shortcut. In this way, when the user runs an application of the preset type, the call interface of the commonly used contact can be triggered based on the shortcut trigger area while the current application keeps running, which improves the convenience of communication.
Third embodiment
Based on all the above embodiments, referring to fig. 9, fig. 9 shows a flow chart of a third embodiment of an interaction method, which includes the following steps:
S10: responding to the interactive operation of a target application, and determining at least one common contact corresponding to the target application;
S31: determining a current operation scene according to the type of the running application;
S32: determining or generating a target frequent contact according to the current operation scene, so as to determine or generate a shortcut corresponding to the target frequent contact;
S33: and displaying the shortcut corresponding to the target frequent contact in the shortcut trigger area.
In the embodiment of the application, the contacts a user needs to contact differ between operation scenes. For example, in a working scene the contacts to be contacted may be colleagues, while in a game scene they may be game friends. Therefore, in order to avoid displaying shortcuts corresponding to too many common contacts in the shortcut trigger area and to improve the efficiency of reaching the contacts the user actually needs, the embodiment of the application screens out the target common contacts matched with the current operation scene according to the current operation scene, and displays the shortcuts corresponding to the target common contacts in the shortcut trigger area.
Optionally, the current operation scene may be determined according to the type of the application run by the terminal, where different applications correspond to different operation scenes. Illustratively, when the application is a game-type application, the current operation scene is a game scene, and when the application is a work-type application, the current operation scene is a work scene.

Optionally, the current operation scene may also be determined according to the current time and/or the current location. For example, when the current time is 12 noon, the current operation scene is a dining scene, and when the current time is 7 p.m., the current operation scene is a leisure scene; when the current location is a workplace, the current operation scene is a work scene.

Optionally, the current operation scene may also be determined according to the content to be interacted, where different contents to be interacted correspond to different operation scenes. For example, when the content to be interacted is work-related content, the current operation scene is a work scene, and when the content to be interacted is a photo of a shared meal, the current operation scene is a dining scene. The content to be interacted is the content that the user needs to send to the corresponding contact.
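The three alternative ways of determining the current operation scene described above could be combined roughly as follows; the scene names, time boundaries, keyword matching, and function signatures are all assumptions made for illustration:

```kotlin
import java.time.LocalTime

// Illustrative assumptions: scene names, time boundaries, places, and keyword matching.
enum class Scene { WORK, GAME, DINING, LEISURE }

fun sceneFromAppType(appType: String): Scene? = when (appType) {
    "game" -> Scene.GAME
    "work" -> Scene.WORK
    else -> null
}

fun sceneFromContent(contentToInteract: String): Scene? = when {
    "document" in contentToInteract -> Scene.WORK
    "meal" in contentToInteract -> Scene.DINING
    else -> null
}

fun sceneFromTimeAndPlace(now: LocalTime, place: String?): Scene = when {
    place == "office" -> Scene.WORK
    now.hour == 12 -> Scene.DINING
    else -> Scene.LEISURE
}

// Try the application type first, then the content to be interacted, then time and place.
fun determineScene(appType: String, now: LocalTime, place: String?, content: String): Scene =
    sceneFromAppType(appType) ?: sceneFromContent(content) ?: sceneFromTimeAndPlace(now, place)

fun main() {
    println(determineScene("game", LocalTime.of(20, 0), place = null, content = "invite"))        // GAME
    println(determineScene("browser", LocalTime.of(12, 5), place = null, content = "meal photo")) // DINING
}
```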
Optionally, after the current operation scene is determined, a target frequent contact is determined or generated according to the current operation scene, and a shortcut corresponding to the target frequent contact is displayed in the shortcut trigger area.
In this way, the present application identifies the current operation scene and screens out, from the plurality of common contacts, the target common contact matched with the current operation scene. This solves the problem that, when there are too many common contacts, the user searches the shortcut trigger area for the shortcut of the needed contact with low efficiency, and thereby improves communication efficiency.
Fourth embodiment
Based on all the above embodiments, referring to fig. 10, the step S32 includes the steps of:
S321: determining or generating a target application associated with a current operation scene according to the current operation scene;
S322: and determining or generating the target frequent contact according to the frequent contact associated with the target application.
Optionally, in different operation scenarios, the target application where the contact person that the user needs to contact is located is different, for example, in a game scenario, the user needs to send the game invitation link to the contact person of the wechat application, and in a work scenario, the user needs to send the relevant information of work to the contact person of the work application. In this way, after the current operation scene is determined, the target application associated with the current operation scene is acquired.
Optionally, different target applications are associated with different common contacts, where the common contacts may be contacts in the target applications, and for example, common contacts associated with a WeChat application are WeChat contacts, and common contacts associated with a work class application are colleagues, and the common contacts may also be contacts in other target applications associated with the target applications.
Optionally, after the target application associated with the current operating scene is determined, the common contact associated with the target application is used as the target common contact, and then a shortcut corresponding to the target common contact is displayed in the shortcut triggering area, so that a user can directly trigger a target interface of the common contact in the target application associated with the current operating scene based on the shortcut triggering area.
In the embodiment of the application, after the current operation scene is determined, the corresponding target application is determined based on the current operation scene, and the common contacts in that target application, or the common contacts in other target applications associated with it, are determined as the common contacts associated with the target application. These common contacts associated with the target application are taken as the target common contacts, and the shortcuts corresponding to the target common contacts are displayed in the shortcut trigger area, so that the user can quickly trigger the target interface of a target common contact matched with the current operation scene, which improves communication efficiency.
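A minimal sketch of this scene-to-target-application-to-target-contacts chain; the scene names, application names, and both mappings are illustrative assumptions:

```kotlin
// Illustrative assumptions: which target application each scene is associated with,
// and which frequent contacts each target application is associated with.
val appForScene = mapOf(
    "game" to "WeChat",
    "work" to "WorkApp"
)

val frequentContactsByApp = mapOf(
    "WeChat" to listOf("Game friend A", "Family B"),
    "WorkApp" to listOf("Colleague C", "Colleague D")
)

// Scene -> associated target application -> its frequent contacts become the target frequent contacts.
fun targetFrequentContacts(currentScene: String): List<String> {
    val targetApp = appForScene[currentScene] ?: return emptyList()
    return frequentContactsByApp[targetApp].orEmpty()
}

fun main() {
    println(targetFrequentContacts("work"))  // [Colleague C, Colleague D]
}
```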
Fifth embodiment
Based on all the above embodiments, referring to fig. 11, the step S322 includes the steps of:
S3221: acquiring a contact tag of a common contact related to the content to be interacted and/or the target application;
S3222: and determining the target frequent contact according to the frequent contact corresponding to the contact tag matched with the content to be interacted.
Optionally, the common contacts associated with the target application correspond to different contact tags. For example, a contact tag may be determined according to the content of the interaction when interacting with the common contact, or according to the social relationship between the common contact and the user. For example, when using a WeChat application, the user establishes different contact tags for different contacts, such as a "family" tag, a "colleague" tag, and a "game friend" tag. Optionally, one contact may correspond to at least one contact tag, and the contact tag may be labeled automatically during actual use of the target application or labeled manually by the user, which is not limited here.
Optionally, the types of interaction content corresponding to the common contacts under different contact tags are different. Illustratively, the interaction content corresponding to common contacts under the "colleague" tag is usually work-related content such as work documents, while the interaction content corresponding to common contacts under the "game friend" tag is usually game-related content such as a game interface cover or a game invitation link. Based on this, after the target application associated with the current running scene is determined or generated, the contact tags of the common contacts associated with the target application and/or the content to be interacted are obtained. Optionally, the target contact may be determined by determining the contact tag matched with the content to be interacted and taking the common contact corresponding to that contact tag as the target contact. Optionally, the target contact may also be determined or generated by taking, according to the content type of the content to be interacted, the common contact matched with that content type among the common contacts associated with the target application as the target common contact; for example, if the content type of the content to be interacted is a work link, the matched common contact is a work colleague. Optionally, the target contact may also be determined or generated by acquiring, according to the current operation scene, the target contact tag matched with the current operation scene, determining, according to the contact tags of the common contacts associated with the target application, the common contacts matched with the target contact tag, and taking those common contacts as the target contacts; for example, if the current operation scene is a work scene, the target contact tag matched with the work scene is the "colleague" tag, and the common contacts under the "colleague" tag are taken as the target common contacts.
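A rough sketch of matching the content to be interacted against contact tags; the tag names and the keyword-based matching rule are assumptions for illustration:

```kotlin
// Illustrative sketch only: tag names and the keyword-based matching rule are assumptions.
data class TaggedContact(val name: String, val tags: Set<String>)

// Among the frequent contacts associated with the target application, pick those whose
// contact tag matches the content to be interacted (matched here by content-type keyword).
fun targetContactsForContent(
    frequentContacts: List<TaggedContact>,
    contentToInteract: String
): List<TaggedContact> {
    val matchedTag = when {
        "work" in contentToInteract || "document" in contentToInteract -> "colleague"
        "game" in contentToInteract -> "game friend"
        else -> null
    } ?: return emptyList()
    return frequentContacts.filter { matchedTag in it.tags }
}

fun main() {
    val contacts = listOf(
        TaggedContact("C", setOf("colleague")),
        TaggedContact("D", setOf("game friend")),
        TaggedContact("E", setOf("family"))
    )
    println(targetContactsForContent(contacts, "game invitation link").map { it.name })  // [D]
}
```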
In the embodiment of the application, after the target application associated with the current running scene is determined, the target common contacts are determined by matching the content to be interacted against the contact tags of the common contacts associated with the target application, or by determining, according to the content type of the content to be interacted, the common contacts matched with that content type, or by taking the common contacts corresponding to the target contact tag of the current running scene as the target common contacts. The shortcuts corresponding to the target common contacts are then displayed in the shortcut trigger area, so that the user can directly send the content to be interacted to the corresponding target common contact based on the shortcut trigger area, which improves communication efficiency.
Sixth embodiment
Based on all the above embodiments, referring to fig. 12, the step S32 further includes the steps of:
S323: determining a common contact associated with the current operation scene according to the current operation scene;
S324: and determining or generating the target frequent contact according to the frequent contact related to the current operation scene.
In the embodiment of the application, different current operation scenes are associated with different common contacts, and these common contacts include common contacts associated with different target applications. For example, in a work scene, the common contacts associated with the work scene may be the common contacts in a work application together with colleague friends in a WeChat application. Based on this, after the current operation scene is determined, the target contacts are determined or generated according to the common contacts associated with the current operation scene, where the common contacts associated with the current operation scene include contacts associated with different target applications. Illustratively, when the current operation scene is a work scene, the common contacts in the work application and the common contacts under the "colleague" tag in WeChat are taken as the target common contacts.
Optionally, the common contacts associated with the current operation scene may be determined as follows: according to a historical contact record, determine, for each operation scene, the contacts whose contact frequency is higher than a preset frequency, and take those contacts as the common contacts associated with that operation scene so as to establish a preset association between operation scenes and common contacts; after the current operation scene is determined, the common contacts associated with the current operation scene are determined according to this preset association. Optionally, the preset frequencies corresponding to different target applications may be the same or different.
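A minimal sketch of this frequency-based association, assuming a simple (scene, contact) history record and an illustrative preset frequency, could look like the following Python example; the record format and threshold are assumptions for the example, not part of the disclosure.

# Illustrative sketch: derive the common contacts for each operation scene
# from a historical contact record, keeping only contacts whose contact
# frequency exceeds a preset frequency.
from collections import Counter

def build_scene_contacts(history, preset_frequency=3):
    """history: iterable of (scene, contact) pairs from past communications."""
    counts = Counter(history)                      # (scene, contact) -> count
    scene_contacts = {}
    for (scene, contact), n in counts.items():
        if n > preset_frequency:
            scene_contacts.setdefault(scene, set()).add(contact)
    return scene_contacts

if __name__ == "__main__":
    history = [("work", "A")] * 5 + [("work", "B")] * 2 + [("home", "C")] * 4
    assoc = build_scene_contacts(history)
    print(assoc.get("work"))   # {'A'} -- contact B is below the preset frequency
    current_scene = "work"
    print(assoc.get(current_scene, set()))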
Optionally, the current operation scene is determined according to the current time and/or the current location. In an embodiment, a contact that the user has contacted many times while at a given geographic location is likely to be a contact the user needs to contact again at that location. Accordingly, the common contacts associated with the current operation scene may be determined by acquiring the common contacts associated with the current position corresponding to the current operation scene, so that the contacts the user has frequently contacted near the current position are taken as the target common contacts.
Alternatively, a contact that the user frequently contacts at a given time of day is likely to be a contact the user needs to contact at that time. Accordingly, the common contacts associated with the current operation scene may also be determined by acquiring the common contacts associated with the current time point corresponding to the current operation scene, so that the common contacts the user frequently contacts around the current time point are taken as the target common contacts.
Optionally, the common contacts associated with the current operation scene may also be determined by acquiring M common contacts associated with the current time point and N common contacts associated with the current position, and using the union or the intersection of the M common contacts and the N common contacts as the target contacts.
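As a hedged illustration, the union/intersection step above might be implemented as in the short Python sketch below; which set operation is used is a policy choice, and the contact names are invented for the example.

# Sketch of combining the M time-associated contacts and the N
# location-associated contacts into the target contacts.
def combine_contacts(time_contacts, location_contacts, mode="intersection"):
    m, n = set(time_contacts), set(location_contacts)
    return m & n if mode == "intersection" else m | n

if __name__ == "__main__":
    by_time = {"A", "B"}        # frequently contacted around the current time
    by_location = {"B", "C"}    # frequently contacted near the current position
    print(combine_contacts(by_time, by_location))                 # {'B'}
    print(combine_contacts(by_time, by_location, mode="union"))   # {'A', 'B', 'C'}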
In the embodiment of the application, after the current operation scene is determined or generated, the contacts whose contact frequency under the current operation scene is higher than the preset frequency are taken as the common contacts associated with the current operation scene, and the target common contacts are then determined or generated from them, so that the contacts the user is likely to contact this time are recommended according to the current operation scene, improving both the recommendation accuracy and the communication efficiency.
Seventh embodiment
Based on the above embodiment, referring to fig. 13, the method includes the following steps:
S10: responding to the interactive operation of a target application, and determining at least one common contact corresponding to the target application;
S20: determining or generating shortcuts corresponding to the frequently-used contacts;
S30: displaying the shortcut in a shortcut triggering area;
S50: detecting a first trigger operation of the shortcut, determining the shortcut to be deleted of the first trigger operation, and deleting the shortcut to be deleted; and/or,
S60: acquiring the trigger frequency corresponding to each shortcut in the shortcut trigger area, and deleting the shortcut when the trigger frequency is smaller than a preset frequency threshold; and/or,
S70: acquiring the latest communication time corresponding to each shortcut in the shortcut triggering area, determining the time interval between the latest communication time corresponding to each shortcut and the current time, and deleting the shortcut with the time interval exceeding the preset time length.
In this embodiment, if a preset touch operation corresponding to the shortcut trigger area is detected, the shortcut trigger area is displayed in the current display interface and the shortcuts are displayed in the shortcut trigger area. Optionally, the preset operation may be a voice command, a gesture, a button operation, and the like; that is, if a preset voice message, a preset gesture, or a preset touch operation on a button of a function application is detected, the shortcut trigger area is invoked. For example, if a click or press touch operation on the sidebar is detected, the sidebar is popped up and displayed, and the shortcuts are displayed on the sidebar. Optionally, the shortcuts are displayed in a preset area of the sidebar, and the preset area is used for placing the shortcuts corresponding to at least one common contact and/or the target common contact.
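A minimal sketch of this trigger-and-display flow is given below, assuming a simple set of named trigger events and a placeholder render function; the event names and function names are hypothetical and only illustrate dispatching a detected preset operation to the display of the shortcut area.

# Hypothetical dispatcher: when a preset trigger (voice phrase, gesture,
# or sidebar press) is detected, pop up the shortcut trigger area and
# render the shortcuts.
PRESET_TRIGGERS = {"voice:open shortcuts", "gesture:swipe_from_edge", "press:sidebar"}

def render_shortcut_area(shortcuts):
    print("sidebar shown with:", ", ".join(shortcuts))

def on_user_event(event, shortcuts):
    if event in PRESET_TRIGGERS:
        render_shortcut_area(shortcuts)   # display shortcuts in the preset area
        return True
    return False                          # not a preset operation; ignore

if __name__ == "__main__":
    on_user_event("press:sidebar", ["A", "B", "C"])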
Optionally, to facilitate adjusting the shortcuts displayed in the shortcut trigger area, an editing operation is performed on the displayed shortcuts based on the first trigger operation detected on a shortcut. The editing operation includes, but is not limited to, adding a new frequently-used contact, deleting an existing frequently-used contact, and adjusting the display order of the shortcuts corresponding to the frequently-used contacts in the shortcut trigger area.
Optionally, the first trigger operation may be a long press on the shortcut trigger area. When the long-press operation is detected, the state of each shortcut displayed in the shortcut trigger area is changed to an editing state, and each shortcut is updated and displayed based on the editing state, for example by shaking the shortcut or by adding a preset identifier, such as a red dot, to the shortcut. Optionally, after each shortcut is updated and displayed based on the editing state, a selection operation of the user on a shortcut is received, the shortcut corresponding to the selection operation is taken as the shortcut to be deleted, and the shortcut to be deleted is then deleted. Optionally, the shortcut to be deleted may also be determined by receiving a first trigger operation on any shortcut, such as a long press on that shortcut, taking the shortcut displayed at the trigger position of the first trigger operation as the shortcut to be deleted, and deleting it. Optionally, the editing operation further includes adjusting the display order of the shortcuts in the shortcut trigger area: the state of each shortcut is changed to the editing state according to the first trigger operation, each shortcut is updated and displayed based on the editing state, a drag-and-slide operation of the user is received, the shortcut to be adjusted and the target position are determined according to the drag-and-slide operation, and the shortcut to be adjusted is moved to the target position so as to adjust the display order of the shortcuts corresponding to the common contacts in the shortcut trigger area. The editing operation further includes adding a new frequently-used contact, which, as described in the first embodiment, is done by dragging the corresponding contact displayed in the current display interface to the shortcut trigger area based on a drag operation of the user; this is not described again here.
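The following Python sketch illustrates, under assumed data structures, the editing operations just described: entering an editing state on a long press, deleting a selected shortcut, and moving a shortcut to a new position. The class and method names are invented for the example and do not reflect any particular terminal implementation.

# Sketch of shortcut editing: long press enters the editing state, a
# selected shortcut can be deleted, and a drag can reorder shortcuts.
class ShortcutArea:
    def __init__(self, shortcuts):
        self.shortcuts = list(shortcuts)
        self.editing = False

    def on_long_press(self):
        self.editing = True               # shortcuts now shake / show a badge

    def delete(self, name):
        if self.editing and name in self.shortcuts:
            self.shortcuts.remove(name)   # remove the shortcut to be deleted

    def move(self, name, target_index):
        if self.editing and name in self.shortcuts:
            self.shortcuts.remove(name)
            self.shortcuts.insert(target_index, name)

    def confirm(self):
        self.editing = False              # back to the non-editing state

if __name__ == "__main__":
    area = ShortcutArea(["A", "B", "C"])
    area.on_long_press()
    area.delete("B")
    area.move("C", 0)
    area.confirm()
    print(area.shortcuts)                 # ['C', 'A']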
Optionally, after the first trigger operation on the shortcut is detected, while each shortcut is updated and displayed based on the editing state, a preset control is displayed in the shortcut trigger area. The preset control is used to receive an edit confirmation instruction from the user; after the edit confirmation instruction is received, each updated shortcut is displayed in the shortcut trigger area and its state is changed from the editing state to a non-editing state.
Optionally, adjusting the shortcuts displayed in the shortcut trigger area may also be done by acquiring the trigger frequency corresponding to each shortcut in the shortcut trigger area and deleting a shortcut when its trigger frequency is less than a preset frequency threshold. A shortcut with a low trigger frequency is of little use to the user, so to save display space in the shortcut trigger area, a shortcut whose trigger frequency is less than the preset frequency threshold is taken as the shortcut to be deleted and is deleted. The trigger frequency is the frequency at which the user triggers the target interface through the shortcut. Optionally, the display order of the shortcuts in the shortcut trigger area may be adjusted according to the trigger frequency corresponding to each shortcut: the higher the trigger frequency, the earlier the shortcut is displayed, and the lower the trigger frequency, the later it is displayed, so that the user can quickly find frequently contacted contacts based on the display order.
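A minimal sketch of this frequency-based maintenance, assuming an illustrative threshold value, is shown below; it drops shortcuts below the preset frequency threshold and sorts the remainder so that the most frequently triggered shortcuts are displayed first.

# Frequency-based maintenance of the shortcut trigger area (illustrative).
def prune_and_sort_by_frequency(trigger_counts, threshold=2):
    """trigger_counts: dict mapping shortcut name -> number of triggers."""
    kept = {s: n for s, n in trigger_counts.items() if n >= threshold}
    return sorted(kept, key=kept.get, reverse=True)   # most triggered first

if __name__ == "__main__":
    counts = {"A": 5, "B": 1, "C": 3}
    print(prune_and_sort_by_frequency(counts))   # ['A', 'C']; 'B' is deleted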
Optionally, adjusting the shortcuts displayed in the shortcut trigger area may also be done by obtaining the latest communication time corresponding to each shortcut, where the latest communication time is the last time the user triggered the target interface through the shortcut, or the last time the user made contact in the target application. Optionally, the closer the latest communication time is to the current time, the more likely the shortcut belongs to a contact the user needs, and the further it is from the current time, the less the user needs the shortcut. Optionally, the display order of the shortcuts in the shortcut trigger area may be adjusted according to the time interval between the latest communication time corresponding to each shortcut and the current time: the shorter the time interval, the earlier the shortcut is displayed, and the longer the interval, the later it is displayed, so that the shortcuts of the contacts the user has contacted most recently appear first and can be found quickly based on the display order.
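The recency-based counterpart can be sketched in the same way; in the example below, shortcuts whose time interval from the current time exceeds a preset duration are deleted and the rest are ordered newest first. The 7-day duration and the contact times are illustrative assumptions.

# Recency-based maintenance of the shortcut trigger area (illustrative).
from datetime import datetime, timedelta

def prune_and_sort_by_recency(last_contact, now=None, max_age=timedelta(days=7)):
    """last_contact: dict mapping shortcut name -> datetime of last communication."""
    now = now or datetime.now()
    kept = {s: t for s, t in last_contact.items() if now - t <= max_age}
    return sorted(kept, key=kept.get, reverse=True)   # most recent first

if __name__ == "__main__":
    now = datetime(2022, 11, 30)
    times = {"A": now - timedelta(days=1),
             "B": now - timedelta(days=30),
             "C": now - timedelta(hours=2)}
    print(prune_and_sort_by_recency(times, now=now))  # ['C', 'A']; 'B' is deleted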
Optionally, adjusting the shortcut displayed in the shortcut trigger area may also be closing the shortcut trigger area.
In the embodiment of the application, by receiving the first trigger operation on a shortcut and, according to that operation, deleting the corresponding shortcut and/or adjusting the display order in the shortcut trigger area, and/or by deleting shortcuts and/or adjusting the display order according to the trigger frequency and/or the latest communication time corresponding to each shortcut, editing of the shortcuts in the shortcut trigger area is realized, which improves the flexibility of the shortcut trigger area and of the human-machine interaction, and improves the user experience.
Based on any of the above embodiments, the present application also provides an interaction method, including the following steps:
and responding to the interactive operation of the target application, and displaying the common contact shortcut corresponding to the target application in the shortcut triggering area.
Optionally, the method further comprises at least one of:
detecting a first trigger operation of the shortcut, determining the shortcut to be deleted of the first trigger operation, and deleting the shortcut to be deleted;
acquiring trigger frequency corresponding to each shortcut in the shortcut trigger area, and deleting the shortcut when the trigger frequency is smaller than a preset frequency threshold;
and acquiring the latest communication time corresponding to each shortcut in the shortcut triggering area, determining the time interval between the latest communication time corresponding to each shortcut and the current time, and deleting the shortcut with the time interval exceeding the preset time length.
Through the technical scheme, the shortcut function of displaying the commonly used contacts in the shortcut area can be realized, the problem of low efficiency of searching the contacts by a user is solved, and the user experience is further improved.
Based on any of the above embodiments, the present application further provides an interaction method, including the following steps:
and displaying a common contact shortcut corresponding to the target application in the shortcut triggering area in response to the interactive operation of the target application and/or the type of the target application being a preset type.
Optionally, the method further comprises at least one of:
detecting a first trigger operation of the shortcut, determining the shortcut to be deleted of the first trigger operation, and deleting the shortcut to be deleted;
acquiring trigger frequency corresponding to each shortcut in the shortcut trigger area, and deleting the shortcut when the trigger frequency is smaller than a preset frequency threshold;
and acquiring the latest communication time corresponding to each shortcut in the shortcut triggering area, determining the time interval between the latest communication time corresponding to each shortcut and the current time, and deleting the shortcut with the time interval exceeding the preset time length.
Through the technical scheme, the shortcut function of displaying the commonly used contacts in the shortcut area can be realized, the problem of low efficiency of searching the contacts by a user is solved, and the user experience is further improved.
Based on any of the above embodiments, the present application also provides an interaction method, including the following steps:
and responding to the interactive operation and/or the current running scene of the target application, and displaying the common contact shortcut corresponding to the target application in the shortcut triggering area.
Optionally, the current running scenario is determined according to the type of the target application.
Optionally, the determining manner of the common contact corresponding to the target application includes at least one of:
determining or generating a target application associated with a current operation scene according to the current operation scene, and determining or generating a target common contact according to a common contact associated with the target application;
acquiring content to be interacted and/or a contact tag of a common contact related to the target application, and determining the target common contact according to the common contact corresponding to the contact tag matched with the content to be interacted;
and determining the common contact associated with the current operation scene according to the current operation scene, and determining or generating the target common contact according to the common contact associated with the current operation scene.
Optionally, the method further comprises at least one of:
detecting a first trigger operation of the shortcut, determining the shortcut to be deleted of the first trigger operation, and deleting the shortcut to be deleted;
acquiring trigger frequency corresponding to each shortcut in the shortcut trigger area, and deleting the shortcut when the trigger frequency is smaller than a preset frequency threshold;
and acquiring the latest communication time corresponding to each shortcut in the shortcut triggering area, determining the time interval between the latest communication time corresponding to each shortcut and the current time, and deleting the shortcut with the time interval exceeding the preset time length.
Through the technical scheme, the shortcut function of displaying the commonly used contacts in the shortcut area can be realized, the problem of low efficiency of searching the contacts by a user is solved, and the user experience is further improved.
The embodiment of the present application further provides an intelligent terminal, which includes a memory and a processor, where the memory stores an interaction program, and the interaction program is executed by the processor to implement the steps of the interaction method in any of the above embodiments.
The embodiment of the present application further provides a storage medium, where an interaction program is stored on the storage medium, and the interaction program, when executed by a processor, implements the steps of the interaction method in any of the above embodiments.
In the embodiments of the intelligent terminal and the storage medium provided in the present application, all technical features of any one of the above-described interaction method embodiments may be included, and the contents of the expansion and the explanation of the specification are basically the same as those of each embodiment of the above-described method, and are not described herein again.
Embodiments of the present application also provide a computer program product, which includes computer program code, when the computer program code runs on a computer, the computer is caused to execute the method in the above various possible embodiments.
Embodiments of the present application further provide a chip, which includes a memory and a processor, where the memory is used to store a computer program, and the processor is used to call and run the computer program from the memory, so that a device in which the chip is installed executes the method in the above various possible embodiments.
It is to be understood that the foregoing scenarios are only examples, and do not constitute a limitation on application scenarios of the technical solutions provided in the embodiments of the present application, and the technical solutions of the present application may also be applied to other scenarios. For example, as can be known by those skilled in the art, with the evolution of system architecture and the emergence of new service scenarios, the technical solution provided in the embodiments of the present application is also applicable to similar technical problems.
The above-mentioned serial numbers of the embodiments of the present application are merely for description and do not represent the merits of the embodiments.
The steps in the method of the embodiment of the application can be sequentially adjusted, combined and deleted according to actual needs.
The units in the device in the embodiment of the application can be merged, divided and deleted according to actual needs.
In the present application, the same or similar term concepts, technical solutions, and/or application scenario descriptions are generally described in detail only at their first occurrence; when they appear again later, the detailed description is generally not repeated for brevity, and for such later occurrences that are not described in detail, reference may be made to the earlier related detailed description when understanding the technical solutions of the present application.
In the present application, each embodiment is described with an emphasis on the description, and reference may be made to the description of other embodiments for parts that are not described or recited in any embodiment.
The technical features of the technical solution of the present application may be arbitrarily combined, and for brevity of description, not all possible combinations of the technical features in the embodiments are described; however, as long as there is no contradiction between the combined technical features, such combinations should be considered as falling within the scope described in the present application.
Through the description of the foregoing embodiments, it is clear to those skilled in the art that the method of the foregoing embodiments may be implemented by software plus a necessary general hardware platform, and certainly may also be implemented by hardware, but in many cases, the former is a better implementation. Based on such understanding, the technical solutions of the present application may be embodied in the form of a software product, which is stored in a storage medium (e.g., ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal device (e.g., a mobile phone, a computer, a server, a controlled terminal, or a network device) to execute the method of each embodiment of the present application.
In the above embodiments, the implementation may be wholly or partially realized by software, hardware, firmware, or any combination thereof. When implemented in software, the embodiments may be realized in whole or in part in the form of a computer program product. The computer program product includes one or more computer instructions. The procedures or functions according to the embodiments of the present application are generated in whole or in part when the computer program instructions are loaded and executed on a computer. The computer may be a general purpose computer, a special purpose computer, a network of computers, or another programmable device. The computer instructions may be stored on a storage medium or transmitted from one storage medium to another; for example, the computer instructions may be transmitted from one website, computer, server, or data center to another website, computer, server, or data center via wire (e.g., coaxial cable, optical fiber, digital subscriber line) or wirelessly (e.g., infrared, wireless, microwave, etc.). The storage medium may be any available medium that can be accessed by a computer, or a data storage device, such as a server or data center, integrating one or more available media. The usable medium may be a magnetic medium (e.g., floppy disk, memory disk, magnetic tape), an optical medium (e.g., DVD), or a semiconductor medium (e.g., Solid State Disk (SSD)), among others.
The above description is only a preferred embodiment of the present application, and not intended to limit the scope of the present application, and all modifications of equivalent structures and equivalent processes, which are made by the contents of the specification and the drawings of the present application, or which are directly or indirectly applied to other related technical fields, are included in the scope of the present application.

Claims (10)

1. An interactive method, comprising the steps of:
S10: responding to the interactive operation of a target application, and determining at least one common contact corresponding to at least one target application;
S20: determining or generating shortcuts corresponding to the frequently-used contacts;
S30: and displaying the shortcut in a shortcut triggering area.
2. The method of claim 1, further comprising at least one of:
the quick trigger area and/or the target interface are/is displayed on the current display interface in an overlapping mode;
no covering layer shade exists between the quick trigger area and/or the target interface and the current display interface;
and acquiring the type of the running application, and executing the step S20 when the type of the application is a preset type.
3. The method of claim 1, wherein the S10 step comprises at least one of:
determining the contact persons with the contact frequency greater than or equal to a preset frequency threshold value as the frequently-used contact persons according to the contact frequency of the corresponding contact persons in the target application;
determining the contact persons with the operation frequency greater than or equal to a preset frequency threshold value as the frequently-used contact persons according to the operation frequency of the corresponding contact persons in the target application;
and determining the current dragged object as the common contact according to the current dragged object.
4. The method of claim 1, further comprising at least one of:
detecting a first trigger operation of the shortcut, determining the shortcut to be deleted of the first trigger operation, and deleting the shortcut to be deleted;
acquiring trigger frequency corresponding to each shortcut in the shortcut trigger area, and deleting the shortcut when the trigger frequency is smaller than a preset frequency threshold;
and acquiring the latest communication time corresponding to each shortcut in the shortcut triggering area, determining the time interval between the latest communication time corresponding to each shortcut and the current time, and deleting the shortcut with the time interval exceeding the preset time length.
5. The method of any of claims 1 to 4, wherein the step of determining at least one frequent contact corresponding to at least one of the target applications is followed by further comprising:
determining a current operation scene according to the type of the operation application;
and determining or generating the target frequent contact according to the current operation scene.
6. The method of claim 5, wherein the step of determining or generating a target frequent contact based on the current operational scenario comprises:
determining or generating a target application associated with a current operation scene according to the current operation scene;
and determining or generating the target frequent contact according to the frequent contact associated with the target application.
7. The method of claim 6, wherein the determining or generating the target frequent contact according to the frequent contacts associated with the target application comprises:
acquiring a contact tag of a common contact related to the content to be interacted and/or the target application;
and determining or generating the target common contact according to the common contact corresponding to the contact tag matched with the content to be interacted.
8. The method of claim 5, wherein the step of determining or generating a target frequent contact based on the current operational scenario comprises:
determining a common contact associated with the current operation scene according to the current operation scene;
and determining or generating the target frequent contact according to the frequent contact related to the current operation scene.
9. An intelligent terminal, comprising: memory, processor, wherein the memory has stored thereon an interaction program which, when executed by the processor, implements the steps of the interaction method according to any one of claims 1 to 8.
10. A storage medium, characterized in that the storage medium has stored thereon a computer program which, when being executed by a processor, carries out the steps of the interaction method according to any one of claims 1 to 8.
CN202211528900.4A 2022-11-30 2022-11-30 Interaction method, intelligent terminal and storage medium Pending CN115857748A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211528900.4A CN115857748A (en) 2022-11-30 2022-11-30 Interaction method, intelligent terminal and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211528900.4A CN115857748A (en) 2022-11-30 2022-11-30 Interaction method, intelligent terminal and storage medium

Publications (1)

Publication Number Publication Date
CN115857748A true CN115857748A (en) 2023-03-28

Family

ID=85668797

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211528900.4A Pending CN115857748A (en) 2022-11-30 2022-11-30 Interaction method, intelligent terminal and storage medium

Country Status (1)

Country Link
CN (1) CN115857748A (en)

Similar Documents

Publication Publication Date Title
CN109040441B (en) Application body-separating display method, mobile terminal and computer readable storage medium
CN109697008B (en) Content sharing method, terminal and computer readable storage medium
CN112068744A (en) Interaction method, mobile terminal and storage medium
CN113900560A (en) Icon processing method, intelligent terminal and storage medium
CN109710168B (en) Screen touch method and device and computer readable storage medium
CN107562304B (en) Control method, mobile terminal and computer readable storage medium
CN115914719A (en) Screen projection display method, intelligent terminal and storage medium
CN113835586A (en) Icon processing method, intelligent terminal and storage medium
CN113867588A (en) Icon processing method, intelligent terminal and storage medium
CN113867765A (en) Application management method, intelligent terminal and storage medium
CN113885752A (en) Icon processing method, intelligent terminal and storage medium
CN113867586A (en) Icon display method, intelligent terminal and storage medium
CN113342246A (en) Operation method, mobile terminal and storage medium
CN113282205A (en) Starting method, terminal device and storage medium
CN115857748A (en) Interaction method, intelligent terminal and storage medium
CN107479747B (en) Touch display method and device and computer storage medium
CN114327184A (en) Data management method, intelligent terminal and storage medium
CN115033144A (en) Processing method, mobile terminal and storage medium
CN117572998A (en) Display method, intelligent terminal and storage medium
CN117242425A (en) Control method, mobile terminal and readable storage medium
CN114661206A (en) Data display method, intelligent terminal and storage medium
CN117130519A (en) Application starting method, intelligent terminal and storage medium
CN115129195A (en) Information display method, mobile terminal and storage medium
CN115048003A (en) Processing method, intelligent terminal and storage medium
CN114995730A (en) Information display method, mobile terminal and storage medium

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication