CN112068913A - Interactive method, mobile terminal and storage medium


Info

Publication number
CN112068913A
Authority
CN
China
Prior art keywords
information
triggered
displaying
interaction
triggering
Legal status
Pending
Application number
CN202010870364.0A
Other languages
Chinese (zh)
Inventor
王卓阳
Current Assignee
Shenzhen Microphone Holdings Co Ltd
Shenzhen Transsion Holdings Co Ltd
Original Assignee
Shenzhen Microphone Holdings Co Ltd
Application filed by Shenzhen Microphone Holdings Co Ltd
Priority to CN202010870364.0A
Publication of CN112068913A

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 9/00 Arrangements for program control, e.g. control units
    • G06F 9/06 Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F 9/44 Arrangements for executing specific programs
    • G06F 9/451 Execution arrangements for user interfaces
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/04847 Interaction techniques to control parameter settings, e.g. interaction with sliders or dials

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Software Systems (AREA)
  • Telephone Function (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The application relates to an interaction method, a mobile terminal and a storage medium. The interaction method is applied to the mobile terminal and includes: in response to a first operation, entering a preset interface of the desktop; and displaying an interaction area on the preset interface, where at least one type of information to be interacted with is displayed in the interaction area. By this method, the activity (user engagement) of the preset interface can be improved based on the information interaction scene of the preset interface.

Description

Interactive method, mobile terminal and storage medium
Technical Field
The present application relates to the field of terminal technologies, and in particular, to an interaction method, a mobile terminal, and a storage medium.
Background
In some implementations, a preset interface of the desktop (e.g., the negative one screen) is used to extend the desktop and the system's information display, and activity is usually increased by adding scenes, for example by adding news and weather information to the negative one screen; however, the effect is not ideal.
The foregoing description is provided for general background information and is not admitted to be prior art.
Disclosure of Invention
In view of the above technical problems, the present application provides an interaction method, a mobile terminal and a storage medium, which can improve the activity of a preset interface based on an information interaction scene of the preset interface.
In order to solve the above technical problem, the present application provides an interaction method applied to a mobile terminal, including:
in response to a first operation, entering a preset interface of the desktop;
optionally, the preset interface may be the negative one screen, a lock-screen interface, a screen-off interface, a system interface or an application interface;
and displaying an interaction area on the preset interface, wherein at least one type of information to be interacted with is displayed in the interaction area.
Optionally, the first operation includes at least one of a sliding operation along a preset direction on the desktop, a clicking operation on a preset area of the desktop, and a click on a preset guide button.
Optionally, the information to be interacted with includes information to be triggered, and each type of information to be triggered includes at least one piece of information to be triggered. Optionally, the information to be triggered includes at least one of coupon information, points information and task information; information to be triggered of the same type has the same trigger condition, and optionally the trigger condition includes at least one of real-time triggering, timed triggering and countdown triggering.
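For illustration only (this is not part of the application's disclosure), the following Kotlin sketch models, under assumed names, the information types and trigger conditions described above: coupon, points and task information, each carrying a real-time, timed or countdown trigger condition.

```kotlin
import java.time.Duration
import java.time.LocalTime

// Hypothetical model of "information to be triggered"; all names are illustrative assumptions.
enum class InfoType { COUPON, POINTS, TASK }

// One trigger condition per type of information to be triggered.
sealed class TriggerCondition {
    object RealTime : TriggerCondition()                                // triggerable at any time
    data class Timed(val triggerAt: LocalTime) : TriggerCondition()     // triggerable after a fixed time, e.g. 13:00
    data class Countdown(val remaining: Duration) : TriggerCondition()  // triggerable after a countdown elapses
}

data class TriggerableInfo(
    val type: InfoType,
    val copy: String,       // copy text shown at the icon, e.g. "Super coupon"
    val source: String,     // information source, e.g. a third-party application or a push platform
    val condition: TriggerCondition
)

fun main() {
    val sample = TriggerableInfo(
        type = InfoType.COUPON,
        copy = "Super coupon",
        source = "Timed-trigger benefit platform",
        condition = TriggerCondition.Timed(LocalTime.of(13, 0))
    )
    println(sample)
}
```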
Optionally, displaying at least one type of information to be interacted with in the interaction area includes:
displaying at least one type of information to be triggered in the interaction area as icons according to the trigger condition of the information;
and displaying information content at the icon, wherein the information content optionally includes at least one of copy text, an information source and a trigger condition.
Optionally, displaying at least one type of information to be interacted with in the interaction area includes:
determining at least one piece of information to be triggered according to historical information-triggering behavior, wherein optionally the at least one piece of information to be triggered includes at least one piece of points information;
and displaying at least one type of information to be triggered in the interaction area according to the trigger condition of the at least one piece of information to be triggered.
Optionally, determining at least one piece of information to be triggered according to historical information-triggering behavior includes:
if the interaction area is displayed for the first time, acquiring at least one piece of default information as the at least one piece of information to be triggered, wherein optionally the at least one piece of default information includes at least one piece of points information; and/or,
if the interaction area is not displayed for the first time, determining at least one piece of information to be triggered according to historical information-triggering behavior, wherein optionally the at least one piece of information to be triggered includes at least one piece of points information.
Optionally, determining at least one piece of information to be triggered according to historical information-triggering behavior includes:
determining at least one piece of information to be triggered according to historical information-triggering behavior, wherein optionally the at least one piece of information to be triggered includes at least one piece of points information;
and screening the at least one piece of information to be triggered according to the priority of the information, wherein optionally the screening result includes at least one piece of points information.
Optionally, displaying an interaction area on the preset interface includes:
displaying the interaction area at a designated position or an adjustable position of the preset interface.
Optionally, after the step of displaying at least one type of information to be interacted with in the interaction area, the method further includes:
in response to an operation of clicking the information, triggering the corresponding information and/or displaying a detail page of the corresponding information.
Optionally, triggering the corresponding information and/or displaying a detail page of the corresponding information includes at least one of the following:
when the currently clicked information is points information, triggering points of a corresponding value and/or displaying copy text associated with the points information; or
when the currently clicked information is coupon information, displaying a detail page of the coupon information, wherein the detail page includes at least one coupon to be triggered and/or corresponding condition information; or
when the currently clicked information is task information, displaying a detail page of the task information, wherein the detail page includes at least one task to be executed and/or information content of the task.
The present application further provides a mobile terminal, including: a memory and a processor;
the memory stores at least one program instruction;
the processor implements the method of interaction described above by loading and executing the at least one program instruction.
The present application further provides a computer storage medium having computer program instructions stored thereon which, when executed by a processor, implement the interaction method described above.
As described above, the present application provides an interaction method, a mobile terminal and a storage medium, the interaction method being applied to the mobile terminal and including: in response to a first operation, entering a preset interface of the desktop; and displaying an interaction area on the preset interface, wherein at least one type of information to be interacted with is displayed in the interaction area. By this method, the activity of the preset interface can be improved based on the information interaction scene of the preset interface.
The foregoing is only an overview of the technical solutions of the present application. So that the technical means of the present application can be understood more clearly and implemented in accordance with the content of this description, and so that the above and other objects, features and advantages of the present application become more apparent, preferred embodiments are described in detail below with reference to the accompanying drawings.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present application and, together with the description, serve to explain the principles of the application. To illustrate the technical solutions of the embodiments of the present application more clearly, the drawings needed for describing the embodiments are briefly introduced below; other drawings can be obtained by those skilled in the art from these drawings without inventive effort.
Fig. 1 is a schematic diagram of the hardware structure of a mobile terminal implementing various embodiments of the present application;
Fig. 2 is an architecture diagram of a communication network system according to an embodiment of the present application;
Fig. 3 is a flowchart of an interaction method according to a first embodiment;
Fig. 4 is a diagram of the application effect of the interaction method according to the first embodiment;
Figs. 5(a) to 5(c) are schematic diagrams of three types of information to be triggered;
Fig. 6 is a flowchart of an interaction method according to a second embodiment;
Figs. 7(a) and 7(b) are schematic diagrams of two information detail pages;
Fig. 8 is a first schematic structural diagram of a mobile terminal according to a third embodiment;
Fig. 9 is a second schematic structural diagram of the mobile terminal shown in fig. 8.
The implementation, functional features and advantages of the objectives of the present application will be further explained with reference to the accompanying drawings. With the above figures, there are shown specific embodiments of the present application, which will be described in more detail below. These drawings and written description are not intended to limit the scope of the inventive concepts in any manner, but rather to illustrate the inventive concepts to those skilled in the art by reference to specific embodiments.
Detailed Description
Reference will now be made in detail to the exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, like numbers in different drawings represent the same or similar elements unless otherwise indicated. The embodiments described in the following exemplary embodiments do not represent all embodiments consistent with the present application. Rather, they are merely examples of apparatus and methods consistent with certain aspects of the present application, as detailed in the appended claims.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element recited by the phrase "comprising a ..." does not exclude the presence of additional identical elements in the process, method, article, or apparatus that comprises the element. Further, elements, features, or components with similar names in different embodiments of the disclosure may have the same meaning or may have different meanings; their particular meaning should be determined by their explanation in the embodiment or by the context of the embodiment.
It should be understood that although the terms first, second, third, etc. may be used herein to describe various information, such information should not be limited by these terms. These terms are only used to distinguish one type of information from another. For example, first information may also be referred to as second information and, similarly, second information may also be referred to as first information, without departing from the scope herein. Depending on the context, the word "if" as used herein may be interpreted as "when", "upon", or "in response to determining". Also, as used herein, the singular forms "a", "an" and "the" are intended to include the plural forms as well, unless the context indicates otherwise. It will be further understood that the terms "comprises," "comprising," "includes" and/or "including," when used in this specification, specify the presence of stated features, steps, operations, elements, components, items, species, and/or groups, but do not preclude the presence or addition of one or more other features, steps, operations, elements, components, items, species, and/or groups thereof. The terms "or" and "and/or" as used herein are to be construed as inclusive, meaning any one or any combination. Thus, "A, B or C" or "A, B and/or C" means "any of the following: A; B; C; A and B; A and C; B and C; A, B and C". An exception to this definition occurs only when a combination of elements, functions, steps or operations is inherently mutually exclusive in some way.
It should be understood that, although the steps in the flowcharts in the embodiments of the present application are shown in the order indicated by the arrows, they are not necessarily performed in that order. Unless explicitly stated herein, the steps are not strictly limited to the order shown and may be performed in other orders. Moreover, at least some of the steps in the figures may include multiple sub-steps or stages, which are not necessarily performed at the same moment but may be performed at different moments and in different orders, and which may be performed alternately or in turns with other steps or with sub-steps or stages of other steps.
It should be noted that, step numbers such as 210, 220, etc. are used herein for the purpose of more clearly and briefly describing the corresponding content, and do not constitute a substantial limitation on the sequence, and those skilled in the art may perform 220 and then 210, etc. in the specific implementation, but these should be within the protection scope of the present application.
It should be understood that the specific embodiments described herein are merely illustrative of the present application and are not intended to limit the present application.
In the following description, suffixes such as "module", "component", or "unit" used to denote elements are used only for the convenience of description of the present application, and have no specific meaning in themselves. Thus, "module", "component" or "unit" may be used mixedly.
The mobile terminal may be implemented in various forms. For example, the mobile terminal described in the present application may include mobile terminals such as a mobile phone, a tablet computer, a notebook computer, a palmtop computer, a Personal Digital Assistant (PDA), a Portable Media Player (PMP), a navigation device, a wearable device, a smart band, a pedometer, and the like, and fixed terminals such as a Digital TV, a desktop computer, and the like.
The following description will be given taking a mobile terminal as an example, and it will be understood by those skilled in the art that the configuration according to the embodiment of the present application can be applied to a fixed type terminal in addition to elements particularly used for mobile purposes.
Referring to fig. 1, which is a schematic diagram of a hardware structure of a mobile terminal for implementing various embodiments of the present application, the mobile terminal 100 may include: RF (Radio Frequency) unit 101, WiFi module 102, audio output unit 103, a/V (audio/video) input unit 104, sensor 105, display unit 106, user input unit 107, interface unit 108, memory 109, processor 110, and power supply 111. Those skilled in the art will appreciate that the mobile terminal architecture shown in fig. 1 is not intended to be limiting of mobile terminals, which may include more or fewer components than those shown, or some components may be combined, or a different arrangement of components.
The following describes each component of the mobile terminal in detail with reference to fig. 1:
the radio frequency unit 101 may be configured to receive and transmit signals during information transmission and reception or during a call, and specifically, receive downlink information of a base station and then process the downlink information to the processor 110; in addition, the uplink data is transmitted to the base station. Typically, radio frequency unit 101 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like. In addition, the radio frequency unit 101 can also communicate with a network and other devices through wireless communication. The wireless communication may use any communication standard or protocol, including but not limited to GSM (Global System for Mobile communications), GPRS (General Packet Radio Service), CDMA2000(Code Division Multiple Access 2000), WCDMA (Wideband Code Division Multiple Access), TD-SCDMA (Time Division-Synchronous Code Division Multiple Access), FDD-LTE (Frequency Division duplex Long Term Evolution), and TDD-LTE (Time Division duplex Long Term Evolution).
WiFi belongs to short-distance wireless transmission technology, and the mobile terminal can help a user to receive and send e-mails, browse webpages, access streaming media and the like through the WiFi module 102, and provides wireless broadband internet access for the user. Although fig. 1 shows the WiFi module 102, it is understood that it does not belong to the essential constitution of the mobile terminal, and may be omitted entirely as needed within the scope not changing the essence of the invention.
The audio output unit 103 may convert audio data received by the radio frequency unit 101 or the WiFi module 102 or stored in the memory 109 into an audio signal and output as sound when the mobile terminal 100 is in a call signal reception mode, a call mode, a recording mode, a voice recognition mode, a broadcast reception mode, or the like. Also, the audio output unit 103 may also provide audio output related to a specific function performed by the mobile terminal 100 (e.g., a call signal reception sound, a message reception sound, etc.). The audio output unit 103 may include a speaker, a buzzer, and the like.
The A/V input unit 104 is used to receive audio or video signals. The A/V input unit 104 may include a Graphics Processing Unit (GPU) 1041 and a microphone 1042. The graphics processor 1041 processes image data of still pictures or video obtained by an image capturing device (e.g., a camera) in a video capturing mode or an image capturing mode. The processed image frames may be displayed on the display unit 106, stored in the memory 109 (or another storage medium), or transmitted via the radio frequency unit 101 or the WiFi module 102. The microphone 1042 may receive sound (audio data) in a phone call mode, a recording mode, a voice recognition mode, or the like, and can process such sound into audio data. In the case of a phone call mode, the processed audio (voice) data may be converted into a format that can be transmitted to a mobile communication base station via the radio frequency unit 101 and output. The microphone 1042 may implement various types of noise cancellation (or suppression) algorithms to cancel (or suppress) noise or interference generated while receiving and transmitting audio signals.
The mobile terminal 100 also includes at least one sensor 105, such as a light sensor, a motion sensor, and other sensors. Specifically, the light sensor includes an ambient light sensor that may optionally adjust the brightness of the display panel 1061 according to the brightness of ambient light, and a proximity sensor that may turn off the display panel 1061 and/or the backlight when the mobile terminal 100 is moved to the ear. As one of the motion sensors, the accelerometer sensor can detect the magnitude of acceleration in each direction (generally, three axes), can detect the magnitude and direction of gravity when stationary, and can be used for applications of recognizing the posture of a mobile phone (such as horizontal and vertical screen switching, related games, magnetometer posture calibration), vibration recognition related functions (such as pedometer and tapping), and the like; as for other sensors such as a fingerprint sensor, a pressure sensor, an iris sensor, a molecular sensor, a gyroscope, a barometer, a hygrometer, a thermometer, and an infrared sensor, which can be configured on the mobile phone, further description is omitted here.
The display unit 106 is used to display information input by a user or information provided to the user. The Display unit 106 may include a Display panel 1061, and the Display panel 1061 may be configured in the form of a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED), or the like.
The user input unit 107 may be used to receive input numeric or character information and generate key signal inputs related to user settings and function control of the mobile terminal. Specifically, the user input unit 107 may include a touch panel 1071 and other input devices 1072. The touch panel 1071, also referred to as a touch screen, may collect a touch operation performed by a user on or near the touch panel 1071 (e.g., an operation performed by the user on or near the touch panel 1071 using a finger, a stylus, or any other suitable object or accessory), and drive a corresponding connection device according to a predetermined program. The touch panel 1071 may include two parts of a touch detection device and a touch controller. Optionally, the touch detection device detects a touch orientation of a user, detects a signal caused by a touch operation, and transmits the signal to the touch controller; the touch controller receives touch information from the touch sensing device, converts the touch information into touch point coordinates, sends the touch point coordinates to the processor 110, and can receive and execute commands sent by the processor 110. In addition, the touch panel 1071 may be implemented in various types, such as a resistive type, a capacitive type, an infrared ray, and a surface acoustic wave. In addition to the touch panel 1071, the user input unit 107 may include other input devices 1072. In particular, other input devices 1072 may include, but are not limited to, one or more of a physical keyboard, function keys (e.g., volume control keys, switch keys, etc.), a trackball, a mouse, a joystick, and the like, and are not limited to these specific examples.
Further, the touch panel 1071 may cover the display panel 1061, and when the touch panel 1071 detects a touch operation thereon or nearby, the touch panel 1071 transmits the touch operation to the processor 110 to determine the type of the touch event, and then the processor 110 provides a corresponding visual output on the display panel 1061 according to the type of the touch event. Although the touch panel 1071 and the display panel 1061 are shown in fig. 1 as two separate components to implement the input and output functions of the mobile terminal, in some embodiments, the touch panel 1071 and the display panel 1061 may be integrated to implement the input and output functions of the mobile terminal, and is not limited herein.
The interface unit 108 serves as an interface through which at least one external device is connected to the mobile terminal 100. For example, the external device may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like. The interface unit 108 may be used to receive input (e.g., data information, power, etc.) from external devices and transmit the received input to one or more elements within the mobile terminal 100 or may be used to transmit data between the mobile terminal 100 and external devices.
The memory 109 may be used to store software programs as well as various data. The memory 109 may mainly include a program storage area and a data storage area; optionally, the program storage area may store an operating system, an application program required by at least one function (such as a sound playing function, an image playing function, and the like), and so on, while the data storage area may store data created according to the use of the mobile phone (such as audio data, a phonebook, etc.). Further, the memory 109 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other non-volatile solid-state storage device.
The processor 110 is a control center of the mobile terminal, connects various parts of the entire mobile terminal using various interfaces and lines, and performs various functions of the mobile terminal and processes data by operating or executing software programs and/or modules stored in the memory 109 and calling data stored in the memory 109, thereby performing overall monitoring of the mobile terminal. Processor 110 may include one or more processing units; preferably, the processor 110 may integrate an application processor and a modem processor, optionally, the application processor mainly handles operating systems, user interfaces, application programs, etc., and the modem processor mainly handles wireless communications. It will be appreciated that the modem processor described above may not be integrated into the processor 110.
The mobile terminal 100 may further include a power supply 111 (e.g., a battery) for supplying power to various components, and preferably, the power supply 111 may be logically connected to the processor 110 via a power management system, so as to manage charging, discharging, and power consumption management functions via the power management system.
Although not shown in fig. 1, the mobile terminal 100 may further include a bluetooth module or the like, which is not described in detail herein.
In order to facilitate understanding of the embodiments of the present application, a communication network system on which the mobile terminal of the present application is based is described below.
Referring to fig. 2, fig. 2 is an architecture diagram of a communication Network system according to an embodiment of the present disclosure, where the communication Network system is an LTE system of a universal mobile telecommunications technology, and the LTE system includes a UE (User Equipment) 201, an E-UTRAN (Evolved UMTS Terrestrial Radio Access Network) 202, an EPC (Evolved Packet Core) 203, and an IP service 204 of an operator, which are in communication connection in sequence.
Specifically, the UE201 may be the terminal 100 described above, and is not described herein again.
The E-UTRAN202 includes eNodeB2021 and other eNodeBs 2022, among others. Alternatively, the eNodeB2021 may be connected with other enodebs 2022 through a backhaul (e.g., X2 interface), the eNodeB2021 is connected to the EPC203, and the eNodeB2021 may provide the UE201 access to the EPC 203.
The EPC203 may include an MME (Mobility Management Entity) 2031, an HSS (Home Subscriber Server) 2032, other MMEs 2033, an SGW (Serving gateway) 2034, a PGW (PDN gateway) 2035, and a PCRF (Policy and Charging Rules Function) 2036, and the like. Optionally, the MME2031 is a control node that handles signaling between the UE201 and the EPC203, providing bearer and connection management. HSS2032 is used to provide registers to manage functions such as home location register (not shown) and holds subscriber specific information about service characteristics, data rates, etc. All user data may be sent through SGW2034, PGW2035 may provide IP address assignment for UE201 and other functions, and PCRF2036 is a policy and charging control policy decision point for traffic data flow and IP bearer resources, which selects and provides available policy and charging control decisions for a policy and charging enforcement function (not shown).
The IP services 204 may include the internet, intranets, IMS (IP Multimedia Subsystem), or other IP services, among others.
Although the LTE system is described as an example, it should be understood by those skilled in the art that the present application is not limited to the LTE system, but may also be applied to other wireless communication systems, such as GSM, CDMA2000, WCDMA, TD-SCDMA, and future new network systems.
Based on the above mobile terminal hardware structure and communication network system, various embodiments of the present application are provided.
First embodiment
Fig. 3 is a flowchart of an interaction method according to the first embodiment. Referring to fig. 3, the interaction method of this embodiment is applied to a mobile terminal and includes:
Step 210: in response to a first operation, entering a preset interface of the desktop.
Optionally, the preset interface may be the negative one screen, a lock-screen interface, a screen-off interface, a system interface or an application interface.
Optionally, the desktop is the interface entered after the screen is unlocked, and generally displays application icons and some related information, such as time and weather. The first operation includes at least one of a sliding operation along a preset direction on the desktop, a clicking operation on a preset area of the desktop, and a click on a preset guide button. That is, the user slides on the desktop in a specified direction, for example slides to the left, and enters the negative one screen on the right side of the desktop; or a guide button for entering the negative one screen is displayed on the desktop, and clicking the button enters the negative one screen; or an operation area for switching to the negative one screen is set on the desktop, and clicking the operation area enters the negative one screen.
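As a hedged illustration of the first operation just described, the following Kotlin sketch, with assumed type and identifier names, shows one way the three operations (a swipe in a preset direction, a tap on a preset operation area, a tap on a guide button) could be mapped to entering the preset interface; it is not the application's implementation.

```kotlin
// Hypothetical model of the "first operation"; names are illustrative only.
sealed class FirstOperation {
    data class Swipe(val direction: String) : FirstOperation()  // e.g. "left"
    data class TapArea(val areaId: String) : FirstOperation()   // preset operation area on the desktop
    object TapGuideButton : FirstOperation()                    // preset guide button on the desktop
}

class Desktop(private val presetSwipeDirection: String = "left") {
    // Returns true when the operation should open the preset interface (e.g. the negative one screen).
    fun shouldEnterPresetInterface(op: FirstOperation): Boolean = when (op) {
        is FirstOperation.Swipe -> op.direction == presetSwipeDirection
        is FirstOperation.TapArea -> op.areaId == "negative-one-screen-switch"
        FirstOperation.TapGuideButton -> true
    }
}

fun main() {
    val desktop = Desktop()
    println(desktop.shouldEnterPresetInterface(FirstOperation.Swipe("left")))   // true
    println(desktop.shouldEnterPresetInterface(FirstOperation.Swipe("up")))     // false
    println(desktop.shouldEnterPresetInterface(FirstOperation.TapGuideButton))  // true
}
```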
Step 220: displaying an interaction area on the preset interface, wherein at least one type of information to be interacted with is displayed in the interaction area.
Referring to fig. 4, after the negative one screen 11 is entered, the negative one screen 11 has an interaction area 12. The interaction area 12 may be displayed at a designated position in the negative one screen 11, or at an adjustable position in the negative one screen 11, so that the user can adjust the position of the interaction area 12 according to habit. At least one type of information to be interacted with is displayed in the interaction area 12. The information to be interacted with includes information to be triggered, such as points and coupons, and may also be, for example, an unfinished game process on the negative one screen. Optionally, when the information to be interacted with is information to be triggered, each type includes at least one piece of information to be triggered 13. Optionally, the information to be triggered 13 includes at least one of a coupon benefit, a points benefit and a task benefit; information to be triggered of the same type has the same trigger condition, and optionally the trigger condition includes at least one of real-time triggering, timed triggering and countdown triggering.
Taking the negative one screen 11 shown in fig. 4 as an example, five pieces of information to be triggered 13 are displayed in the interaction area 12: a points benefit, a coupon benefit, a timed-trigger benefit, a countdown-trigger benefit and a task benefit. Optionally, the points benefit is used to trigger points and is intended to build user stickiness: by giving away points, users are attracted to trigger them. The coupon benefit is used to trigger a coupon of a third-party application, so that in-app benefits of the third-party application are exposed, which helps promote the third-party application. The timed-trigger benefit is a benefit that can be triggered only after a specified time, such as 13:00; it may be a points benefit, a coupon benefit or a task benefit, and is shown in the figure as a super coupon, i.e., a coupon benefit. The countdown-trigger benefit is a benefit that can be triggered only after a certain countdown from the current time, and may likewise be a points benefit, a coupon benefit or a task benefit. The task benefit is a benefit that can be triggered only after a specified task is performed; for example, a benefit may be triggered by sharing a product link with a specified number of users.
The five pieces of information to be triggered 13 fall into three categories. The first is real-time-triggered information: as shown in fig. 5(a), real-time-triggered information does not display a trigger condition such as a trigger time, so the points benefit, the coupon benefit and the task benefit shown in fig. 4 are all real-time-triggered information and can be triggered by the user at any time. The second is timed-trigger information: as shown in fig. 5(b), a timed-trigger benefit displays its trigger condition, for example that it can be triggered only after 13:00; the trigger time is set by the background, which facilitates concentrated exposure of brand advertisers' advertisements. The third is countdown-trigger information: as shown in fig. 5(c), a countdown-trigger benefit displays its trigger condition, for example a countdown of 2 minutes; the countdown duration is set by the background, which helps increase the time the user stays on the negative one screen 11 and the frequency with which the negative one screen 11 is used.
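A minimal sketch, under assumed names, of how the three trigger-condition categories above could be evaluated: real-time benefits are always triggerable, while timed and countdown benefits become triggerable, and stop displaying their condition, once their time arrives. This is an illustrative reading of the description, not the application's code.

```kotlin
import java.time.Duration
import java.time.LocalTime

// Illustrative trigger-condition model (assumed names, not taken from the application).
sealed class TriggerCondition {
    object RealTime : TriggerCondition()
    data class Timed(val triggerAt: LocalTime) : TriggerCondition()
    data class Countdown(val readyAt: LocalTime) : TriggerCondition()  // the moment the countdown ends
}

// Real-time benefits can be triggered at any time; timed and countdown benefits only after their condition is met.
fun isTriggerable(condition: TriggerCondition, now: LocalTime): Boolean = when (condition) {
    TriggerCondition.RealTime -> true
    is TriggerCondition.Timed -> !now.isBefore(condition.triggerAt)
    is TriggerCondition.Countdown -> !now.isBefore(condition.readyAt)
}

// A timed or countdown benefit that has become triggerable behaves like a real-time one and shows no condition.
fun displayedCondition(condition: TriggerCondition, now: LocalTime): String? = when {
    isTriggerable(condition, now) -> null
    condition is TriggerCondition.Timed -> "after ${condition.triggerAt}"
    condition is TriggerCondition.Countdown -> "in ${Duration.between(now, condition.readyAt).toMinutes()} min"
    else -> null
}

fun main() {
    val timed = TriggerCondition.Timed(LocalTime.of(13, 0))
    println(isTriggerable(timed, LocalTime.of(12, 30)))       // false
    println(displayedCondition(timed, LocalTime.of(12, 30)))  // "after 13:00"
    println(isTriggerable(timed, LocalTime.of(13, 5)))        // true
}
```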
Therefore, by aggregating different types of information, content such as points, coupons and tasks is added, which can improve user stickiness and the interaction rate and increase the activity of the negative one screen, while also providing a fixed information display position for third-party applications and platforms, with a good advertising effect.
When the information 13 is displayed on the negative one screen 11, at least one type of information to be triggered is displayed in the interaction area 12 as icons according to the trigger condition of the information 13, and information content is displayed at the icon of each piece of information 13, where the information content optionally includes at least one of copy text, an information source and a trigger condition. Taking the information shown in fig. 5(b) as an example, the icon of the information is a red-packet-shaped pattern, but it is not limited thereto; different icons may be used for different information, and according to the trigger condition the icon may be displayed normally, in grayscale or with transparency to indicate different states. The copy text of the information is "super coupon", which generally refers to a coupon with a higher discount amount or a lower usage threshold, and may be set or omitted as required. The information source of the information is "timed-trigger benefit", which refers to a platform that uniformly pushes timed-trigger information to the mobile terminal; the platform may collect information from each third-party application and push it at scheduled times. Of course, the information source may also be the direct source of the information, such as the third-party application or the platform that provides the information. The trigger condition of the information is "13:00", which may be displayed on the icon of the information; the information takes effect and can be triggered after 13:00, after which it becomes real-time-triggered information and the trigger condition is no longer displayed.
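The following Kotlin sketch, with assumed names, illustrates the icon behavior described above: each icon carries copy text, an information source and an optional trigger condition, and its visual state (for example normal versus grayscale) can be derived from whether the benefit is already triggerable. The specific state mapping is an assumption made for illustration.

```kotlin
import java.time.LocalTime

// Assumed, simplified view model for one icon in the interaction area.
enum class IconState { NORMAL, GRAYSCALE, TRANSPARENT }

data class IconContent(
    val copy: String,       // e.g. "Super coupon"
    val source: String,     // e.g. the "timed-trigger benefit" platform or a third-party application
    val condition: String?  // e.g. "13:00"; null once the benefit is triggerable in real time
)

// Choose how the red-packet-style icon is rendered from its trigger time (null means a real-time benefit).
fun iconFor(copy: String, source: String, triggerAt: LocalTime?, now: LocalTime): Pair<IconContent, IconState> {
    val triggerable = triggerAt == null || !now.isBefore(triggerAt)
    val content = IconContent(
        copy = copy,
        source = source,
        condition = if (triggerable) null else triggerAt?.toString()
    )
    // One possible mapping: triggerable benefits render normally, pending ones in grayscale.
    val state = if (triggerable) IconState.NORMAL else IconState.GRAYSCALE
    return content to state
}

fun main() {
    val (content, state) = iconFor("Super coupon", "Timed-trigger benefit", LocalTime.of(13, 0), LocalTime.of(12, 0))
    println("$content -> $state")  // condition "13:00", GRAYSCALE before 13:00
}
```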
In this embodiment, information is pushed according to the user's information-triggering behavior. For example, the information provider, combined with the CTR (Click-Through Rate) estimation engine of the advertising system, recommends information to the users most likely to trigger it according to their personalized behavior, so that the delivery of information is combined with user behavior and becomes more accurate. Specifically, at least one piece of information to be triggered is determined according to historical information-triggering behavior, where optionally the at least one piece of information to be triggered includes at least one piece of points information, and then at least one type of information to be triggered is displayed in the interaction area according to its trigger condition. That is, at least one piece of information to be triggered is determined according to the user's historical triggering behavior; for example, if the user prefers to trigger coupons, and prefers to trigger them within a specified time period, a timed-trigger coupon benefit can be provided, and the information is then displayed in the interaction area according to its trigger time.
When at least one piece of information to be triggered is determined according to historical triggering behavior: if the interaction area is displayed for the first time, at least one piece of default information is acquired as the at least one piece of information to be triggered, where optionally the default information includes at least one piece of points information; and/or, if the interaction area is not displayed for the first time, at least one piece of information to be triggered is determined according to historical triggering behavior, where optionally it includes at least one piece of points information. That is, when the negative one screen is entered or the user opens the interaction area for the first time, default information including points information is preset; after the user triggers the points or other information, the user may enter the negative one screen again to continue triggering information, which can improve user stickiness and the activity of the preset interface. If the interaction area is not displayed for the first time, at least one piece of information to be triggered is determined according to historical triggering behavior, optionally including at least one piece of points information: if the user has no historical triggering behavior, default information is still pushed; if the user does have historical triggering behavior, information related to that behavior is pushed, which increases the probability that the user triggers information. Points information can be included in what is pushed, which helps maintain user stickiness.
In actual implementation, when at least one piece of information to be triggered is determined according to historical triggering behavior (optionally including at least one piece of points information), the at least one piece of information to be triggered is also screened according to the priority of the information, and optionally the screening result includes at least one piece of points information. For example, based on the user's behavior, five pieces of information are displayed with points information as the fallback; if coupon-type information exists, it is used preferentially, while at least one piece of points information is still guaranteed.
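The selection logic described in the last three paragraphs (default points information on first display or when there is no history, behavior-based recommendation otherwise, and priority screening that guarantees at least one points item) could be sketched as follows. The types, the ordering rule and the fallback strategy are assumptions for illustration, not the application's CTR-based recommendation engine.

```kotlin
// Assumed, simplified types for the selection and screening logic described above.
enum class InfoType { COUPON, POINTS, TASK }
data class Candidate(val type: InfoType, val title: String, val priority: Int)  // higher priority = more important

private val defaultInfo = listOf(
    Candidate(InfoType.POINTS, "Welcome points", priority = 1)  // the default set always contains points info
)

// Pick up to [slots] items: defaults on first display or without history, otherwise behavior-related
// candidates screened by priority, with points information guaranteed as the fallback.
fun selectInfoToTrigger(
    firstDisplay: Boolean,
    history: List<InfoType>,   // types the user has triggered before
    candidates: List<Candidate>,
    slots: Int = 5
): List<Candidate> {
    if (firstDisplay || history.isEmpty()) return defaultInfo

    val preferredTypes = history.toSet()
    val screened = candidates
        .sortedWith(compareByDescending<Candidate> { it.type in preferredTypes }
            .thenByDescending { it.priority })
        .take(slots)
        .toMutableList()

    // Guarantee at least one points item to keep user stickiness.
    if (screened.none { it.type == InfoType.POINTS }) {
        val points = candidates.firstOrNull { it.type == InfoType.POINTS } ?: defaultInfo.first()
        if (screened.size >= slots && screened.isNotEmpty()) screened.removeAt(screened.lastIndex)
        screened.add(points)
    }
    return screened
}

fun main() {
    val candidates = listOf(
        Candidate(InfoType.COUPON, "Store coupon", 3),
        Candidate(InfoType.TASK, "Share a product link", 2),
        Candidate(InfoType.POINTS, "Daily points", 1)
    )
    println(selectInfoToTrigger(firstDisplay = false, history = listOf(InfoType.COUPON), candidates = candidates))
}
```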
As described above, in the interaction method of this embodiment, in response to a first operation, a preset interface of the desktop is entered, and an interaction area is displayed on the preset interface, where at least one type of information to be interacted with is displayed. By this method, the activity of the preset interface can be improved based on the information interaction scene of the preset interface.
Second embodiment
Fig. 6 is a flowchart of an interaction method according to the second embodiment. Referring to fig. 6, the interaction method of this embodiment is applied to a mobile terminal and includes:
Step 310: in response to a first operation, entering a preset interface of the desktop, where optionally the preset interface may be the negative one screen, a lock-screen interface, a screen-off interface, a system interface, an application interface or the like.
Step 320: displaying an interaction area on the preset interface, wherein at least one type of information to be interacted with is displayed in the interaction area.
Step 330: in response to an operation of clicking the information, triggering the corresponding information and/or displaying a detail page of the corresponding information.
Optionally, details of the implementation processes of step 310 and step 320 are described in the first embodiment, and are not described herein again.
In step 330, the operation of clicking the information includes, but is not limited to, clicking an information icon. Then, depending on the information, the information may be triggered directly, or a detail page of the corresponding information may be displayed, or the detail page of the information may be displayed after the information is triggered.
Fig. 7(a) shows a detail page of points information. When the currently clicked information is points information, points of a corresponding value, for example 20 points, are triggered, and/or the advertising information 15 associated with the points information is displayed. The detail page of the points information is also connected to the advertisement display system, so an advertisement display scene is added to the information page. Clicking "view details" jumps to the points mall; combined with the points mall system, the points information attracts users to click.
Fig. 7(b) shows a detail page of coupon information. When the currently clicked information is coupon information, a detail page of the coupon information is displayed, where the detail page includes at least one coupon to be triggered and/or corresponding condition information, and the condition information includes at least one of coupon product information 16, third-party application information, the coupon denomination and usage conditions.
When the currently clicked information is task information, a detail page of the task information is displayed, where the detail page includes at least one task to be executed and/or the information content of the task. Optionally, each task corresponds to one piece of information, which may be of the points type or of the ticket type; a ticket may be a no-threshold red packet or a coupon with a specified usage threshold. Clicking "trigger now" triggers the coupon and may at the same time jump to the third-party application.
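The three click behaviors above (triggering points and showing the associated copy, opening a coupon detail page, opening a task detail page) could be dispatched roughly as in the following Kotlin sketch; the types and fields are assumptions made for illustration and are not the application's code.

```kotlin
// Assumed, simplified types for the click handling described above.
sealed class Info {
    data class Points(val value: Int, val adCopy: String) : Info()
    data class Coupon(val product: String, val denomination: String, val usage: String) : Info()
    data class Task(val description: String, val rewardCopy: String) : Info()
}

sealed class ClickResult {
    data class TriggeredPoints(val value: Int, val adCopy: String) : ClickResult()
    data class CouponDetailPage(val lines: List<String>) : ClickResult()
    data class TaskDetailPage(val lines: List<String>) : ClickResult()
}

// Trigger the information directly and/or open its detail page, depending on its type.
fun onInfoClicked(info: Info): ClickResult = when (info) {
    is Info.Points -> ClickResult.TriggeredPoints(info.value, info.adCopy)  // e.g. +20 points, show associated copy
    is Info.Coupon -> ClickResult.CouponDetailPage(
        listOf(info.product, info.denomination, info.usage)                 // coupon(s) to trigger plus conditions
    )
    is Info.Task -> ClickResult.TaskDetailPage(
        listOf(info.description, info.rewardCopy)                           // task(s) to execute plus reward info
    )
}

fun main() {
    println(onInfoClicked(Info.Points(20, "Redeem in the points mall")))
    println(onInfoClicked(Info.Coupon("Store product", "10 off 50", "Valid this week")))
}
```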
After the corresponding information is triggered in response to the triggering operation, at least one piece of triggered and not-yet-expired information may be displayed in the interaction area, such as the total number of triggered points, ticket-type information and its valid time.
As described above, in the interaction method of the present application, in response to a first operation, a preset interface of the desktop is entered; an interaction area is displayed on the preset interface, at least one type of information to be interacted with is displayed in the interaction area, and, in response to an operation of clicking the information, the corresponding information is triggered and/or a detail page of the corresponding information is displayed. By this method, a better advertising effect can be obtained based on the information-triggering scene of the preset interface, while the activity of the preset interface is improved.
Third embodiment
Fig. 8 is a first schematic structural diagram of a mobile terminal according to the third embodiment. Referring to fig. 8, the terminal 80 of this embodiment includes a memory 802 and a processor 806, where the memory 802 is used to store at least one program instruction, and the processor 806 implements the methods of the first and second embodiments by loading and executing the at least one program instruction.
Referring to fig. 9, in actual implementation, the terminal 80 includes a memory 802, a memory controller 804, one or more processors 806 (only one of which is shown), a peripheral interface 808, a radio frequency module 850, a positioning module 812, a camera module 814, an audio module 816, a screen 818, and a key module 860. These components communicate with one another via one or more communication buses/signal lines 822.
It is to be understood that the configuration shown in fig. 9 is merely exemplary, and that the mobile terminal 80 may include more or fewer components than shown in fig. 9, or have a different configuration than shown in fig. 9. The components shown in fig. 9 may be implemented in hardware, software, or a combination thereof.
The memory 802 may be used to store software programs and modules, such as the program instructions/modules corresponding to the methods in the embodiments of the present application; the processor 806 executes the software programs and modules stored in the memory 802 to perform various functional applications and data processing, thereby implementing the methods described above.
The memory 802 may include high-speed random access memory, and may also include non-volatile memory, such as one or more magnetic storage devices, flash memory, or other non-volatile solid-state memory. In some examples, the memory 802 may further include memory located remotely from the processor 806, which may be connected to the terminal 80 via a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof. Access to the memory 802 by the processor 806, as well as possibly other components, may be under the control of a memory controller 804.
The peripheral interface 808 couples various input/output devices to the CPU and to the memory 802. The processor 806 executes various software and instructions within the memory 802 to perform the various functions of the terminal 80 and to process data.
In some embodiments, the peripheral interface 808, the processor 806, and the memory controller 804 may be implemented in a single chip. In other examples, they may each be implemented by a separate chip.
The rf module 850 is used for receiving and transmitting electromagnetic waves and for converting between electromagnetic waves and electrical signals, so as to communicate with a communication network or other devices. The radio frequency module 850 may include various existing circuit elements for performing these functions, such as an antenna, a radio frequency transceiver, a digital signal processor, an encryption/decryption chip, a Subscriber Identity Module (SIM) card, memory, and so forth. The rf module 850 may communicate with various networks such as the internet, an intranet or a wireless network, or communicate with other devices via a wireless network. The wireless network may comprise a cellular telephone network, a wireless local area network, or a metropolitan area network. The wireless network may use various communication standards, protocols, and technologies, including, but not limited to, Global System for Mobile Communication (GSM), Enhanced Data GSM Environment (EDGE), Wideband Code Division Multiple Access (W-CDMA), Code Division Multiple Access (CDMA), Time Division Multiple Access (TDMA), Bluetooth, Wireless Fidelity (WiFi) (e.g., IEEE 802.11a, IEEE 802.11b, IEEE 802.11g and/or IEEE 802.11n), Voice over Internet Protocol (VoIP), Worldwide Interoperability for Microwave Access (WiMAX), other suitable protocols for e-mail, instant messaging and short message service, and even protocols that have not yet been developed.
The positioning module 812 is used for acquiring the current position of the terminal 80. Examples of the positioning module 812 include, but are not limited to, a global positioning satellite system (GPS), a wireless local area network-based positioning technology, or a mobile communication network-based positioning technology.
The camera module 814 is used to take pictures or videos. The pictures or videos taken may be stored in the memory 802 and may be transmitted through the radio frequency module 850.
The audio module 816 provides an audio interface to the user, which may include one or more microphones, one or more speakers, and audio circuitry. The audio circuitry receives audio data from the peripheral interface 808, converts the audio data to electrical information, and transmits the electrical information to the speaker. The speaker converts the electrical information into sound waves that the human ear can hear. The audio circuitry also receives electrical information from the microphone, converts the electrical information to voice data, and transmits the voice data to the peripheral interface 808 for further processing. The audio data may be retrieved from the memory 802 or through the radio frequency module 850. In addition, the audio data may also be stored in the memory 802 or transmitted through the radio frequency module 850. In some examples, the audio module 816 may also include a headphone jack for providing an audio interface to headphones or other devices.
The screen 818 provides an output interface between the terminal 80 and the user. In particular, screen 818 displays video output to the user, the content of which may include text, graphics, video, and any combination thereof. Some of the output results are for some of the user interface objects. It is understood that the screen 818 may also include a touch screen. The touch screen provides both an output and an input interface between the terminal 80 and the user. In addition to displaying video output to users, touch screens also receive user input, such as user clicks, swipes, and other gesture operations, so that user interface objects respond to these user input. The technique of detecting user input may be based on resistive, capacitive, or any other possible touch detection technique. Specific examples of touch screen display units include, but are not limited to, liquid crystal displays or light emitting polymer displays.
The key module 860 also provides an interface for user input to the mobile terminal 80, and the user may cause the mobile terminal 80 to perform different functions by pressing different keys.
In actual implementation, the computer storage medium described below may be applied to the mobile terminal shown in fig. 8 or fig. 9, so as to improve the activity of the preset interface.
The present application further provides a mobile terminal device, where the terminal device includes a memory, a processor, and a desktop interactive program stored in the memory and capable of running on the processor, and the desktop interactive program implements the steps of the method in any of the above embodiments when executed by the processor.
The present application further provides a computer-readable storage medium, in which a desktop interactive program is stored, and when being executed by a processor, the desktop interactive program implements the steps of the method in any of the above embodiments.
The embodiments of the mobile terminal and the computer-readable storage medium provided in the present application include all the technical features of the embodiments of the above interaction method; the expansion and explanation of their contents are substantially the same as those of the method embodiments described above, and are not repeated here.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solutions of the present application may be embodied in the form of a software product, which is stored in a storage medium (e.g., ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal device (e.g., a mobile phone, a computer, a server, a controlled terminal, or a network device) to execute the method of each embodiment of the present application.
The above description is only a preferred embodiment of the present application, and not intended to limit the scope of the present application, and all modifications of equivalent structures and equivalent processes, which are made by the contents of the specification and the drawings of the present application, or which are directly or indirectly applied to other related technical fields, are included in the scope of the present application.
Embodiments of the present application also provide a computer program product, which includes computer program code; when the computer program code runs on a computer, the computer is caused to execute the method in any of the above possible embodiments.
Embodiments of the present application further provide a chip, which includes a memory and a processor, where the memory is used to store a computer program, and the processor is used to call and run the computer program from the memory, so that a device in which the chip is installed executes the method in the above various possible embodiments.
Other embodiments of the present application will be apparent to those skilled in the art from consideration of the specification and practice of the application disclosed herein. The embodiments of the present application are intended to cover any variations, uses, or adaptations of the application following, in general, the principles of the application and including such departures from the present disclosure as come within known or customary practice within the art to which the application pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the application being indicated by the following claims.
It will be understood that the present application is not limited to the precise arrangements described above and shown in the drawings and that various modifications and changes may be made without departing from the scope thereof. The scope of the application is limited only by the appended claims.

Claims (12)

1. An interaction method, applied to a mobile terminal, characterized by comprising the following steps:
in response to a first operation, entering a preset interface of a desktop; and
displaying an interaction area on the preset interface, wherein at least one type of information to be interacted is displayed in the interaction area.
2. The interaction method according to claim 1, wherein the first operation comprises at least one of: a sliding operation on the desktop along a preset direction, a clicking operation on a preset area of the desktop, and a click on a preset guide button.
3. The interaction method according to claim 1, wherein the information to be interacted includes information to be triggered, each type of information to be triggered comprises at least one piece of information to be triggered, and the same type of information to be triggered has the same triggering condition.
4. The interaction method according to claim 3, wherein displaying at least one type of information to be interacted in the interaction area comprises:
displaying the at least one type of information to be triggered in the interaction area in the form of icons according to the triggering conditions of the information;
and displaying information content at the icons.
5. The interaction method according to claim 3, wherein displaying at least one type of information to be interacted in the interaction area comprises:
determining at least one piece of information to be triggered according to the historical behavior of the triggering information;
and displaying the at least one type of information to be triggered in the interaction area according to the triggering condition of the at least one type of information to be triggered.
6. The interaction method according to claim 5, wherein the determining at least one piece of information to be triggered according to the historical behavior of the triggering information comprises:
if the interaction area is displayed for the first time, acquiring at least one piece of default information as the at least one piece of information to be triggered; and/or,
and if the interaction area is not displayed for the first time, determining the at least one piece of information to be triggered according to the historical behavior of triggering information.
7. The interaction method according to claim 5, wherein the determining at least one piece of information to be triggered according to the historical behavior of the triggering information comprises:
determining at least one piece of information to be triggered according to the historical behavior of the triggering information;
and screening the at least one piece of information to be triggered according to the priority of the information.
8. The interaction method according to any one of claims 1 to 7, wherein the displaying an interaction area on the preset interface comprises:
displaying the interaction area at a designated position or an adjustable position of the preset interface.
9. The interaction method according to any one of claims 1 to 7, wherein after the step of displaying at least one type of information to be interacted in the interaction area, the method further comprises:
in response to an operation of clicking information, triggering the corresponding information and/or displaying a detail page of the corresponding information.
10. The interaction method according to claim 9, wherein the triggering of the corresponding information and/or the displaying of the detail page of the corresponding information comprises at least one of:
when the currently clicked information is points information, triggering points of a corresponding numerical value and/or displaying the description information related to the points information; or
when the currently clicked information is coupon information, displaying a detail page of the coupon information, wherein the detail page comprises at least one coupon to be triggered and/or corresponding condition information; or
and when the currently clicked information is task information, displaying a detail page of the task information, wherein the detail page comprises at least one task to be executed and/or the information content of the task.
11. A mobile terminal, comprising: a memory and a processor;
the memory stores at least one program instruction;
the processor implementing the method of interaction of any one of claims 1 to 10 by loading and executing the at least one program instruction.
12. A computer storage medium having computer program instructions stored thereon; the computer program instructions, when executed by a processor, implement a method of interaction as claimed in any one of claims 1 to 10.
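By way of illustration only, the following is a minimal sketch of the selection logic described in claims 5 to 7: information to be triggered is determined from the user's historical triggering behavior, default information is used when the interaction area is displayed for the first time, and the candidates are screened by priority. The data model, the priority scheme, and the limit parameter are assumptions introduced for the example; the claims do not prescribe them. The sketch uses Kotlin, assuming the Android terminal discussed above.

// Claims 5-7: choose the information to be triggered from historical behavior,
// falling back to defaults on first display, then screen by priority.
data class TriggerCandidate(val title: String, val type: String, val priority: Int)

fun selectInformationToTrigger(
    firstDisplay: Boolean,
    history: List<TriggerCandidate>,
    defaults: List<TriggerCandidate>,
    limit: Int = 3
): List<TriggerCandidate> {
    // First display of the interaction area: use default information (claim 6).
    val candidates = if (firstDisplay || history.isEmpty()) defaults else history
    // Screen the candidates according to the priority of the information (claim 7).
    return candidates.sortedByDescending { it.priority }.take(limit)
}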
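Similarly, a minimal sketch of the click handling described in claims 9 and 10: clicking a piece of information triggers it and/or opens its detail page, depending on whether it is points, coupon, or task information. The sealed type and the helper functions are hypothetical placeholders, not part of the claims; a real desktop interactive program would route them to its own points, coupon, and task services.

// Claims 9-10: respond to a click on information by triggering the corresponding
// information and/or displaying its detail page.
sealed interface ClickedInformation
data class PointsInfo(val value: Int) : ClickedInformation
data class CouponInfo(val coupons: List<String>) : ClickedInformation
data class TaskInfo(val tasks: List<String>) : ClickedInformation

fun onInformationClicked(info: ClickedInformation) = when (info) {
    is PointsInfo -> creditPoints(info.value)       // trigger points of the corresponding value
    is CouponInfo -> showCouponDetailPage(info)     // detail page: coupons to be triggered + conditions
    is TaskInfo -> showTaskDetailPage(info)         // detail page: tasks to be executed + their content
}

// Hypothetical hooks standing in for the terminal's own services.
fun creditPoints(value: Int) { println("credited $value points") }
fun showCouponDetailPage(info: CouponInfo) { println("coupons: ${info.coupons}") }
fun showTaskDetailPage(info: TaskInfo) { println("tasks: ${info.tasks}") }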
CN202010870364.0A 2020-08-26 2020-08-26 Interactive method, mobile terminal and storage medium Pending CN112068913A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010870364.0A CN112068913A (en) 2020-08-26 2020-08-26 Interactive method, mobile terminal and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010870364.0A CN112068913A (en) 2020-08-26 2020-08-26 Interactive method, mobile terminal and storage medium

Publications (1)

Publication Number Publication Date
CN112068913A true CN112068913A (en) 2020-12-11

Family

ID=73660419

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010870364.0A Pending CN112068913A (en) 2020-08-26 2020-08-26 Interactive method, mobile terminal and storage medium

Country Status (1)

Country Link
CN (1) CN112068913A (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2015521763A (en) * 2012-08-14 2015-07-30 Xiaomi Inc. Mobile terminal desktop system, interface interaction method, apparatus, program, and recording medium
CN106134163A (en) * 2016-06-22 2016-11-16 北京小米移动软件有限公司 Method for information display, information-pushing method, Apparatus and system
CN106339231A (en) * 2016-09-20 2017-01-18 珠海市魅族科技有限公司 Pushing method and apparatus of desktop notifications
CN107526493A (en) * 2017-08-29 2017-12-29 努比亚技术有限公司 A kind of small tool display methods, equipment and computer-readable recording medium
CN109324730A (en) * 2018-09-30 2019-02-12 努比亚技术有限公司 Shortcut generation method, terminal and computer readable storage medium
CN109408163A (en) * 2018-09-07 2019-03-01 百度在线网络技术(北京)有限公司 Screen control method, appliance arrangement and computer readable storage medium
CN110275663A (en) * 2019-05-07 2019-09-24 珠海格力电器股份有限公司 Processing method, system, terminal device and the storage medium of screen multitask

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination