CN114443199A - Interface processing method, intelligent terminal and storage medium


Info

Publication number
CN114443199A
Authority
CN
China
Prior art keywords
target
target window
interface
window
main interface
Prior art date
Legal status
Pending
Application number
CN202210101264.0A
Other languages
Chinese (zh)
Inventor
李阳涛
周瑜
王翻
汪溪
Current Assignee
Shenzhen Transsion Holdings Co Ltd
Original Assignee
Shenzhen Transsion Holdings Co Ltd
Priority date
Filing date
Publication date
Application filed by Shenzhen Transsion Holdings Co Ltd
Priority to CN202210101264.0A
Publication of CN114443199A
Legal status: Pending

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00Arrangements for program control, e.g. control units
    • G06F9/06Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44Arrangements for executing specific programs
    • G06F9/451Execution arrangements for user interfaces
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04817Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance using icons
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04845Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range for image manipulation, e.g. dragging, rotation, expansion or change of colour
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M1/00Substation equipment, e.g. for use by subscribers
    • H04M1/72Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/724User interfaces specially adapted for cordless or mobile telephones
    • H04M1/72448User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions
    • H04M1/72454User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions according to context-related or environment-related conditions

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Software Systems (AREA)
  • Environmental & Geological Engineering (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The application provides an interface processing method, an intelligent terminal, and a storage medium. The method comprises the following steps: in response to a first operation, outputting a target object in a target window of a main interface; and in response to a first target instruction, adjusting a target parameter of the main interface and/or a target parameter of the target window. With the present application, the display effect of an application can be improved, thereby improving the user experience.

Description

Interface processing method, intelligent terminal and storage medium
Technical Field
The application relates to the technical field of intelligent terminals, in particular to an interface processing method, an intelligent terminal and a storage medium.
Background
At present, many intelligent terminals such as mobile phones are equipped with gravity sensors. The gravity sensor can detect changes in three-dimensional space, calculate the current gravity direction of the mobile phone, and switch the screen of the mobile phone between horizontal-screen and vertical-screen display accordingly. However, people's habits and postures when using a mobile phone differ: some are used to sitting, some to standing, some prefer lying down, and others use the phone while walking.
In the course of conceiving and implementing the present application, the inventors found at least the following problem: because the gravity sensor calculates the image display direction of the screen from its detection result and rotates the screen accordingly, the display direction of applications that are suitable only for horizontal-screen or only for vertical-screen display may be forcibly adjusted, so the display effect of such applications is poor and the user experience is degraded.
The foregoing description is provided for general background information and is not admitted to be prior art.
Disclosure of Invention
In view of the above technical problems, the present application provides an interface processing method, an intelligent terminal and a storage medium, which can improve the display effect of an application, thereby improving the user experience.
In order to solve the above technical problem, the present application provides an interface processing method, including the following steps:
S11: in response to a first operation, outputting a target object in a target window of a main interface;
S12: in response to a first target instruction, adjusting a target parameter of the main interface and/or a target parameter of the target window.
Optionally, the step S11 includes:
outputting the target object in the target window of the main interface according to operation information of the first operation.
Optionally, the operation information includes at least one of an operation position, an operation track, and an operation number.
Optionally, a trigger control is provided on the main interface, and the step S11 includes:
in response to the first operation on the trigger control, displaying a target object of the main interface in the target window.
Optionally, the step S12 includes:
identifying or determining that the processing object corresponding to the first target instruction is the main interface and/or the target window; and
adjusting the target parameter of the main interface and/or the target parameter of the target window.
Optionally, the target parameter comprises at least one of: display state, display content, display size, display color, display brightness.
Optionally, adjusting the display state of the target window includes:
adjusting the display state of the target window to a horizontal-screen state or a vertical-screen state, and locking the display state of the main interface; and/or, in response to a second target instruction, releasing the locked state of the main interface.
Optionally, the method further comprises:
identifying or determining that an interface sliding function is enabled; and
in response to a second operation, displaying the target object of the target window in a sliding manner.
Optionally, at least one of a page turning control, a switch control, a switching control and a shortcut control is arranged in the target window.
Optionally, the method further comprises at least one of:
in response to a third operation on the page turning control, displaying the content of the target object of the target window in a page-turning manner;
in response to a fourth operation on the switch control, hiding and/or closing the target window;
in response to a fifth operation on the switching control, switching the target window into a floating window; and
in response to a sixth operation on the shortcut control, adjusting the display state of the target window to a horizontal-screen state or a vertical-screen state, and/or adjusting the display state of the main interface to a horizontal-screen state or a vertical-screen state.
The application also provides an interface processing method, which comprises the following steps:
S21: in response to a target operation, displaying target information in a target window of an application interface;
S22: in response to a target instruction, adjusting a target parameter of the target window.
Optionally, the step S21 includes:
identifying or determining a sharing link corresponding to the target operation, and displaying target information corresponding to the sharing link in the target window; and/or
identifying or determining third-party information in the application interface as the target information, and displaying the target information in the target window.
Optionally, the step S22 includes:
in response to the target instruction, adjusting the display state of the target window to a first state and/or adjusting the display state of the application interface to a second state.
The application also provides an intelligent terminal, including a memory and a processor, wherein the memory stores an interface processing program which, when executed by the processor, implements the steps of any of the above methods.
The present application also provides a computer-readable storage medium, which stores a computer program that, when executed by a processor, performs the steps of the method as set forth in any one of the above.
As described above, the interface processing method of the present application outputs the target object in the target window of the main interface in response to the first operation, and adjusts the target parameter of the main interface and/or the target parameter of the target window in response to the first target instruction. With the technical solution of the present application, the display effect of the application can be improved, thereby improving the user experience.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present application and together with the description, serve to explain the principles of the application. In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings needed to be used in the description of the embodiments will be briefly described below, and it is obvious for those skilled in the art to obtain other drawings based on these drawings without inventive exercise.
Fig. 1 is a schematic hardware structure diagram of a mobile terminal implementing various embodiments of the present application;
fig. 2 is a communication network system architecture diagram according to an embodiment of the present application;
fig. 3 is a schematic interface diagram of an intelligent terminal provided in an embodiment of the present application;
fig. 4 is a schematic flowchart of a first implementation manner of an interface processing method provided in an embodiment of the present application;
FIGS. 5-20 are schematic diagrams of interfaces including a first implementation of a target window according to embodiments of the present disclosure;
fig. 21 is a schematic flowchart of a second implementation manner of an interface processing method provided in an embodiment of the present application;
FIGS. 22 and 23 are schematic diagrams of interfaces including a second implementation of a target window provided by examples of the present application;
fig. 24 is a schematic structural diagram of a first implementation of an interface processing apparatus according to an embodiment of the present application;
fig. 25 is a schematic structural diagram of a second implementation of an interface processing apparatus according to an embodiment of the present application.
The implementation, functional features and advantages of the objectives of the present application will be further explained with reference to the accompanying drawings. With the above figures, there are shown specific embodiments of the present application, which will be described in more detail below. These drawings and written description are not intended to limit the scope of the inventive concepts in any manner, but rather to illustrate the inventive concepts to those skilled in the art by reference to specific embodiments.
Detailed Description
Reference will now be made in detail to the exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, like numbers in different drawings represent the same or similar elements unless otherwise indicated. The embodiments described in the following exemplary embodiments do not represent all embodiments consistent with the present application. Rather, they are merely examples of apparatus and methods consistent with certain aspects of the present application, as detailed in the appended claims.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element recited by the phrase "comprising a ..." does not exclude the presence of additional identical elements in the process, method, article, or apparatus that comprises the element. Further, identically named components, features, or elements in different embodiments of the present disclosure may have the same meaning or different meanings; the particular meaning is to be determined by its interpretation in that embodiment or by the context of that embodiment.
It should be understood that although the terms first, second, third, etc. may be used herein to describe various information, such information should not be limited by these terms. These terms are only used to distinguish one type of information from another. For example, first information may also be referred to as second information, and similarly, second information may also be referred to as first information, without departing from the scope herein. The word "if" as used herein may be interpreted as "upon" or "when" or "in response to a determination", depending on the context. Also, as used herein, the singular forms "a", "an" and "the" are intended to include the plural forms as well, unless the context indicates otherwise. It will be further understood that the terms "comprises," "comprising," "includes" and/or "including," when used in this specification, specify the presence of stated features, steps, operations, elements, components, items, species, and/or groups, but do not preclude the presence or addition of one or more other features, steps, operations, elements, components, items, species, and/or groups thereof. The terms "or," "and/or," "including at least one of the following," and the like, as used herein, are to be construed as inclusive, meaning any one or any combination. For example, "includes at least one of A, B, and C" means any of the following: A; B; C; A and B; A and C; B and C; A and B and C. An exception to this definition occurs only when a combination of elements, functions, steps or operations is inherently mutually exclusive in some way.
It should be understood that, although the steps in the flowcharts in the embodiments of the present application are shown in the order indicated by the arrows, they are not necessarily performed in that order. Unless explicitly stated herein, the steps are not strictly limited to the order shown and may be performed in other orders. Moreover, at least some of the steps in the figures may include multiple sub-steps or stages, which are not necessarily performed at the same time but may be performed at different times, and which are not necessarily performed sequentially but may be performed in turn or alternately with other steps or with at least some of the sub-steps or stages of other steps.
The words "if", as used herein, may be interpreted as "at … …" or "at … …" or "in response to a determination" or "in response to a detection", depending on the context. Similarly, the phrases "if determined" or "if detected (a stated condition or event)" may be interpreted as "when determined" or "in response to a determination" or "when detected (a stated condition or event)" or "in response to a detection (a stated condition or event)", depending on the context.
It should be noted that step numbers such as S11 and S12 are used herein for the purpose of more clearly and briefly describing the corresponding contents, and do not constitute a substantial limitation on the sequence, and those skilled in the art may perform S12 first and then perform S11 in the specific implementation, which should be within the scope of the present application.
It should be understood that the specific embodiments described herein are merely illustrative of the present application and are not intended to limit the present application.
In the following description, suffixes such as "module", "component", or "unit" used to denote elements are used only for the convenience of description of the present application and have no specific meaning in themselves. Thus, "module", "component" and "unit" may be used interchangeably.
The smart terminal may be implemented in various forms. For example, the smart terminal described in the present application may include mobile terminals such as a mobile phone, a tablet computer, a notebook computer, a palmtop computer, a Personal Digital Assistant (PDA), a Portable Media Player (PMP), a navigation device, a wearable device, a smart band, a pedometer, and the like, and fixed terminals such as a Digital TV, a desktop computer, and the like.
The following description will be given taking a mobile terminal as an example, and it will be understood by those skilled in the art that the configuration according to the embodiment of the present application can be applied to a fixed type terminal in addition to elements particularly used for mobile purposes.
Referring to fig. 1, which is a schematic diagram of a hardware structure of a mobile terminal for implementing various embodiments of the present application, the mobile terminal 100 may include: RF (Radio Frequency) unit 101, WiFi module 102, audio output unit 103, a/V (audio/video) input unit 104, sensor 105, display unit 106, user input unit 107, interface unit 108, memory 109, processor 110, and power supply 111. Those skilled in the art will appreciate that the mobile terminal architecture shown in fig. 1 is not intended to be limiting of mobile terminals, which may include more or fewer components than those shown, or some components may be combined, or a different arrangement of components.
The following describes each component of the mobile terminal in detail with reference to fig. 1:
the radio frequency unit 101 may be configured to receive and transmit signals during information transmission and reception or during a call, and specifically, receive downlink information of a base station and then process the downlink information to the processor 110; in addition, the uplink data is transmitted to the base station. Typically, radio frequency unit 101 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like. In addition, the radio frequency unit 101 can also communicate with a network and other devices through wireless communication. The wireless communication may use any communication standard or protocol, including but not limited to GSM (Global System for Mobile communications), GPRS (General Packet Radio Service), CDMA2000(Code Division Multiple Access 2000), WCDMA (Wideband Code Division Multiple Access), TD-SCDMA (Time Division-Synchronous Code Division Multiple Access), FDD-LTE (Frequency Division duplex-Long Term Evolution), TDD-LTE (Time Division duplex-Long Term Evolution), 5G (Time Division duplex-Long Term Evolution), and the like.
WiFi is a short-range wireless transmission technology. Through the WiFi module 102, the mobile terminal can help the user receive and send e-mails, browse web pages, access streaming media, and the like, providing the user with wireless broadband Internet access. Although fig. 1 shows the WiFi module 102, it is understood that it is not an essential component of the mobile terminal and may be omitted as needed without changing the essence of the application.
The audio output unit 103 may convert audio data received by the radio frequency unit 101 or the WiFi module 102 or stored in the memory 109 into an audio signal and output as sound when the mobile terminal 100 is in a call signal reception mode, a call mode, a recording mode, a voice recognition mode, a broadcast reception mode, or the like. Also, the audio output unit 103 may also provide audio output related to a specific function performed by the mobile terminal 100 (e.g., a call signal reception sound, a message reception sound, etc.). The audio output unit 103 may include a speaker, a buzzer, and the like.
The A/V input unit 104 is used to receive audio or video signals. The A/V input unit 104 may include a Graphics Processing Unit (GPU) 1041 and a microphone 1042. The graphics processor 1041 processes image data of still pictures or video obtained by an image capturing device (e.g., a camera) in a video capturing mode or an image capturing mode. The processed image frames may be displayed on the display unit 106. The image frames processed by the graphics processor 1041 may be stored in the memory 109 (or other storage medium) or transmitted via the radio frequency unit 101 or the WiFi module 102. The microphone 1042 may receive sound (audio data) in a phone call mode, a recording mode, a voice recognition mode, or the like, and can process such sound into audio data. In a phone call mode, the processed audio (voice) data may be converted into a format that can be transmitted to a mobile communication base station via the radio frequency unit 101. The microphone 1042 may implement various types of noise cancellation (or suppression) algorithms to cancel (or suppress) noise or interference generated in the course of receiving and transmitting audio signals.
The mobile terminal 100 also includes at least one sensor 105, such as a light sensor, motion sensor, and other sensors. Optionally, the light sensor includes an ambient light sensor that may adjust the brightness of the display panel 1061 according to the brightness of ambient light, and a proximity sensor that may turn off the display panel 1061 and/or the backlight when the mobile terminal 100 is moved to the ear. As one of the motion sensors, the accelerometer sensor can detect the magnitude of acceleration in each direction (generally three axes), detect the magnitude and direction of gravity when stationary, and can be used for applications of recognizing gestures of a mobile phone (such as horizontal and vertical screen switching, related games, magnetometer gesture calibration), vibration recognition related functions (such as pedometers and taps), and the like; as for other sensors such as a fingerprint sensor, a pressure sensor, an iris sensor, a molecular sensor, a gyroscope, a barometer, a hygrometer, a thermometer, and an infrared sensor, which can be configured on the mobile phone, further description is omitted here.
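As a minimal illustration of the horizontal/vertical-screen switching mentioned above, the following sketch shows how the accelerometer reading could drive that decision. It assumes an Android-style realization; the class and callback names are the editor's and are not taken from the patent.

```kotlin
import android.content.Context
import android.hardware.Sensor
import android.hardware.SensorEvent
import android.hardware.SensorEventListener
import android.hardware.SensorManager
import kotlin.math.abs

// Sketch only: reports whether gravity currently acts mainly along the x axis
// (device held sideways, horizontal screen) or the y axis (vertical screen).
class GravityOrientationDetector(
    context: Context,
    private val onOrientationChanged: (isLandscape: Boolean) -> Unit
) : SensorEventListener {

    private val sensorManager =
        context.getSystemService(Context.SENSOR_SERVICE) as SensorManager
    private val accelerometer: Sensor? =
        sensorManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER)
    private var isLandscape = false

    fun start() {
        sensorManager.registerListener(this, accelerometer, SensorManager.SENSOR_DELAY_UI)
    }

    fun stop() {
        sensorManager.unregisterListener(this)
    }

    override fun onSensorChanged(event: SensorEvent) {
        val x = event.values[0]
        val y = event.values[1]
        val landscape = abs(x) > abs(y)   // gravity mainly along x => horizontal screen
        if (landscape != isLandscape) {
            isLandscape = landscape
            onOrientationChanged(landscape)
        }
    }

    override fun onAccuracyChanged(sensor: Sensor?, accuracy: Int) = Unit
}
```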
The display unit 106 is used to display information input by a user or information provided to the user. The Display unit 106 may include a Display panel 1061, and the Display panel 1061 may be configured in the form of a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED), or the like.
The user input unit 107 may be used to receive input numeric or character information and generate key signal inputs related to user settings and function control of the mobile terminal. Alternatively, the user input unit 107 may include a touch panel 1071 and other input devices 1072. The touch panel 1071, also referred to as a touch screen, may collect a touch operation performed by a user on or near the touch panel 1071 (e.g., an operation performed by the user on or near the touch panel 1071 using a finger, a stylus, or any other suitable object or accessory), and drive a corresponding connection device according to a predetermined program. The touch panel 1071 may include two parts of a touch detection device and a touch controller. Optionally, the touch detection device detects a touch orientation of a user, detects a signal caused by a touch operation, and transmits the signal to the touch controller; the touch controller receives touch information from the touch sensing device, converts the touch information into touch point coordinates, sends the touch point coordinates to the processor 110, and can receive and execute commands sent by the processor 110. In addition, the touch panel 1071 may be implemented in various types, such as a resistive type, a capacitive type, an infrared ray, and a surface acoustic wave. In addition to the touch panel 1071, the user input unit 107 may include other input devices 1072. Optionally, other input devices 1072 may include, but are not limited to, one or more of a physical keyboard, function keys (e.g., volume control keys, switch keys, etc.), a trackball, a mouse, a joystick, and the like, and are not limited thereto.
Alternatively, the touch panel 1071 may cover the display panel 1061, and when the touch panel 1071 detects a touch operation thereon or nearby, the touch panel 1071 transmits the touch operation to the processor 110 to determine the type of the touch event, and then the processor 110 provides a corresponding visual output on the display panel 1061 according to the type of the touch event. Although the touch panel 1071 and the display panel 1061 are shown in fig. 1 as two separate components to implement the input and output functions of the mobile terminal, in some embodiments, the touch panel 1071 and the display panel 1061 may be integrated to implement the input and output functions of the mobile terminal, and is not limited herein.
The interface unit 108 serves as an interface through which at least one external device is connected to the mobile terminal 100. For example, the external device may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like. The interface unit 108 may be used to receive input (e.g., data information, power, etc.) from external devices and transmit the received input to one or more elements within the mobile terminal 100 or may be used to transmit data between the mobile terminal 100 and external devices.
The memory 109 may be used to store software programs as well as various data. The memory 109 may mainly include a program storage area and a data storage area, and optionally, the program storage area may store an operating system, an application program (such as a sound playing function, an image playing function, and the like) required by at least one function, and the like; the storage data area may store data (such as audio data, a phonebook, etc.) created according to the use of the cellular phone, and the like. Further, the memory 109 may include high speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other volatile solid state storage device.
The processor 110 is a control center of the mobile terminal, connects various parts of the entire mobile terminal using various interfaces and lines, and performs various functions of the mobile terminal and processes data by operating or executing software programs and/or modules stored in the memory 109 and calling data stored in the memory 109, thereby performing overall monitoring of the mobile terminal. Processor 110 may include one or more processing units; preferably, the processor 110 may integrate an application processor and a modem processor, optionally, the application processor mainly handles operating systems, user interfaces, application programs, etc., and the modem processor mainly handles wireless communications. It will be appreciated that the modem processor described above may not be integrated into the processor 110.
The mobile terminal 100 may further include a power supply 111 (e.g., a battery) for supplying power to various components, and preferably, the power supply 111 may be logically connected to the processor 110 via a power management system, so as to manage charging, discharging, and power consumption management functions via the power management system.
Although not shown in fig. 1, the mobile terminal 100 may further include a bluetooth module or the like, which is not described in detail herein.
In order to facilitate understanding of the embodiments of the present application, a communication network system on which the mobile terminal of the present application is based is described below.
Referring to fig. 2, fig. 2 is an architecture diagram of a communication Network system according to an embodiment of the present disclosure, where the communication Network system is an LTE system of a universal mobile telecommunications technology, and the LTE system includes a UE (User Equipment) 201, an E-UTRAN (Evolved UMTS Terrestrial Radio Access Network) 202, an EPC (Evolved Packet Core) 203, and an IP service 204 of an operator, which are in communication connection in sequence.
Optionally, the UE201 may be the terminal 100 described above, and is not described herein again.
The E-UTRAN202 includes eNodeB2021 and other eNodeBs 2022, among others. Alternatively, the eNodeB2021 may be connected with other enodebs 2022 through a backhaul (e.g., X2 interface), the eNodeB2021 is connected to the EPC203, and the eNodeB2021 may provide the UE201 access to the EPC 203.
The EPC 203 may include an MME (Mobility Management Entity) 2031, an HSS (Home Subscriber Server) 2032, other MMEs 2033, an SGW (Serving Gateway) 2034, a PGW (PDN Gateway) 2035, a PCRF (Policy and Charging Rules Function) 2036, and the like. Optionally, the MME 2031 is a control node that handles signaling between the UE 201 and the EPC 203, providing bearer and connection management. The HSS 2032 provides registers to manage functions such as the home location register (not shown) and holds subscriber-specific information about service characteristics, data rates, and the like. All user data may be sent through the SGW 2034; the PGW 2035 may provide IP address assignment and other functions for the UE 201; and the PCRF 2036 is the policy and charging control decision point for service data flows and IP bearer resources, which selects and provides available policy and charging control decisions for a policy and charging enforcement function (not shown).
The IP services 204 may include the internet, intranets, IMS (IP Multimedia Subsystem), or other IP services, among others.
Although the LTE system is described as an example, it should be understood by those skilled in the art that the present application is not limited to the LTE system, but may also be applied to other wireless communication systems, such as GSM, CDMA2000, WCDMA, TD-SCDMA, and future new network systems (e.g. 5G), and the like.
Based on the above mobile terminal hardware structure and communication network system, various embodiments of the present application are provided.
In some implementations, the smart terminal is installed with a plurality of applications, and the user can open the applications by clicking and/or the like. For example, a user opens a video application by clicking a video application icon on the intelligent terminal, and displays a corresponding application interface on a display screen of the intelligent terminal.
Optionally, after the user opens a target application using the intelligent terminal and switches the intelligent terminal from a vertical state to a horizontal state, the gravity sensor of the intelligent terminal calculates the image display direction of the screen according to its detection result and rotates the screen of the intelligent terminal; that is, the display state of the target application is adjusted from the vertical state to the horizontal state, as shown in fig. 3, i.e., from the vertical-screen display state in the left diagram to the horizontal-screen display state in the right diagram.
To solve the above technical problems, an embodiment of the present application provides an interface processing method: after the user opens a target application using the intelligent terminal and switches the intelligent terminal from the vertical state to the horizontal state, the intelligent terminal displays the target application in a target window of the main interface in response to a first operation, and adjusts a target parameter of the main interface and/or a target parameter of the target window in response to a first target instruction; for example, the main interface is adjusted to horizontal-screen display while the display state of the target application in the target window is kept vertical.
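One possible realization of this idea is sketched below purely as an illustration, not as the patent's prescribed mechanism. It assumes an Android-style implementation; the pixel sizes and names are the editor's assumptions.

```kotlin
import android.app.Activity
import android.content.pm.ActivityInfo
import android.view.View

// Illustrative sketch: the main interface rotates to horizontal-screen display
// while the target window hosting the target application keeps vertical (portrait)
// proportions. The pixel sizes below are placeholder values.
fun showLandscapeMainWithPortraitTargetWindow(activity: Activity, targetWindow: View) {
    activity.requestedOrientation = ActivityInfo.SCREEN_ORIENTATION_LANDSCAPE
    targetWindow.layoutParams = targetWindow.layoutParams.apply {
        width = 540    // narrower than tall, i.e. portrait proportions
        height = 960
    }
    targetWindow.requestLayout()
}
```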
The technical means shown in the present application will be described in detail below with reference to specific examples. It should be noted that the following embodiments may exist alone or in combination with each other, and description of the same or similar contents is not repeated in different embodiments.
Referring to fig. 4, fig. 4 is a schematic flowchart illustrating a first implementation manner of an interface processing method according to an embodiment of the present disclosure. The interface processing method may specifically include:
and S11, responding to the first operation, and outputting the target object in the target window of the main interface.
Optionally, the first operation may be an operation for the main interface and/or the target window. Optionally, the first operation may be at least one of: long press, click, double click, continuous click, sliding operation in a preset direction, dragging, air gesture, fingerprint identification, voice control, behavior control, vibration control, state control and time control.
Optionally, the long press refers to a long press operation on a screen page of the mobile phone by a finger or a touch tool.
Optionally, the double-click refers to performing a double-click operation on a screen page of the mobile phone through a finger or a touch tool.
Optionally, the continuous clicking refers to continuously clicking on the screen page of the intelligent terminal through a finger or a touch tool, for example, continuously clicking 3 times or more.
Optionally, the sliding operation in the preset direction means that the sliding direction is preset by software and the sliding operation is performed on the screen page of the intelligent terminal along the preset sliding direction with a finger or a touch tool. Optionally, the preset direction may be from top to bottom, from bottom to top, up-down first and then left-right, left-right first and then up-down, a clockwise upper semicircle, a counterclockwise lower semicircle, a clockwise arc, and the like. The embodiment of the application does not particularly limit the specific form of the preset direction, which can be adjusted according to specific requirements.
Optionally, the air gesture refers to an operation performed, with a finger or a stylus, according to a preset gesture within a certain distance from the screen page of the intelligent terminal without touching it. Optionally, the air gesture may be at least one of: drawing a circle in the air, drawing an arc in the air, drawing a semicircle in the air, drawing a straight line in the air, drawing a curve in the air, drawing a check mark in the air, drawing a number in the air, drawing a character in the air, and the like.
Optionally, fingerprint identification refers to recognizing the user's fingerprint on a physical key of the intelligent terminal or on the display interface.
Alternatively, voice control refers to a user issuing a voice control instruction.
Optionally, the behavior control means that when the current behavior of the user conforms to a preset behavior, a first operation for the target window may be triggered, and the target information is determined or generated according to a type of the first operation. Optionally, the behavior may be a behavior of the user picking up the intelligent terminal, and/or the behavior may be a behavior of the user placing the intelligent terminal on a desktop, and/or the behavior may be a behavior of the user picking up the intelligent terminal to take a picture, and/or the behavior may be a behavior of the user picking up the intelligent terminal to make a call, and/or the behavior may be a behavior of the user picking up the intelligent terminal to do exercise, and so on.
Optionally, the vibration control means that when the vibration frequency of the smart terminal meets a preset vibration frequency, and/or the vibration duration of the smart terminal meets a preset duration, and/or the vibration melody meets a preset melody, the target object may be triggered to be output in the target window of the main interface.
Optionally, the state control means that when the device state of the intelligent terminal meets a preset state, the target object can be triggered to be output in a target window of the main interface.
Optionally, the time control means triggering to output the target object in a target window of the main interface when the current time reaches a preset time.
Optionally, the first operation is voice control, the target object is a social message, and the social message is output in a target window of the main interface according to a voice control instruction corresponding to the voice control.
Optionally, the first operation is behavior control of the user, the target object is a social message, and the social message is output in a target window of the main interface according to a behavior control instruction corresponding to the behavior control.
Optionally, the first operation is vibration control, the target object is a social message, and the social message is output in a target window of the main interface according to a vibration control instruction corresponding to the vibration control.
Optionally, the first operation is state control, the target object is a social message, and the social message is output in a target window of the main interface according to a state control instruction corresponding to the state control. For example, the preset state is an unlocked state, and when the state of the intelligent terminal is in the unlocked state, namely the first operation is met, the intelligent terminal generates a state control instruction and outputs social information in a target window of the main interface. If the preset state is that the signal intensity is greater than the first threshold value, when the signal intensity of the intelligent terminal is greater than the first threshold value, that is, the first operation is met, the intelligent terminal generates a state control instruction, and outputs social information in a target window of the main interface.
Optionally, the first operation is time control, the target object is a social message, and the social message is output in a target window of the main interface according to a time control instruction corresponding to the time control. For example, the preset time is X month and X day, when the intelligent terminal detects that the preset time is reached, a time control instruction is generated, and social information is output in a target window of the main interface.
Optionally, the first operation is a click operation, the target object is a social message, and the social message is output in a target window of the main interface according to a corresponding click number and a click duration corresponding to the click operation.
Optionally, the first operation is a dragging operation, the target object is a social message, and the social message is output in a target window of the main interface according to a corresponding dragging position and dragging distance corresponding to the dragging operation.
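The touch-based forms of the first operation listed above (long press, click, double click, slide in a preset direction) can be detected with standard gesture handling. The following is an illustrative sketch only, assuming an Android-style realization; the class name and operation labels are not taken from the patent.

```kotlin
import android.content.Context
import android.view.GestureDetector
import android.view.MotionEvent
import kotlin.math.abs

// Sketch only: classifies a few first-operation types and reports them to a callback.
class FirstOperationDetector(
    context: Context,
    private val onFirstOperation: (type: String) -> Unit
) {
    private val detector = GestureDetector(context, object : GestureDetector.SimpleOnGestureListener() {
        override fun onLongPress(e: MotionEvent) {
            onFirstOperation("long_press")
        }
        override fun onSingleTapConfirmed(e: MotionEvent): Boolean {
            onFirstOperation("click"); return true
        }
        override fun onDoubleTap(e: MotionEvent): Boolean {
            onFirstOperation("double_click"); return true
        }
        override fun onFling(e1: MotionEvent?, e2: MotionEvent, vx: Float, vy: Float): Boolean {
            // Compare the slide against a preset direction (here: mainly vertical vs. horizontal).
            onFirstOperation(if (abs(vy) > abs(vx)) "slide_vertical" else "slide_horizontal")
            return true
        }
    })

    // Forward touch events from the main interface and/or the target window here.
    fun onTouchEvent(event: MotionEvent): Boolean = detector.onTouchEvent(event)
}
```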
Alternatively, the target object may be video, audio, image, application interface, and/or plain text, among others.
Optionally, referring to fig. 5, the application interface of the target application is currently displayed on the main interface, the first operation is a sliding operation, the operation track of the sliding operation is identified or determined to be a preset track, the application interface of the target application is displayed in the target window of the main interface based on the preset track, and the content corresponding to the main interface is displayed.
Optionally, referring to fig. 6, the application interface of the target application is currently displayed on the main interface, the first operation is a click operation, the number of clicks of the click operation is identified or determined as a preset number, and based on the preset number, the content corresponding to the main interface is displayed in the target window of the main interface.
Optionally, the step S11 may specifically include:
and outputting the target object in the target window of the main interface according to the operation information of the first operation.
Optionally, the operation information may include at least one of an operation position, an operation trajectory, an operation number, and an operation time.
Alternatively, the operation position may be a position of a finger or a touch tool in the target window.
Optionally, the operation trajectory may be a sliding graph of the user on the target window through a finger or a touch tool, and may also be a gesture graph corresponding to an empty gesture of the user on the target window.
Alternatively, the number of operations may be the number of clicks or taps on the target window by a finger or a touch tool.
Alternatively, the operation time may be a time period during which the user clicks or taps on the target window by a finger or a touch tool.
Optionally, outputting the target object in the target window of the main interface according to the operation information of the first operation may include the following implementation scheme:
scheme 1: and displaying the content of the target object in the main interface in the target window according to the operation position of the first operation.
Scheme 2: and displaying the content of the target object in the main interface in the target window according to the operation track of the first operation.
Scheme 3: and displaying the content of the target object in the main interface in the target window according to the operation times of the first operation.
Scheme 4: and displaying the content of the target object in the main interface in the target window according to the operation time of the first operation.
In the scheme 1, the operation position is identified or determined to be in the preset area of the main interface, and the content of the target object in the main interface is displayed in the target window. Optionally, referring to fig. 7, the preset area is a, the position q of the operation point is identified or determined to be within the preset area a of the main interface, and a video played by the main interface is displayed in the target window s 1.
In the scheme 2, the content of the target object in the main interface is displayed in the target window according to the graph corresponding to the operation track. Optionally, referring to fig. 8, the graph corresponding to the identified or determined operation track is s-shaped, and a video played by the main interface is displayed in the target window s 2.
In scheme 3, the content of the target object in the main interface is displayed in the target window according to the operation times. Optionally, referring to fig. 9, recognizing or determining the operation frequency as a preset operation frequency, and obtaining a control instruction corresponding to the preset operation frequency, for example, clicking 3 times, switching a video played in the main interface to a target window for playing, and displaying the target window at the lower left of the main interface.
In scheme 4, the content of the target object in the main interface is displayed in the target window according to the operation time. Optionally, referring to fig. 10, if the operation time is identified or determined as the preset time duration, and the control instruction corresponding to the preset time duration is obtained, the video played in the main interface is switched to the target window for playing, and the target window is displayed at the upper left of the main interface.
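Schemes 1 through 4 can be summarized in a small dispatcher. The following is an illustrative sketch only; the thresholds, the preset area, and the names are assumptions by the editor rather than values given in the patent.

```kotlin
// Operation information of the first operation, mirroring schemes 1-4 above.
data class OperationInfo(
    val position: Pair<Float, Float>? = null,   // scheme 1: operation position
    val trackShape: String? = null,             // scheme 2: e.g. an "s"-shaped trajectory (fig. 8)
    val tapCount: Int = 0,                      // scheme 3: number of operations (fig. 9)
    val durationMs: Long = 0L                   // scheme 4: operation time (fig. 10)
)

fun shouldOutputInTargetWindow(op: OperationInfo): Boolean {
    val inPresetArea = op.position?.let { (x, y) -> x < 200f && y < 200f } ?: false  // preset area "a", assumed bounds
    val presetTrack = op.trackShape == "s"
    val presetCount = op.tapCount >= 3          // e.g. clicking 3 times
    val presetDuration = op.durationMs >= 1_000L  // assumed preset duration
    return inPresetArea || presetTrack || presetCount || presetDuration
}

fun main() {
    println(shouldOutputInTargetWindow(OperationInfo(tapCount = 3)))   // true: scheme 3 applies
}
```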
Optionally, in some embodiments, a trigger control is provided on the main interface, and the trigger control may be a hover control and/or a voice control and/or a fingerprint control. And responding to the first operation aiming at the trigger control, and displaying the target object in the main interface in the target window.
Optionally, referring to fig. 11, in response to a click operation on the trigger control c, a target text is displayed in a target window of the main interface, where the target text is content in the main interface. Optionally, in response to the click operation on the trigger control c, displaying a target image-text (including a picture and text) in a target window of the main interface, where the target image-text is content in the main interface. Optionally, in response to a click operation on the trigger control c, displaying a target image in a target window of the main interface, where the target image is content in the main interface. Optionally, in response to a click operation on the trigger control c, a target video is displayed in a target window of the main interface, where the target video is content in the main interface.
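A minimal sketch of the trigger-control behaviour follows, assuming an Android-style realization; the views and names are illustrative and the target object is shown here as plain text only.

```kotlin
import android.view.View
import android.widget.Button
import android.widget.TextView

// Sketch only: clicking trigger control "c" shows the main interface's current
// content (here a target text) in the target window.
fun bindTriggerControl(trigger: Button, mainInterfaceContent: TextView, targetWindow: TextView) {
    trigger.setOnClickListener {
        targetWindow.text = mainInterfaceContent.text
        targetWindow.visibility = View.VISIBLE
    }
}
```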
And S12, responding to the first target instruction, and adjusting the target parameters of the main interface and/or the target parameters of the target window.
Alternatively, the first target instruction may be preset by the machine, or may be generated according to an operation of the user, and therefore, in response to the first target instruction, the manner of adjusting the target parameter may be:
the first mode is as follows: the first target instruction is preset by the machine. For example, when the content in the main interface is displayed in the target window, the intelligent terminal indicates, according to preset configuration information: if the size adjusting operation of the user aiming at the target window is not received within 5s, the display state of the target window is adjusted to be the same as the display state of the main interface, namely the initial display state of the target window is vertical screen display, the display state of the main interface is horizontal screen display, and the size adjusting operation of the user aiming at the target window is not received within 5s, the display state of the target window is adjusted to be horizontal screen display.
The second mode is as follows: the first target instruction is generated according to an operation by a user. For example, a processing object corresponding to the first target instruction is identified or determined as a main interface and/or a target window, and a target parameter of the main interface and/or a target parameter of the target window are adjusted, optionally, the target parameter includes at least one of the following: display state, display content, display size, display color, display brightness.
Optionally, referring to fig. 12, when it is determined that the processing object corresponding to the first target instruction is the target window, the display position of the target window may be adjusted, and the display position of the target window is moved from the L1 area to the L2 area.
Alternatively, referring to fig. 13, when it is determined that the processing object corresponding to the first target instruction is the target window, the display shape of the target window may be adjusted, so that the shape of the target window is adjusted from a rectangle to a circle.
Optionally, referring to fig. 14, when it is determined that the processing object corresponding to the first target instruction is the target window, the display state of the target window may be adjusted, and the shape of the target window is adjusted from the horizontal screen display to the vertical screen display, and the display state of the main interface remains unchanged.
Optionally, when it is determined that the processing object corresponding to the first target instruction is the target window, the display mode of the target window may be adjusted, for example, a dynamic effect display mode is set for the target window, and/or a label is set for the target window, and/or background music is set for the target window, and/or the display size of the target window is dynamically displayed, and the like.
Optionally, adjusting the display state of the target window may specifically include:
adjusting the display state of the target window to be a horizontal screen state or a vertical screen state, and locking the display state of the main interface; and/or the presence of a gas in the gas,
and responding to the second target instruction, and releasing the locking state of the main interface.
Alternatively, after the display state of the target window is adjusted, whether the display state of the main interface is locked may be indicated in the form of a state icon. Referring to fig. 15, after the display state of the target window in the main interface of the first diagram is adjusted, a state icon in the locked style, such as a locked-padlock icon, is displayed at a preset position of the main interface in the second diagram; then, in response to the second target instruction, the locked state of the main interface is released, and, optionally, the state icon in the third diagram is displayed in the unlocked style, such as an unlocked-padlock icon.
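A minimal sketch of the lock/unlock behaviour described above follows, assuming an Android-style realization; the handling of the state icon itself is omitted and the function names are illustrative.

```kotlin
import android.app.Activity
import android.content.pm.ActivityInfo

// Sketch only: lock the main interface's display state after the target window is
// adjusted, and release the lock again on a second target instruction.
fun lockMainInterfaceDisplayState(activity: Activity) {
    // Freeze the main interface in its current orientation (show the "locked" icon).
    activity.requestedOrientation = ActivityInfo.SCREEN_ORIENTATION_LOCKED
}

fun releaseMainInterfaceLock(activity: Activity) {
    // Second target instruction: let the sensor drive the orientation again ("unlocked" icon).
    activity.requestedOrientation = ActivityInfo.SCREEN_ORIENTATION_UNSPECIFIED
}
```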
In some embodiments, the interface processing method according to the embodiment of the present application may further include:
identifying or determining that the interface sliding function is opened;
and responding to the second operation, and performing sliding display on the target object of the target window.
Optionally, the user may trigger the interface sliding function to be enabled by clicking a function control, i.e., the interface sliding function is enabled in response to a click operation on the function control. Referring to fig. 16, when the interface sliding function is enabled, the target object is displayed in a sliding manner in response to a sliding operation on the target object of the target window; that is, the target window slides from the a1 main interface to the a2 main interface, and the target object slides with it. Optionally, after the interface sliding function is enabled, in response to a sliding operation on the target object of the target window, the position of the target window in the main interface remains unchanged while the target object is displayed in the target window in a sliding manner; that is, the content of the target object displayed in the target window changes with the sliding operation. Optionally, when the interface sliding function is enabled, in response to a sliding operation on the target object of the target window, the target window slides from one main interface to another, and, optionally, the target object is displayed in the target window in a sliding manner, i.e., the content of the target object changes as the sliding position of the target window changes.
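A minimal sketch of the sliding display follows, assuming an Android-style realization with illustrative names: once the interface sliding function is enabled, a second (drag) operation moves the target window across the main interface.

```kotlin
import android.annotation.SuppressLint
import android.view.MotionEvent
import android.view.View

// Sketch only: while the interface sliding function is enabled, dragging moves the
// target window horizontally, e.g. from main interface a1 toward main interface a2.
@SuppressLint("ClickableViewAccessibility")
fun enableSlidingDisplay(targetWindow: View, slidingFunctionEnabled: () -> Boolean) {
    var lastX = 0f
    targetWindow.setOnTouchListener { view, event ->
        if (!slidingFunctionEnabled()) return@setOnTouchListener false
        when (event.actionMasked) {
            MotionEvent.ACTION_DOWN -> lastX = event.rawX
            MotionEvent.ACTION_MOVE -> {
                view.translationX += event.rawX - lastX
                lastX = event.rawX
            }
        }
        true
    }
}
```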
Optionally, in some embodiments, at least one of a page turning control, an opening/closing control, a switching control, and a shortcut control is provided in the target window, and the interface processing method according to the embodiment of the present application may further include at least one of the following:
responding to a third operation aiming at the page turning control, and performing page turning display on the content of the target object of the target window;
hiding and/or closing the target window in response to a fourth operation directed to the switch control;
responding to a fifth operation aiming at the switching control, and switching the target window into a floating window;
and responding to a sixth operation aiming at the shortcut control, and adjusting the display state of the target window to be a horizontal screen state or a vertical screen state, and/or adjusting the display state of the main interface to be a horizontal screen state or a vertical screen state.
Optionally, the third operation, the fourth operation, the fifth operation and/or the sixth operation may be a long press, a click, a double click, continuous clicks, a sliding operation in a preset direction, a drag and/or an air gesture. The details are determined according to the actual situation and are not described herein.
Optionally, the third operation, the fourth operation, the fifth operation and/or the sixth operation may all be wind control operations. A wind control operation means that the intelligent terminal generates a wind control command by detecting wind (airflow). For example, when the intelligent terminal detects that the ambient wind reaches a preset wind value, it may trigger the third operation for the page turning control to display the content of the target object of the target window page by page; and/or trigger the fourth operation for the switch control to hide and/or close the target window; and/or trigger the fifth operation for the switching control to switch the target window into a floating window; and/or trigger the sixth operation for the shortcut control to adjust the display state of the target window to a horizontal screen state or a vertical screen state, and/or to adjust the display state of the main interface to a horizontal screen state or a vertical screen state.
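A speculative Kotlin sketch of this dispatch is given below. The application states only that a command is generated when the detected wind reaches a preset value; it does not specify the sensing method or which operation is bound to the command, so every name and the binding scheme here are placeholders.

```kotlin
// Speculative sketch: windLevel is assumed to come from some external detector
// supplied by the terminal (the sensing method is not specified in this application).
class WindControlDispatcher(
    private val presetWindLevel: Float,
    private val turnPage: () -> Unit,               // third operation (page turning control)
    private val hideOrCloseWindow: () -> Unit,      // fourth operation (switch control)
    private val switchToFloatingWindow: () -> Unit, // fifth operation (switching control)
    private val applyShortcut: () -> Unit           // sixth operation (shortcut control)
) {
    fun onWindMeasured(windLevel: Float, boundOperation: Int) {
        if (windLevel < presetWindLevel) return     // below the preset value: no command
        when (boundOperation) {                     // which operation is bound is a configuration choice
            3 -> turnPage()
            4 -> hideOrCloseWindow()
            5 -> switchToFloatingWindow()
            6 -> applyShortcut()
        }
    }
}
```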
Optionally, referring to fig. 17, a page turning control K1 is displayed in the target window, and the content of the target object in the target window is displayed page by page in response to a click operation on the page turning control K1. Optionally, a corresponding dynamic effect may be displayed during the page turning display, which increases the interest of the interface processing. For example, a music effect is played during the page turning display. In another example, a flashing effect is shown during the page turning display. For another example, a piece of news is broadcast during the page turning display, and so on.
Optionally, referring to fig. 18, a switch control K2 is displayed in the target window, and the target window is hidden and/or closed in response to a click operation on the switch control K2; in the process of hiding and/or closing the target window, the target window gradually shrinks. Alternatively, the target window may first be hidden on the interface and then closed. Alternatively, the target window may be closed with a peacock-tail-folding effect while it is being hidden on the interface. Optionally, while the target window is being hidden on the interface, a music fountain video may be played until the target window is closed, which reduces the waiting time perceived by the user and improves the user experience.
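The gradually shrinking hide/close effect can be sketched in a few lines of Kotlin using a standard view property animation; removeWindow is a placeholder for whatever actually detaches the target window from the main interface, and the 300 ms duration is an assumption (the application specifies none).

```kotlin
import android.view.View

fun hideAndCloseWithShrink(targetWindow: View, removeWindow: (View) -> Unit) {
    targetWindow.animate()
        .scaleX(0f)
        .scaleY(0f)
        .alpha(0f)
        .setDuration(300L)                        // assumed duration
        .withEndAction {
            targetWindow.visibility = View.GONE   // hide first...
            removeWindow(targetWindow)            // ...then close/remove the window
        }
        .start()
}
```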
Optionally, referring to fig. 19, a switching control K3 is displayed in the target window, and the target window is switched to a floating window in response to a click operation on the switching control K3, where the floating window is a movable window floating on the interface. For example, the user may drag the floating window with a finger or a touch tool to move the floating window in the interface.
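A minimal Kotlin sketch of such a movable floating window is shown below, assuming the floating view has already been added with windowManager.addView(floatingView, params); dragging the view simply updates the layout position. The class name and field names are placeholders.

```kotlin
import android.view.MotionEvent
import android.view.View
import android.view.WindowManager

class FloatingWindowDragger(
    private val windowManager: WindowManager,
    private val floatingView: View,
    private val params: WindowManager.LayoutParams
) : View.OnTouchListener {
    private var downRawX = 0f
    private var downRawY = 0f
    private var startX = 0
    private var startY = 0

    override fun onTouch(v: View, event: MotionEvent): Boolean {
        when (event.actionMasked) {
            MotionEvent.ACTION_DOWN -> {
                downRawX = event.rawX; downRawY = event.rawY
                startX = params.x; startY = params.y
            }
            MotionEvent.ACTION_MOVE -> {
                params.x = startX + (event.rawX - downRawX).toInt()
                params.y = startY + (event.rawY - downRawY).toInt()
                windowManager.updateViewLayout(floatingView, params)  // window follows the finger
            }
        }
        return true
    }
}
```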
Optionally, referring to fig. 20, a shortcut control K4 is displayed in the target window, and in response to a click operation on the shortcut control K4, the display state of the target window is adjusted to a horizontal screen state or a vertical screen state, and/or the display state of the main interface is adjusted to a horizontal screen state or a vertical screen state. For example, clicking the shortcut control K4 once adjusts the display state of the target window to the horizontal screen state; clicking the shortcut control K4 twice in succession adjusts the display state of the target window to the vertical screen state; clicking the shortcut control K4 three times in succession adjusts the display state of the main interface to the horizontal screen state; and clicking the shortcut control K4 four times in succession adjusts the display state of the main interface to the vertical screen state.
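The click-count mapping in this example can be expressed as a simple Kotlin dispatch; the callback names are placeholders, and the three-click/four-click assignments follow the horizontal/vertical reading of the example above rather than anything prescribed elsewhere in the application.

```kotlin
fun onShortcutControlClicks(
    clickCount: Int,
    setWindowLandscape: () -> Unit,
    setWindowPortrait: () -> Unit,
    setMainInterfaceLandscape: () -> Unit,
    setMainInterfacePortrait: () -> Unit
) {
    when (clickCount) {
        1 -> setWindowLandscape()          // one click: target window to horizontal screen
        2 -> setWindowPortrait()           // two clicks: target window to vertical screen
        3 -> setMainInterfaceLandscape()   // three clicks: main interface to horizontal screen
        4 -> setMainInterfacePortrait()    // four clicks: main interface to vertical screen
    }
}
```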
Optionally, in some embodiments, a corresponding dynamic effect may be further output according to a change of a target parameter of the target window, and optionally, the dynamic effect may include at least one of the following: a bubble effect, playing music, a flashing effect, a message pop-up, playing a video, and adding a label to the target window.
As described above, the interface processing method according to the embodiment of the present application may output the target object in the target window of the main interface in response to the first operation, and adjust the target parameter of the main interface and/or the target parameter of the target window in response to the first target instruction; with this technical solution, the display effect of the application can be improved, thereby improving the user experience.
Fig. 21 is a schematic flowchart of a second implementation of the interface processing method according to the embodiment of the present application. The interface processing method may specifically include:
S21: and responding to the target operation, and displaying target information in a target window of the application interface.
Optionally, the target operation may be an operation for the main interface and/or the target window.
Optionally, the target operation may be at least one of: a long press, a click, a double click, continuous clicks, a sliding operation in a preset direction, a drag, an air gesture, fingerprint recognition, voice control, behavior control, vibration control, state control, and time control; for specific embodiments, refer to step S11, which are not described herein again.
Optionally, the target information may be a social message, a picture, a video, a text, and/or an audio, and optionally, the target information may be a message shared by the user. That is, the step S21 may specifically include:
identifying or determining a sharing link corresponding to the target operation, and displaying target information corresponding to the sharing link in a target window; and/or,
and identifying or determining the third party information in the application interface as target information, and displaying the target information in a target window.
Optionally, if the target operation is a sharing operation, a sharing link corresponding to the sharing operation is identified or determined. Referring to fig. 22, the user clicks a sharing button in the left diagram, that is, performs the sharing operation, so that the content to be shared in the left-diagram interface can be shared; the target information corresponding to the sharing link is audio, the target window in the right-diagram interface displays a sharing interface of the audio, and the sharing interface displays the shared content and a sharing control.
Optionally, if the third-party information is the target information, that is, the local terminal serves as the information receiver, referring to fig. 23, song A in the audio application interface is identified or determined as the target information, and song A is displayed in the target window.
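A hypothetical Kotlin sketch of the two branches of step S21 is given below; ShareTarget, resolveShareLink and showInTargetWindow are placeholder names introduced for the example and are not defined in this application.

```kotlin
// Placeholder model for the target information (a social message, picture, video, text or audio).
data class ShareTarget(val description: String)

fun onTargetOperation(
    shareLink: String?,                        // present when the target operation is a sharing operation
    thirdPartyInfo: ShareTarget?,              // present when the local terminal is the information receiver
    resolveShareLink: (String) -> ShareTarget, // e.g. turns an audio link into a sharing-interface model
    showInTargetWindow: (ShareTarget) -> Unit
) {
    // Branch 1: identify the sharing link and display its target information (fig. 22).
    shareLink?.let { showInTargetWindow(resolveShareLink(it)) }
    // Branch 2: identify third-party information such as song A and display it (fig. 23).
    thirdPartyInfo?.let { showInTargetWindow(it) }
}
```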
S22: and responding to the target instruction, and adjusting the target parameters of the target window.
Optionally, the target instruction may be preset by the machine or generated according to an operation of the user; please refer to the foregoing embodiments. Optionally, the target parameter includes at least one of the following: display state, display content, display size, display color, and display brightness; for specific implementations, please refer to the foregoing embodiments.
Optionally, the step S22 may specifically include:
and responding to the target instruction, and adjusting the display state of the target window to the first state and/or adjusting the display state of the application interface to the second state.
Optionally, the first state and the second state may be the same or different; the first state may be a horizontal screen display state or a vertical screen display state, and similarly, the second state may be a horizontal screen display state or a vertical screen display state. For specific embodiments, please refer to the foregoing embodiments.
As described above, the interface processing method according to the embodiment of the application can respond to the target operation, display the target information in the target window of the application interface, respond to the target instruction, and adjust the target parameter of the target window.
Referring to fig. 24, fig. 24 is a schematic structural diagram of a first implementation of the interface processing apparatus provided in the embodiment of the present application, the interface processing apparatus 30 may be integrated in an intelligent terminal, and the interface processing apparatus 30 may include an output module 301 and a first adjusting module 302.
Optionally, the output module 301 may be configured to output the target object in the target window of the main interface in response to the first operation.
Optionally, the first operation may be an operation for the main interface and/or the target window. Optionally, the first operation may be at least one of: long press, click, double click, continuous click, sliding operation in a preset direction, dragging, air gesture, fingerprint identification, voice control, behavior control, vibration control, state control and time control.
Alternatively, the target object may be video, audio, image, application interface, and/or plain text, among others.
Optionally, the output module 301 may be specifically configured to:
displaying the content of the target object in the main interface in the target window according to the operation position of the first operation; and/or,
displaying the content of a target object in the main interface in a target window according to the operation track of the first operation; and/or,
and displaying the content of the target object in the main interface in the target window according to the operation times of the first operation.
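The three optional branches above can be sketched as a single dispatch in Kotlin; FirstOperationInfo, OutputModuleSketch and the string contents are placeholders introduced for illustration and do not appear in this application.

```kotlin
import android.graphics.PointF

// Placeholder model of "operation information of the first operation".
data class FirstOperationInfo(
    val position: PointF,      // operation position on the main interface
    val track: List<PointF>,   // operation track, if the operation is a slide/drag
    val count: Int             // number of times the operation was performed
)

class OutputModuleSketch(private val showInTargetWindow: (String) -> Unit) {
    // Pick the content to output by count, track or position, mirroring the branches above.
    fun output(op: FirstOperationInfo) {
        val content = when {
            op.count > 1      -> "content selected by ${op.count} operations"
            op.track.size > 1 -> "content along a ${op.track.size}-point track"
            else              -> "content at (${op.position.x}, ${op.position.y})"
        }
        showInTargetWindow(content)
    }
}
```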
The first adjusting module 302 may be configured to adjust a target parameter of the main interface and/or a target parameter of the target window in response to the first target instruction.
Alternatively, the first target instruction may be preset by the machine, or may be generated according to an operation of the user, and therefore, in response to the first target instruction, the manner of adjusting the target parameter may include the following:
the first mode is as follows: the first target instruction is preset by the machine. For example, when the content in the main interface is displayed in the target window, the intelligent terminal indicates, according to preset configuration information: if the size adjusting operation of the user aiming at the target window is not received within 5s, the display state of the target window is adjusted to be the same as the display state of the main interface, namely the initial display state of the target window is vertical screen display, the display state of the main interface is horizontal screen display, and the size adjusting operation of the user aiming at the target window is not received within 5s, the display state of the target window is adjusted to be horizontal screen display.
The second mode is as follows: the first target instruction is generated according to the operation of a user. For example, a processing object corresponding to the first target instruction is identified or determined as a main interface and/or a target window, and a target parameter of the main interface and/or a target parameter of the target window are adjusted, optionally, the target parameter includes at least one of the following: display state, display content, display size, display color, display brightness.
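A minimal Kotlin sketch of the machine-preset rule in the first mode above follows, using a main-thread handler as the 5 s timer; matchWindowToMainInterface is a placeholder callback, not an API named in this application.

```kotlin
import android.os.Handler
import android.os.Looper

class PresetInstructionScheduler(private val matchWindowToMainInterface: () -> Unit) {
    private val handler = Handler(Looper.getMainLooper())
    private val applyPreset = Runnable { matchWindowToMainInterface() }

    // Start the 5 s timer when content from the main interface is shown in the target window.
    fun onTargetWindowShown() {
        handler.postDelayed(applyPreset, 5_000L)
    }

    // Cancel the preset instruction if the user does resize the target window in time.
    fun onUserResizedWindow() {
        handler.removeCallbacks(applyPreset)
    }
}
```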
Optionally, the first adjusting module 302 may be specifically configured to:
adjusting the display state of the target window to be a horizontal screen state or a vertical screen state, and locking the display state of the main interface; and/or,
and responding to the second target instruction, and releasing the locking state of the main interface.
Optionally, the interface processing apparatus 30 of the present application may further include a display module 303, where the display module 303 may be specifically configured to:
identifying or determining that the interface sliding function is started;
and responding to the second operation, and performing sliding display on the target object of the target window.
Optionally, the display module 303 may be further configured to:
responding to a third operation aiming at the page turning control, and performing page turning display on the content of the target object of the target window; and/or,
hiding and/or closing the target window in response to a fourth operation aiming at the switch control; and/or,
responding to a fifth operation aiming at the switching control, and switching the target window into a floating window; and/or,
and responding to a sixth operation aiming at the shortcut control, and adjusting the display state of the target window to be a horizontal screen state or a vertical screen state, and/or adjusting the display state of the main interface to be a horizontal screen state or a vertical screen state.
Optionally, the third operation, the fourth operation, the fifth operation, and/or the sixth operation may be a long press, a click, a double click, and/or continuous clicks, which is determined according to the actual situation and is not described herein again.
As described above, the interface processing apparatus according to the embodiment of the present application may output the target object in the target window of the main interface in response to the first operation, and adjust the target parameter of the main interface and/or the target parameter of the target window in response to the first target instruction.
Referring to fig. 25, fig. 25 is a schematic structural diagram of a second implementation of the interface processing apparatus provided in the present application, the interface processing apparatus 40 may be integrated in an intelligent terminal, and the interface processing apparatus 40 may include a display module 401 and a second adjusting module 402.
Optionally, the display module 401 may be configured to respond to a target operation and display target information in a target window of the application interface.
Optionally, the target operation may be an operation for the main interface and/or the target window. Optionally, the target operation may be at least one of: a long press, a click, a double click, continuous clicks, a sliding operation in a preset direction, dragging, an air gesture, fingerprint identification, voice control, behavior control, vibration control, and wind control. For specific embodiments other than wind control, please refer to step S11, and for the specific embodiment of wind control, please refer to step S12, which are not described herein again.
Optionally, the display module 401 may be specifically configured to:
identifying or determining a sharing link corresponding to the target operation, and displaying target information corresponding to the sharing link in a target window; and/or,
and identifying or determining the third party information in the application interface as target information, and displaying the target information in a target window.
The second adjusting module 402 may be configured to adjust a target parameter of the target window in response to the target instruction.
Optionally, the target instruction may be preset by a machine, or may be generated according to an operation of a user; please refer to the foregoing embodiments. Optionally, the target parameter includes at least one of the following: display state, display content, display size, display color, and display brightness; for specific implementations, please refer to the foregoing embodiments.
Optionally, in some embodiments, the second adjusting module 402 may specifically be configured to:
and responding to the target instruction, and adjusting the display state of the target window to the first state and/or adjusting the display state of the application interface to the second state.
Optionally, the first state and the second state may be the same or different; the first state may be a horizontal screen display state or a vertical screen display state, and similarly, the second state may be a horizontal screen display state or a vertical screen display state. For specific embodiments, please refer to the foregoing embodiments.
As described above, the interface processing apparatus according to the embodiment of the present application may respond to a target operation, display target information in a target window of an application interface, respond to a target instruction, and adjust a target parameter of the target window.
The embodiment of the present application further provides an intelligent terminal, where the intelligent terminal includes a memory and a processor, and an interface processing program is stored in the memory, and when the interface processing program is executed by the processor, the steps of the interface processing method in any of the above embodiments are implemented.
The embodiments of the present application further provide a computer-readable storage medium, where an interface processing program is stored on the storage medium, and when the interface processing program is executed by a processor, the interface processing program implements the steps of the interface processing method in any of the embodiments.
In the embodiments of the intelligent terminal and the computer-readable storage medium provided in the present application, all technical features of any one of the embodiments of the interface processing method may be included, and the expanding and explaining contents of the specification are basically the same as those of the embodiments of the method, and are not described herein again.
Embodiments of the present application also provide a computer program product, which includes computer program code, when the computer program code runs on a computer, the computer is caused to execute the method in the above various possible embodiments.
Embodiments of the present application further provide a chip, which includes a memory and a processor, where the memory is used to store a computer program, and the processor is used to call and run the computer program from the memory, so that a device in which the chip is installed executes the method in the above various possible embodiments.
It is to be understood that the foregoing scenarios are only examples, and do not constitute a limitation on application scenarios of the technical solutions provided in the embodiments of the present application, and the technical solutions of the present application may also be applied to other scenarios. For example, as can be known by those skilled in the art, with the evolution of system architecture and the emergence of new service scenarios, the technical solution provided in the embodiments of the present application is also applicable to similar technical problems.
The above-mentioned serial numbers of the embodiments of the present application are merely for description and do not represent the merits of the embodiments.
The steps in the method of the embodiment of the application can be sequentially adjusted, combined and deleted according to actual needs.
The units in the device in the embodiment of the application can be merged, divided and deleted according to actual needs.
In the present application, the same or similar term concepts, technical solutions and/or application scenario descriptions will be generally described only in detail at the first occurrence, and when the description is repeated later, the detailed description will not be repeated in general for brevity, and when understanding the technical solutions and the like of the present application, reference may be made to the related detailed description before the description for the same or similar term concepts, technical solutions and/or application scenario descriptions and the like which are not described in detail later.
In the present application, each embodiment is described with emphasis, and reference may be made to the description of other embodiments for parts that are not described or illustrated in any embodiment.
All possible combinations of the technical features in the embodiments are not described in the present application for the sake of brevity, but should be considered as the scope of the present application as long as there is no contradiction between the combinations of the technical features.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solutions of the present application may be embodied in the form of a software product, which is stored in a storage medium (e.g., ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal device (e.g., a mobile phone, a computer, a server, a controlled terminal, or a network device) to execute the method of each embodiment of the present application.
In the above embodiments, the implementation may be wholly or partially realized by software, hardware, firmware, or any combination thereof. When implemented in software, may be implemented in whole or in part in the form of a computer program product. The computer program product includes one or more computer instructions. The procedures or functions according to the embodiments of the present application are all or partially generated when the computer program instructions are loaded and executed on a computer. The computer may be a general purpose computer, a special purpose computer, a network of computers, or other programmable device. The computer instructions may be stored on a computer readable storage medium or transmitted from one computer readable storage medium to another, for example, the computer instructions may be transmitted from one website, computer, server, or data center to another website, computer, server, or data center by wire (e.g., coaxial cable, fiber optic, digital subscriber line) or wirelessly (e.g., infrared, wireless, microwave, etc.). The computer-readable storage medium can be any available medium that can be accessed by a computer or a data storage device, such as a server, a data center, etc., that incorporates one or more of the available media. The usable medium may be a magnetic medium (e.g., floppy Disk, memory Disk, magnetic tape), an optical medium (e.g., DVD), or a semiconductor medium (e.g., Solid State Disk (SSD)), among others.
The above description is only a preferred embodiment of the present application, and not intended to limit the scope of the present application, and all modifications of equivalent structures and equivalent processes, which are made by the contents of the specification and the drawings of the present application, or which are directly or indirectly applied to other related technical fields, are included in the scope of the present application.

Claims (10)

1. An interface processing method is characterized by comprising the following steps:
S11: responding to the first operation, and outputting a target object in a target window of the main interface;
S12: and responding to the first target instruction, and adjusting the target parameters of the main interface and/or the target parameters of the target window.
2. The method of claim 1, wherein the step of S11 includes:
and outputting the target object in a target window of the main interface according to the operation information of the first operation.
3. The method according to claim 1, wherein a trigger control is provided on the main interface, and the step S11 includes: responding to a first operation aiming at the trigger control, and displaying a target object in the main interface in the target window; and/or,
the step of S12 includes: and identifying or determining a processing object corresponding to the first target instruction as a main interface and/or a target window, and adjusting a target parameter of the main interface and/or a target parameter of the target window.
4. The method of claim 1, wherein the adjusting the target parameter of the target window comprises:
adjusting the display state of the target window to be a horizontal screen state or a vertical screen state, and locking the display state of the main interface; and/or, responding to a second target instruction, and releasing the locking state of the main interface.
5. The method of any of claims 1 to 4, further comprising:
identifying or determining that the interface sliding function is opened;
and responding to a second operation, and performing sliding display on the target object of the target window.
6. The method according to any one of claims 1 to 4, wherein at least one of a page turning control, a switch control, a toggle control and a shortcut control is provided in the target window, and the method further comprises at least one of:
responding to a third operation aiming at the page turning control, and performing page turning display on the content of the target object of the target window;
hiding and/or closing the target window in response to a fourth operation directed to the switch control;
responding to a fifth operation aiming at the switching control, and switching the target window into a floating window;
and responding to a sixth operation aiming at the shortcut control, and adjusting the display state of the target window to be a horizontal screen state or a vertical screen state, and/or adjusting the display state of the main interface to be a horizontal screen state or a vertical screen state.
7. An interface processing method is characterized by comprising the following steps:
S21: responding to the target operation, and displaying target information in a target window of an application interface;
S22: and responding to a target instruction, and adjusting the target parameters of the target window.
8. The method of claim 7, wherein the step of S21 includes: identifying or determining a sharing link corresponding to the target operation, displaying target information corresponding to the sharing link in the target window, and/or identifying or determining third party information in the application interface as target information, and displaying the target information in the target window; and/or,
the step of S22 includes: and responding to the target instruction, and adjusting the display state of the target window to a first state and/or adjusting the display state of the application interface to a second state.
9. An intelligent terminal, characterized in that, intelligent terminal includes: memory, a processor, wherein the memory has stored thereon an interface handler which, when executed by the processor, implements the steps of the interface processing method of any one of claims 1 to 8.
10. A computer-readable storage medium, characterized in that the storage medium has stored thereon a computer program which, when being executed by a processor, carries out the steps of the interface processing method according to any one of claims 1 to 8.
CN202210101264.0A 2022-01-27 2022-01-27 Interface processing method, intelligent terminal and storage medium Pending CN114443199A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210101264.0A CN114443199A (en) 2022-01-27 2022-01-27 Interface processing method, intelligent terminal and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210101264.0A CN114443199A (en) 2022-01-27 2022-01-27 Interface processing method, intelligent terminal and storage medium

Publications (1)

Publication Number Publication Date
CN114443199A true CN114443199A (en) 2022-05-06

Family

ID=81370698

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210101264.0A Pending CN114443199A (en) 2022-01-27 2022-01-27 Interface processing method, intelligent terminal and storage medium

Country Status (1)

Country Link
CN (1) CN114443199A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116095460A (en) * 2022-05-25 2023-05-09 荣耀终端有限公司 Video recording method, device and storage medium
CN116095460B (en) * 2022-05-25 2023-11-21 荣耀终端有限公司 Video recording method, device and storage medium

Similar Documents

Publication Publication Date Title
CN108037893B (en) Display control method and device of flexible screen and computer readable storage medium
CN107807767B (en) Communication service processing method, terminal and computer readable storage medium
CN107885448B (en) Control method for application touch operation, mobile terminal and readable storage medium
CN107809534B (en) Control method, terminal and computer storage medium
CN114398113A (en) Interface display method, intelligent terminal and storage medium
CN108287627B (en) Interface operation method based on flexible screen, terminal and computer readable storage medium
CN110083294B (en) Screen capturing method, terminal and computer readable storage medium
CN114443199A (en) Interface processing method, intelligent terminal and storage medium
CN114860674B (en) File processing method, intelligent terminal and storage medium
CN115914719A (en) Screen projection display method, intelligent terminal and storage medium
CN114138144A (en) Control method, intelligent terminal and storage medium
CN113835586A (en) Icon processing method, intelligent terminal and storage medium
CN113867765A (en) Application management method, intelligent terminal and storage medium
CN113342246A (en) Operation method, mobile terminal and storage medium
CN113867588A (en) Icon processing method, intelligent terminal and storage medium
CN113867586A (en) Icon display method, intelligent terminal and storage medium
CN114722010B (en) Folder processing method, intelligent terminal and storage medium
WO2023092343A1 (en) Icon area management method, intelligent terminal and storage medium
CN109669594B (en) Interaction control method, equipment and computer readable storage medium
CN114546198A (en) Processing method, intelligent terminal and storage medium
CN114995730A (en) Information display method, mobile terminal and storage medium
CN117032544A (en) Processing method, intelligent terminal and storage medium
CN113568532A (en) Display control method, mobile terminal and storage medium
CN116360659A (en) Processing method, intelligent terminal and storage medium
CN117193596A (en) Display method, intelligent terminal and storage medium

Legal Events

Date Code Title Description
PB01 Publication