CN116302285A - Split screen method, intelligent terminal and storage medium

Info

Publication number: CN116302285A
Application number: CN202310303592.3A
Authority: CN (China)
Other languages: Chinese (zh)
Prior art keywords: application, split screen, split, target, applications
Inventor: 郭涛陶
Applicant and current assignee: Shenzhen Transsion Holdings Co Ltd
Legal status: Pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 9/00 Arrangements for program control, e.g. control units
    • G06F 9/06 Arrangements for program control using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F 9/44 Arrangements for executing specific programs
    • G06F 9/451 Execution arrangements for user interfaces
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/14 Digital output to display device; Cooperation and interconnection of the display device with other functional units
    • G06F 3/1407 General aspects irrespective of display type, e.g. determination of decimal point position, display with fixed or driving decimal point, suppression of non-significant zeros

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Software Systems (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The application provides a split screen method, an intelligent terminal and a storage medium. The split screen method can be applied to the intelligent terminal and comprises the following steps: determining or generating at least two target applications to be displayed in split screen; and in response to a first operation on the target applications, displaying the target applications in split screen according to a preset mode. With the method, the user does not need to navigate a long and cumbersome operation path: the first operation is simple, intuitive and easy to learn, and greatly shortens the path by which the user brings interface content on screen. The simple and quick operation steps improve split screen efficiency and the user's experience, solving the problems of complex, cumbersome split screen operation and low split screen efficiency and further improving the user experience.

Description

Split screen method, intelligent terminal and storage medium
Technical Field
The application relates to the technical field of intelligent terminals, in particular to a split screen method, an intelligent terminal and a storage medium.
Background
At present, a user needs to perform a long and complex sequence of operations to display applications in split screen on an intelligent terminal, so the split screen operation is neither convenient nor intelligent enough, and the operation cost of splitting the screen between applications is high.
In the process of designing and implementing the present application, the inventors found at least the following problems: the split screen operation is complex and cumbersome, and the split screen efficiency is low.
The foregoing description is provided for general background information and does not necessarily constitute prior art.
Disclosure of Invention
In view of the above technical problems, the present application provides a split screen method, an intelligent terminal and a storage medium, so that the user can split the screen with a simple first operation instead of a long and complex operation path; the operation is simple and convenient.
The application provides a split screen method, which comprises the following steps:
S10: determining or generating at least two target applications to be displayed in split screen;
S20: in response to a first operation on the target applications, displaying the target applications in split screen according to a preset mode.
Optionally, the step S10 includes:
detecting whether a display style of an application is in a pressed state;
if the display styles of at least two applications to be split screen are in the pressed state at the same time, determining that the applications to be split screen are the target applications;
and/or, the step S20 includes:
detecting the first operation that aggregates and moves each target application, and then displaying the target applications in split screen according to the preset mode.
Optionally, the step S10 includes:
S101: detecting whether a display style of an application is in a pressed state and in a moving state;
S102: if yes, determining that the application to be split screen is a known application among the target applications;
S103: determining the target applications other than the known application as unknown applications.
Optionally, the step S103 includes:
determining the unknown application according to a second operation; or,
determining the application to be split screen that overlaps the known application as the unknown application.
Optionally, the step S20 includes:
in response to a second operation on the target applications, determining all the target applications;
and after all the target applications are determined, displaying the target applications in split screen according to the preset mode.
Optionally, the display style of the application includes an application icon and/or an application component, and the application component includes an application card, an application function shortcut entry, and/or an application window.
Optionally, the method further comprises:
and if a third operation of exiting the split screen is detected, exiting, based on the third operation, at least one target application that is being displayed in split screen.
Optionally, after the step S20, the method further includes:
if a fourth operation of replacing the target application is detected, determining, among split screen history applications or installed applications, a fill-in application to replace the target application;
and if a fifth operation of switching the target application is detected, switching the split screen areas of the target applications in the current preset mode.
The application also provides an intelligent terminal, comprising a memory and a processor, wherein the memory stores a split screen program, and the split screen program, when executed by the processor, implements the steps of the split screen method described above.
The present application also provides a storage medium having stored thereon a computer program which, when executed by a processor, implements the steps of the split-screen method as described in any of the above.
As described above, the split screen method of the present application includes the steps of: S10: determining or generating at least two target applications to be displayed in split screen; S20: in response to a first operation on the target applications, displaying the target applications in split screen according to a preset mode. With the present application, the user does not need to navigate a long and cumbersome operation path: the first operation is simple, intuitive and easy to learn, and greatly shortens the path by which the user brings interface content on screen. The simple and quick operation steps improve split screen efficiency and the user's experience, solving the problems of complex, cumbersome split screen operation and low split screen efficiency and further improving the user experience.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the application and together with the description, serve to explain the principles of the application. In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings that are needed in the description of the embodiments will be briefly described below, and it will be obvious to those skilled in the art that other drawings can be obtained from these drawings without inventive effort.
Fig. 1 is a schematic hardware structure diagram of an intelligent terminal implementing various embodiments of the present application;
fig. 2 is a schematic diagram of a communication network system according to an embodiment of the present application;
fig. 3 is a flow chart of a split screen method according to the first embodiment;
fig. 4 is a schematic view of a split screen operation shown according to a second embodiment;
fig. 5 is a split screen presentation schematic diagram shown according to a second embodiment.
The realization, functional characteristics and advantages of the present application will be further described with reference to the embodiments, referring to the attached drawings. Specific embodiments thereof have been shown by way of example in the drawings and will herein be described in more detail. These drawings and the written description are not intended to limit the scope of the inventive concepts in any way, but to illustrate the concepts of the present application to those skilled in the art by reference to specific embodiments.
Detailed Description
Reference will now be made in detail to exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, the same numbers in different drawings refer to the same or similar elements, unless otherwise indicated. The implementations described in the following exemplary examples are not representative of all implementations consistent with the present application. Rather, they are merely examples of apparatus and methods consistent with some aspects of the present application as detailed in the accompanying claims.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a ..." does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises the element. Furthermore, elements having the same name in different embodiments of the present application may have the same meaning or different meanings; the particular meaning is determined by its interpretation in the particular embodiment or by the further context of that embodiment.
It should be understood that although the terms first, second, third, etc. may be used herein to describe various information, this information should not be limited by these terms. These terms are only used to distinguish one type of information from another. For example, first information may also be referred to as second information and, similarly, second information may also be referred to as first information, without departing from the scope herein. Furthermore, as used herein, the singular forms "a", "an" and "the" are intended to include the plural forms as well, unless the context indicates otherwise. It will be further understood that the terms "comprises," "comprising," "includes," and/or "including" specify the presence of stated features, steps, operations, elements, components, items, categories, and/or groups, but do not preclude the presence or addition of one or more other features, steps, operations, elements, components, items, categories, and/or groups. The terms "or," "and/or," and "including at least one of" as used herein may be construed as inclusive, meaning any one or any combination. For example, "including at least one of A, B, C" means "any one of the following: A; B; C; A and B; A and C; B and C; A and B and C". As a further example, "A, B or C" or "A, B and/or C" means "any one of the following: A; B; C; A and B; A and C; B and C; A and B and C". An exception to this definition occurs only when a combination of elements, functions, steps or operations is in some way inherently mutually exclusive.
It should be understood that, although the steps in the flowcharts in the embodiments of the present application are shown in the order indicated by the arrows, these steps are not necessarily performed in that order. Unless explicitly stated herein, the order of these steps is not strictly limited and they may be performed in other orders. Moreover, at least some of the steps in the figures may include multiple sub-steps or stages that are not necessarily performed at the same time but may be performed at different times; their order of execution is not necessarily sequential, and they may be performed in turn or alternately with other steps or with at least a part of the sub-steps or stages of other steps.
The word "if", as used herein, may be interpreted as "when" or "upon" or "in response to determining" or "in response to detecting", depending on the context. Similarly, the phrase "if it is determined" or "if (a stated condition or event) is detected" may be interpreted as "when it is determined", "in response to determining", "when (the stated condition or event) is detected", or "in response to detecting (the stated condition or event)", depending on the context.
It should be noted that step numbers such as S10 and S20 are used herein to describe the corresponding content more clearly and briefly, and do not constitute a substantive limitation on the order; those skilled in the art may, when implementing the invention, execute S20 first and then S10, and this remains within the scope of protection of the present application.
It should be understood that the specific embodiments described herein are for purposes of illustration only and are not intended to limit the present application.
In the following description, suffixes such as "module", "component", or "unit" used to represent elements are adopted only to facilitate the description of the present application and have no specific meaning in themselves. Thus, "module," "component," and "unit" may be used interchangeably.
The intelligent terminal may be implemented in various forms. For example, the intelligent terminals described in the present application may include mobile terminals such as cell phones, tablet computers, notebook computers, palmtop computers, personal digital assistants (Personal Digital Assistant, PDA), portable media players (Portable Media Player, PMP), navigation devices, wearable devices, smart bracelets and pedometers, as well as fixed terminals such as digital TVs and desktop computers.
In the following description, an intelligent terminal is taken as an example, and those skilled in the art will understand that, apart from elements specifically used for mobile purposes, the configuration according to the embodiments of the present application can also be applied to fixed terminals.
Referring to fig. 1, which is a schematic hardware structure of an intelligent terminal for implementing various embodiments of the present application, the intelligent terminal 100 may include: an RF (Radio Frequency) unit 101, a WiFi module 102, an audio output unit 103, an a/V (audio/video) input unit 104, a sensor 105, a display unit 106, a user input unit 107, an interface unit 108, a memory 109, a processor 110, and a power supply 111. It will be appreciated by those skilled in the art that the configuration of the intelligent terminal shown in fig. 1 is not limiting of the intelligent terminal, and the intelligent terminal may include more or less components than those illustrated, or may combine certain components, or may have a different arrangement of components.
The following describes the components of the intelligent terminal in detail with reference to fig. 1:
The radio frequency unit 101 may be used for receiving and transmitting signals during information reception or a call: specifically, downlink information from the base station is received and passed to the processor 110 for processing, and uplink data is transmitted to the base station. Typically, the radio frequency unit 101 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like. In addition, the radio frequency unit 101 may also communicate with networks and other devices via wireless communication. The wireless communication may use any communication standard or protocol, including, but not limited to, GSM (Global System for Mobile communication), GPRS (General Packet Radio Service), CDMA2000 (Code Division Multiple Access 2000), WCDMA (Wideband Code Division Multiple Access), TD-SCDMA (Time Division-Synchronous Code Division Multiple Access), FDD-LTE (Frequency Division Duplexing-Long Term Evolution), TDD-LTE (Time Division Duplexing-Long Term Evolution), and 5G, among others.
WiFi is a short-range wireless transmission technology; through the WiFi module 102, the intelligent terminal can help the user send and receive e-mails, browse web pages, access streaming media and the like, providing the user with wireless broadband Internet access. Although fig. 1 shows a WiFi module 102, it is understood that it is not an essential part of the intelligent terminal and may be omitted as required without changing the essence of the invention.
The audio output unit 103 may convert audio data received by the radio frequency unit 101 or the WiFi module 102 or stored in the memory 109 into an audio signal and output as sound when the intelligent terminal 100 is in a call signal reception mode, a talk mode, a recording mode, a voice recognition mode, a broadcast reception mode, or the like. Also, the audio output unit 103 may also provide audio output (e.g., a call signal reception sound, a message reception sound, etc.) related to a specific function performed by the smart terminal 100. The audio output unit 103 may include a speaker, a buzzer, and the like.
The A/V input unit 104 is used to receive an audio or video signal. The A/V input unit 104 may include a graphics processor (Graphics Processing Unit, GPU) 1041 and a microphone 1042; the graphics processor 1041 processes image data of still pictures or video obtained by an image capturing device (e.g., a camera) in a video capturing mode or an image capturing mode. The processed image frames may be displayed on the display unit 106, and the image frames processed by the graphics processor 1041 may be stored in the memory 109 (or other storage medium) or transmitted via the radio frequency unit 101 or the WiFi module 102. The microphone 1042 can receive sound (audio data) in a phone call mode, a recording mode, a voice recognition mode, and the like, and can process such sound into audio data. In the case of a telephone call mode, the processed audio (voice) data may be converted into a format that can be transmitted to the mobile communication base station via the radio frequency unit 101 and output. The microphone 1042 may implement various types of noise cancellation (or suppression) algorithms to cancel (or suppress) noise or interference generated while receiving and transmitting the audio signal.
The intelligent terminal 100 also includes at least one sensor 105, such as a light sensor, a motion sensor, and other sensors. Optionally, the light sensor includes an ambient light sensor and a proximity sensor: the ambient light sensor may adjust the brightness of the display panel 1061 according to the brightness of ambient light, and the proximity sensor may turn off the display panel 1061 and/or the backlight when the intelligent terminal 100 is moved to the ear. As one kind of motion sensor, the accelerometer sensor can detect the magnitude of acceleration in all directions (generally three axes), can detect the magnitude and direction of gravity when stationary, and can be used for applications that recognize the posture of the phone (such as horizontal/vertical screen switching, related games, and magnetometer posture calibration) and for vibration-recognition-related functions (such as pedometer and tapping). Other sensors such as fingerprint sensors, pressure sensors, iris sensors, molecular sensors, gyroscopes, barometers, hygrometers, thermometers and infrared sensors may also be configured in the phone and are not described in detail here.
The display unit 106 is used to display information input by a user or information provided to the user. The display unit 106 may include a display panel 1061, and the display panel 1061 may be configured in the form of a liquid crystal display (Liquid Crystal Display, LCD), an Organic Light-Emitting Diode (OLED), or the like.
The user input unit 107 may be used to receive input numeric or character information and to generate key signal inputs related to user settings and function control of the intelligent terminal. Optionally, the user input unit 107 may include a touch panel 1071 and other input devices 1072. The touch panel 1071, also referred to as a touch screen, may collect touch operations by a user on or near it (e.g., operations of the user on or near the touch panel 1071 using any suitable object or accessory such as a finger or a stylus) and drive the corresponding connection device according to a predetermined program. The touch panel 1071 may include two parts: a touch detection device and a touch controller. Optionally, the touch detection device detects the touch position of the user, detects the signal brought by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into touch point coordinates, and sends the touch point coordinates to the processor 110, and can also receive and execute commands sent from the processor 110. Further, the touch panel 1071 may be implemented in various types such as resistive, capacitive, infrared, and surface acoustic wave. The user input unit 107 may include other input devices 1072 in addition to the touch panel 1071. Optionally, other input devices 1072 may include, but are not limited to, one or more of a physical keyboard, function keys (e.g., volume control keys, switch keys, etc.), a trackball, a mouse, a joystick, and the like, which are not limited here.
Optionally, the touch panel 1071 may overlay the display panel 1061; when the touch panel 1071 detects a touch operation on or near it, it transmits the operation to the processor 110 to determine the type of touch event, and the processor 110 then provides a corresponding visual output on the display panel 1061 according to the type of touch event. Although in fig. 1 the touch panel 1071 and the display panel 1061 are two independent components implementing the input and output functions of the intelligent terminal, in some embodiments the touch panel 1071 may be integrated with the display panel 1061 to implement the input and output functions of the intelligent terminal, which is not limited here.
The interface unit 108 serves as an interface through which at least one external device can be connected with the intelligent terminal 100. For example, the external devices may include a wired or wireless headset port, an external power (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like. The interface unit 108 may be used to receive input (e.g., data information, power, etc.) from an external device and transmit the received input to one or more elements within the smart terminal 100 or may be used to transmit data between the smart terminal 100 and an external device.
The memory 109 may be used to store software programs as well as various data. The memory 109 may mainly include a program storage area and a data storage area; optionally, the program storage area may store an operating system, an application program required for at least one function (such as a sound playing function or an image playing function), and the like, and the data storage area may store data (such as audio data and a phonebook) created according to the use of the phone, and the like. In addition, the memory 109 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other non-volatile solid-state storage device.
The processor 110 is a control center of the intelligent terminal, connects various parts of the entire intelligent terminal using various interfaces and lines, and performs various functions of the intelligent terminal and processes data by running or executing software programs and/or modules stored in the memory 109 and calling data stored in the memory 109, thereby performing overall monitoring of the intelligent terminal. Processor 110 may include one or more processing units; preferably, the processor 110 may integrate an application processor and a modem processor, the application processor optionally handling mainly an operating system, a user interface, an application program, etc., the modem processor handling mainly wireless communication. It will be appreciated that the modem processor described above may not be integrated into the processor 110.
The intelligent terminal 100 may further include a power source 111 (such as a battery) for supplying power to the respective components, and preferably, the power source 111 may be logically connected to the processor 110 through a power management system, so as to implement functions of managing charging, discharging, power consumption management, etc. through the power management system.
Although not shown in fig. 1, the intelligent terminal 100 may further include a bluetooth module or the like, which is not described herein.
In order to facilitate understanding of the embodiments of the present application, a communication network system on which the intelligent terminal of the present application is based will be described below.
Referring to fig. 2, fig. 2 is a schematic diagram of a communication network system provided in an embodiment of the present application. The communication network system is an LTE system of the universal mobile communication technology, and the LTE system includes a UE (User Equipment) 201, an E-UTRAN (Evolved UMTS Terrestrial Radio Access Network) 202, an EPC (Evolved Packet Core) 203, and an operator's IP service 204, which are sequentially connected in communication.
Alternatively, the UE201 may be the terminal 100 described above, which is not described here again.
The E-UTRAN 202 includes an eNodeB 2021 and other eNodeBs 2022, etc. Optionally, the eNodeB 2021 may connect with the other eNodeBs 2022 over a backhaul (e.g., an X2 interface), the eNodeB 2021 is connected to the EPC 203, and the eNodeB 2021 may provide the UE 201 with access to the EPC 203.
The EPC 203 may include an MME (Mobility Management Entity) 2031, an HSS (Home Subscriber Server) 2032, other MMEs 2033, an SGW (Serving Gateway) 2034, a PGW (PDN Gateway) 2035, a PCRF (Policy and Charging Rules Function) 2036, and so on. Optionally, the MME 2031 is a control node that handles signaling between the UE 201 and the EPC 203, providing bearer and connection management. The HSS 2032 is used to provide registers, such as a home location register (not shown), and to hold user-specific information about service characteristics, data rates and the like. All user data may be sent through the SGW 2034; the PGW 2035 may provide IP address allocation and other functions for the UE 201; and the PCRF 2036 is the policy and charging control policy decision point for service data flows and IP bearer resources, which selects and provides available policy and charging control decisions for a policy and charging enforcement function (not shown).
IP services 204 may include the internet, intranets, IMS (IP Multimedia Subsystem ), or other IP services, etc.
Although the LTE system is described above as an example, it should be understood by those skilled in the art that the present application is not limited to LTE systems, but may be applied to other wireless communication systems, such as GSM, CDMA2000, WCDMA, TD-SCDMA, 5G, and future new network systems (e.g., 6G), etc.
Based on the intelligent terminal hardware structure and the communication network system, various embodiments of the application are provided.
First embodiment
In this embodiment, referring to fig. 3, the steps include:
S10: determining or generating at least two target applications to be displayed in split screen;
S20: in response to a first operation on the target applications, displaying the target applications in split screen according to a preset mode.
An application to be split screen in this application is an application that has been developed in advance and set to support split screen; unless otherwise specified, the applications below are applications that support split screen. The applications are displayed on the pages of the desktop of the intelligent terminal, and each application is in a state in which it can be selected for split screen. First, the target applications for split screen are determined. Optionally, there are at least two target applications that can be split screen, but there may also be three or more, and the maximum number of target applications that can be split screen is determined according to the actual development and the size of the screen content; for example, a dual screen or a tablet can split the screen among three or even four applications through a three-finger or even four-finger multi-touch operation. Optionally, the at least two target applications to be split screen may be different applications, or may be a certain application together with a clone (dual instance) of that application. Optionally, the first operation is a split screen gesture operation performed by the user on the applications. The preset mode may be up-down split screen display or left-right split screen display. Up-down or left-right split screen is triggered by recognizing the opening angle or placement direction of the screen of the intelligent terminal (with the length being the edge parallel to the horizon after the screen is fully unfolded and the width the edge perpendicular to the horizon: if the length is greater than or equal to the width, up-down split screen display is used; if the length is smaller than the width, left-right split screen display is used), or by the folding mode (up-down folding gives up-down split screen display, and left-right folding gives left-right split screen display). In addition, the preset mode may include a split ratio, for example 5:5 or 4:6, etc. The preset mode is not limited in this embodiment.
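By way of illustration only, the following Kotlin sketch shows one possible way to select such a preset mode from the screen dimensions, a fold direction and a split ratio; the names (SplitMode, PresetMode, choosePresetMode) and the decision rule as coded are assumptions made for this sketch, not part of any platform API or of the claimed method.
```kotlin
// Illustrative sketch only: the rule follows the description above
// (length >= width -> up-down split, otherwise left-right; a known fold
// direction takes precedence; the ratio defaults to 5:5).
enum class SplitMode { UP_DOWN, LEFT_RIGHT }
enum class FoldDirection { UP_DOWN, LEFT_RIGHT, NONE }

data class PresetMode(val mode: SplitMode, val ratio: Pair<Int, Int> = 5 to 5)

fun choosePresetMode(
    screenLengthPx: Int,                    // edge parallel to the horizon when unfolded
    screenWidthPx: Int,                     // edge perpendicular to the horizon
    fold: FoldDirection = FoldDirection.NONE,
    ratio: Pair<Int, Int> = 5 to 5
): PresetMode {
    val mode = when (fold) {
        FoldDirection.UP_DOWN -> SplitMode.UP_DOWN
        FoldDirection.LEFT_RIGHT -> SplitMode.LEFT_RIGHT
        FoldDirection.NONE ->
            if (screenLengthPx >= screenWidthPx) SplitMode.UP_DOWN else SplitMode.LEFT_RIGHT
    }
    return PresetMode(mode, ratio)
}

fun main() {
    // Length >= width, so this sketch picks an up-down split at a 4:6 ratio.
    println(choosePresetMode(screenLengthPx = 2200, screenWidthPx = 1800, ratio = 4 to 6))
}
```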
After the target applications to be displayed in split screen are determined, the at least two target applications are split screen according to the split screen operation gesture performed on them, and the at least two target applications are simultaneously displayed in split screen on the current interface. After detecting the first operation on the target applications, the intelligent terminal enters the split screen mode and runs the at least two target applications at the same time.
At present, a user needs to perform a long and complex sequence of operations to display applications in split screen on an intelligent terminal; the split screen operation is complex and cumbersome, and the split screen efficiency is low. In this embodiment, at least two target applications to be displayed in split screen are determined, and after the first operation on the target applications is detected, the target applications are displayed in split screen according to a preset mode. With this technical solution, the user does not need to navigate a long and cumbersome operation path: the first operation is simple, intuitive, easy to learn and convenient to use, and greatly shortens the path by which the user brings interface content on screen. The simple and quick operation steps improve split screen efficiency and the user's experience, solving the problems of complex, cumbersome split screen operation and low split screen efficiency and further improving the user experience.
Second embodiment
In this embodiment, the step S10 includes:
detecting whether a display style of an application is in a pressed state;
if the display styles of at least two applications to be split screen are in the pressed state at the same time, determining that the applications to be split screen are the target applications;
in this embodiment, one of the methods of determining the target application is to detect whether the display style of the application is simultaneously in a pressed state, that is, in a selected state determined by the user through a touch operation. And if the display patterns of at least two applications to be split are in the pressed state at the same time, determining that the selected application in the pressed state is the target application to be split. For example, referring to fig. 4, if a user selects two applications such as notes and files in the current page by two fingers, the two selected applications may be directly determined as target applications to be split. Or, one or two applications can be selected by double-finger pressing on the current page, and then the current page is slid to other pages (desktops), so that other target applications to be split can be selected in the other pages.
And/or, the step S20 includes:
Detecting the first operation that aggregates and moves each target application, and then displaying the target applications in split screen according to the preset mode.
After the target applications to be split screen are selected, they can be displayed in split screen through the first operation. Optionally, when the target applications are selected, the display style of each target application is in a pressed, selected state; the first operation then aggregates and moves the display styles, indicating that the target applications are to be split screen, so as to trigger the split screen mode of the intelligent terminal, and the selected target applications are simultaneously displayed in different page areas or on different pages in the split screen mode. For example, as shown in fig. 5, the two applications Notes and File Management are directly displayed in left and right split screens on the current page. Optionally, the action of aggregating and moving the display styles of the target applications is equivalent to the trigger action that makes the intelligent terminal start the split screen mode. The aggregation movement may be pinching the display styles of the target applications until they overlap, pinching them toward each other over a certain distance, or another multi-finger gesture convenient for the user, for example a movement opposite to aggregation. The operation gesture that triggers the intelligent terminal to start the split screen mode is not limited in this embodiment.
Optionally, the user presses two arbitrary fingers on two different split-screen-capable applications and pinches the icons of the two applications together, and the two applications then open simultaneously for split screen display.
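A minimal sketch of this press-and-pinch selection is given below, assuming simple pointer records and hypothetical types (AppIcon, Pointer, PINCH_THRESHOLD_PX) rather than any real touch framework; it only illustrates the logic described above and is not a definitive implementation.
```kotlin
import kotlin.math.hypot

// Illustrative sketch, not a platform API: an icon is "pressed" when an active pointer
// lies on it, and split screen is triggered when the pointers are pinched together so
// that the distance between them falls below a threshold.
data class AppIcon(val packageName: String, val x: Float, val y: Float, val halfSize: Float = 60f) {
    fun contains(px: Float, py: Float) =
        px in (x - halfSize)..(x + halfSize) && py in (y - halfSize)..(y + halfSize)
}

data class Pointer(val id: Int, val x: Float, val y: Float)

const val PINCH_THRESHOLD_PX = 80f  // assumed threshold for "icons pinched together"

// S10: the target applications are the icons currently under simultaneously pressed pointers.
fun targetApplications(icons: List<AppIcon>, pressed: List<Pointer>): List<AppIcon> =
    icons.filter { icon -> pressed.any { icon.contains(it.x, it.y) } }

// S20 trigger: the first operation aggregates the two pointers (and the icons they hold).
fun isAggregated(pressed: List<Pointer>): Boolean {
    if (pressed.size < 2) return false
    val (a, b) = pressed
    return hypot(a.x - b.x, a.y - b.y) < PINCH_THRESHOLD_PX
}

fun main() {
    val icons = listOf(AppIcon("com.example.notes", 200f, 300f), AppIcon("com.example.files", 800f, 300f))
    val start = listOf(Pointer(0, 200f, 300f), Pointer(1, 800f, 300f))
    val targets = targetApplications(icons, start)
    val pinched = listOf(Pointer(0, 480f, 300f), Pointer(1, 520f, 300f))
    if (targets.size >= 2 && isAggregated(pinched)) {
        println("Enter split screen with: " + targets.joinToString { it.packageName })
    }
}
```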
Optionally, the display style of the application includes an application icon and/or an application component, and the application component includes an application card, an application function shortcut entry, and/or an application window.
In this embodiment, the display style of an application includes an application icon and/or an application component; optionally, the application component includes an application card, an application function shortcut entry, and/or an application window. When determining the target applications, the confirmation can be based on application icons alone, on application components alone, or on a combination of application components and application icons, such as an application card, an application function shortcut entry or an application window. The display style type of the application and its implementation are not limited in this embodiment.
Because split screen scenarios are few and users use split screen infrequently, split screen function keys or split screen function entries are not placed in prominent positions during interface design and development. It is therefore difficult for users to learn the complex split screen operation, the learning cost of split screen is high, and the operation cost is ultimately high when the user performs an application split screen operation.
In this embodiment, after the target applications are selected, the split screen operation is completed by aggregating and moving the display styles of the target applications, an operation that matches the user's intuition, so that the user remembers the split screen operation of the intelligent terminal after learning it once. In addition, the split screen is easy to operate and use, which further reduces the complexity of the split screen operation: the user does not need to perform a long and complex operation path, and splits the screen with a simple first operation that is simple and convenient.
Third embodiment
In this embodiment, the step S10 includes:
S101: detecting whether a display style of an application is in a pressed state and in a moving state;
S102: if yes, determining that the application to be split screen is a known application among the target applications;
S103: determining the target applications other than the known application as unknown applications.
Unlike the second embodiment, in which it is detected whether the display styles of applications are simultaneously in a pressed state and the applications simultaneously in the pressed state are determined as the target applications to be split screen, this embodiment provides another way of determining the target applications: determine whether the display style of an application is in a pressed state and in a moving (dragged) state, take the selected and dragged application as a known application among the target applications, and then determine the other, unknown applications among the target applications according to a second operation on the known application. Optionally, the second operation is a drag gesture operation performed by the user on the application.
Optionally, the page on which the drag ends may be the page where the known application is located, or another page reached after sliding from the current page, in which case the remaining unknown applications among the target applications are determined on those other pages. Optionally, there may be at least one selected and dragged known application; the number of known applications is not limited in this embodiment and is likewise determined by the actual development, the size of the screen content, and the like.
Optionally, the step S103 includes:
determining the unknown application according to a second operation; or,
determining the application to be split screen that overlaps the known application as the unknown application.
Once the known applications are determined, the unknown applications remaining among the target applications still need to be determined by the second operation on the known applications. Optionally, when multiple known applications are dragged by multiple touch points in a multi-finger touch, the touch points share the same motion track, including the same direction, distance, and so on.
This embodiment also provides a way of determining the unknown applications to be split screen that matches the user's intuition. The operation may be to drag one or more known applications in any direction, taking the operation direction at the moment the touch response of the drag operation disappears as the final gesture direction, so that the nearest application, a preset number of applications, or all applications to be split screen in that gesture direction are determined as unknown applications; visually, this is similar to dragging the known applications across the current page and "throwing" them in that direction. In particular, if the known applications are moved together toward the same center, the split screen of the second embodiment is realized, and there is then no need (and no way) to determine remaining unknown applications.
The operation may also be to drag at least one known application so that it overlaps an application to be split screen on the current page, or across the current page and other pages, and to take the one or more applications to be split screen that overlap the known applications as unknown applications; optionally, at most the same number of applications as there are known applications may be overlapped.
In the simplest implementation, a user can press any finger on a widget to trigger the widget's drag mode, and drag the widget to a specific area so that it overlaps any application icon capable of split screen, so that the application corresponding to the widget and the overlapped application enter split screen display.
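The overlap rule can be illustrated with the following sketch, which assumes hypothetical icon bounds (IconBounds) and a plain rectangle intersection test in place of whatever hit test a real launcher would use; it is a sketch of the described logic, not an implementation of the claimed method.
```kotlin
// Illustrative sketch of the drag-to-overlap rule; all names are assumptions.
data class IconBounds(val packageName: String, val left: Float, val top: Float,
                      val right: Float, val bottom: Float) {
    fun intersects(other: IconBounds) =
        left < other.right && other.left < right && top < other.bottom && other.top < bottom
}

// S103 (second variant): applications to be split screen that overlap the dragged known
// applications are determined as unknown applications, capped at the number of known apps.
fun findUnknownApplications(
    dragged: List<IconBounds>,              // known applications at their dropped positions
    candidates: List<IconBounds>            // split-screen-capable applications on the page
): List<IconBounds> =
    candidates
        .filter { c -> dragged.none { it.packageName == c.packageName } }
        .filter { c -> dragged.any { it.intersects(c) } }
        .take(dragged.size)

fun main() {
    val knownAtDrop = listOf(IconBounds("com.example.notes", 700f, 280f, 820f, 400f))
    val page = listOf(
        IconBounds("com.example.files", 750f, 300f, 870f, 420f),   // overlaps the drop position
        IconBounds("com.example.music", 100f, 100f, 220f, 220f)    // does not overlap
    )
    println(findUnknownApplications(knownAtDrop, page).map { it.packageName })
}
```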
Optionally, the step S20 includes:
in response to a second operation on the target applications, determining all the target applications;
and after all the target applications are determined, displaying the target applications in split screen according to the preset mode.
The second operation described here is, in essence, a first operation. The difference is that in the second embodiment all the target applications have already been selected and confirmed when the split screen operation is completed by aggregating and moving their display styles, whereas in this embodiment only one or several known applications among the target applications have been determined before the split screen mode of the intelligent terminal is triggered and the applications are split screen; the remaining one or several unknown applications among the target applications therefore still need to be determined through the first operation, that is, the second operation.
Thus, when the second operation on the target applications is detected, which corresponds to only the known applications having been determined, it is necessary to wait for the remaining unknown applications to be determined. After all the target applications are determined, the split screen mode of the intelligent terminal is triggered at the moment the second operation ends, so that the split screen mode of the intelligent terminal is started and all the target applications run at the same time.
Similarly, because split screen scenarios are few and users use split screen infrequently, split screen function keys or split screen function entries are not placed in prominent positions during interface design and development. It is therefore difficult for users to learn the complex split screen operation, the learning cost of split screen is high, and the operation cost is ultimately high when the user performs an application split screen operation.
In this embodiment, after the known applications are selected, the unknown applications are determined by dragging and moving the display styles of the target applications, an operation that matches the user's intuition, so that the user remembers the split screen operation of the intelligent terminal after learning it once. In addition, the split screen is easy to operate and use, which further reduces the complexity of the split screen operation: the user does not need to perform a long and complex operation path, and splits the screen with a simple first operation that is simple and convenient.
Fourth embodiment
In this embodiment, the method further includes:
if a third operation of exiting the split screen is detected, exiting, based on the third operation, at least one target application that is being displayed in split screen, wherein the third operation optionally comprises an operation opposite to the first operation or to the second operation.
After the target applications are displayed in split screen, a way for the target applications to exit the split screen is also needed. In this embodiment, the exit from split screen is driven by whether a third operation of exiting the split screen is detected. If a third operation is detected, for example an operation opposite to the first or second operation, an operation adapted to the current split screen display, or a newly designed exit operation, the target application is exited in response to the third operation. Optionally, the exited target applications may be one, several or all of the target applications being displayed in split screen; which specific target applications are exited is determined by the third operation and is not limited in this embodiment.
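Since the embodiment deliberately leaves the concrete third operation open, the following sketch merely assumes one plausible form, a reverse of the aggregation gesture, together with a hypothetical SplitScreenSession type; both are illustrative assumptions, not part of the claimed method.
```kotlin
// Illustrative sketch: pointers moving apart past a threshold are treated as the third
// operation, and the targets it points at are exited from the split screen.
class SplitScreenSession(apps: List<String>) {
    private val active = apps.toMutableList()
    fun activeApps(): List<String> = active.toList()

    // Exit one, several, or all target applications, as decided by the third operation.
    fun exit(targets: List<String>) {
        active.removeAll(targets)
        if (active.size < 2) active.clear()   // fewer than two panes: leave split screen entirely
    }
}

const val SPREAD_THRESHOLD_PX = 400f          // assumed threshold for "reverse of aggregation"

fun isReverseOfAggregation(startDistancePx: Float, endDistancePx: Float): Boolean =
    endDistancePx - startDistancePx > SPREAD_THRESHOLD_PX

fun main() {
    val session = SplitScreenSession(listOf("com.example.notes", "com.example.files"))
    if (isReverseOfAggregation(startDistancePx = 80f, endDistancePx = 600f)) {
        session.exit(listOf("com.example.notes"))
    }
    // Exiting one of two apps leaves a single pane, which this sketch treats as
    // leaving split screen entirely, so the printed list is empty.
    println(session.activeApps())
}
```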
Fifth embodiment
In this embodiment, after the step S20, the method further includes:
if a fourth operation of replacing the target application is detected, determining, among split screen history applications or installed applications, a fill-in application to replace the target application;
and if a fifth operation of switching the target application is detected, switching the split screen areas of the target applications in the current preset mode.
After the target applications are displayed in split screen, a way to replace a target application currently being displayed in split screen also needs to be provided. Replacement means replacing one or some target applications currently displayed in split screen with other applications that are not being displayed in split screen, or with other target applications being displayed in split screen; in other words, the replaced target applications are closed. In this embodiment, if the fourth operation of replacing the target application is detected, a fill-in application is used in place of the target application; the specific implementation of the fourth operation is not limited in this embodiment. Optionally, the fill-in application may be another target application in the split screen display, a split screen history application that was once displayed in split screen but is not currently, an installed application that is not in the split screen display, or the like. The type of the fill-in application and the way it is determined are not limited in this embodiment.
After the target applications are displayed in split screen, a way to switch the target applications currently being displayed in split screen also needs to be provided. Switching means exchanging the split screen display area of one or some target applications currently displayed in split screen with the split screen display areas of other target applications being displayed in split screen; in other words, the split screen display areas of the target applications are adjusted. In this embodiment, if the fifth operation of switching the target application is detected, the split screen areas of the target applications are switched within the current preset mode. The specific implementation of the fifth operation is not limited in this embodiment, and likewise the adjustment of the split screen display areas by the fifth operation is not limited.
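The fourth and fifth operations can be illustrated together with the sketch below; SplitLayout, replaceWith and switchRegions are assumed names, and the fill-in selection order (split screen history first, then installed applications) is only one possible reading of the description above, not the definitive behaviour.
```kotlin
// Illustrative sketch: replaceWith() swaps a target application for a fill-in application,
// and switchRegions() exchanges the split screen areas of the two targets.
class SplitLayout(first: String, second: String) {
    private val regions = mutableListOf(first, second)   // index = split screen area
    fun show(): String = "area0=${regions[0]}, area1=${regions[1]}"

    // Fourth operation: replace the target application in the given area with a fill-in app.
    fun replaceWith(area: Int, history: List<String>, installed: List<String>) {
        val fillIn = history.firstOrNull { it !in regions }
            ?: installed.firstOrNull { it !in regions }
            ?: return                                      // nothing suitable: keep current app
        regions[area] = fillIn
    }

    // Fifth operation: switch the split screen areas of the targets within the current preset mode.
    fun switchRegions() {
        regions.reverse()
    }
}

fun main() {
    val layout = SplitLayout("com.example.notes", "com.example.files")
    layout.replaceWith(area = 1, history = listOf("com.example.browser"), installed = listOf("com.example.mail"))
    layout.switchRegions()
    println(layout.show())   // area0=com.example.browser, area1=com.example.notes
}
```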
The application also provides an intelligent terminal, comprising a memory and a processor, wherein the memory stores a split screen program, and the split screen program, when executed by the processor, implements the steps of the split screen method described above.
The present application also provides a storage medium having stored thereon a computer program which, when executed by a processor, implements the steps of the split-screen method as described in any of the above.
The embodiments of the intelligent terminal and the computer storage medium provided in the present application may include all technical features of any one of the embodiments of the split screen method; their expanded descriptions and explanations are substantially the same as those of the method embodiments and are not repeated here.
The present embodiments also provide a computer program product comprising computer program code which, when run on a computer, causes the computer to perform the method in the various possible implementations as above.
The embodiments also provide a chip including a memory for storing a computer program and a processor for calling and running the computer program from the memory, so that a device on which the chip is mounted performs the method in the above possible embodiments.
It can be understood that the above scenario is merely an example, and does not constitute a limitation on the application scenario of the technical solution provided in the embodiments of the present application, and the technical solution of the present application may also be applied to other scenarios. For example, as one of ordinary skill in the art can know, with the evolution of the system architecture and the appearance of new service scenarios, the technical solutions provided in the embodiments of the present application are equally applicable to similar technical problems.
The foregoing embodiment numbers of the present application are merely for describing, and do not represent advantages or disadvantages of the embodiments.
The steps in the method of the embodiment of the application can be sequentially adjusted, combined and deleted according to actual needs.
The units in the device of the embodiment of the application can be combined, divided and pruned according to actual needs.
In this application, the same or similar term concepts, technical solutions, and/or application scenario descriptions are generally described in detail only when they first appear; when they are repeated later, they are generally not described again for brevity. When understanding the content of the technical solutions of the present application, reference may be made to the earlier related detailed descriptions of the same or similar term concepts, technical solutions, and/or application scenario descriptions that are not described in detail later.
In this application, the descriptions of the embodiments are focused on, and the details or descriptions of one embodiment may be found in the related descriptions of other embodiments.
The technical features of the technical solutions of the present application may be combined arbitrarily. For brevity of description, not all possible combinations of the technical features in the above embodiments are described; however, as long as there is no contradiction between the combinations of these technical features, they should be considered to fall within the scope of this description.
From the above description of the embodiments, it will be clear to those skilled in the art that the above-described embodiment method may be implemented by means of software plus a necessary general hardware platform, but of course may also be implemented by means of hardware, but in many cases the former is a preferred embodiment. Based on such understanding, the technical solution of the present application may be embodied essentially or in a part contributing to the prior art in the form of a software product stored in a storage medium (e.g. ROM/RAM, magnetic disk, optical disk) as above, including several instructions for causing a terminal device (which may be a mobile phone, a computer, a server, a controlled terminal, or a network device, etc.) to perform the method of each embodiment of the present application.
In the above embodiments, it may be implemented in whole or in part by software, hardware, firmware, or any combination thereof. When implemented in software, may be implemented in whole or in part in the form of a computer program product. The computer program product includes one or more computer instructions. When the computer program instructions are loaded and executed on a computer, the processes or functions in accordance with embodiments of the present application are produced in whole or in part. The computer may be a general purpose computer, a special purpose computer, a network of computers, or other programmable devices. The computer instructions may be stored in a computer storage medium or transmitted from one computer storage medium to another computer storage medium, for example, from one website, computer, server, or data center to another website, computer, server, or data center by a wired (e.g., coaxial cable, fiber optic, digital subscriber line), or wireless (e.g., infrared, wireless, microwave, etc.). Computer storage media may be any available media that can be accessed by a computer or data storage devices, such as servers, data centers, etc. that contain an integration of one or more of the available media. Usable media may be magnetic media (e.g., floppy disks, storage disks, magnetic tape), optical media (e.g., DVD), or semiconductor media (e.g., solid State Disk (SSD)), among others.
The foregoing description covers only the preferred embodiments of the present application and is not intended to limit the scope of the claims. Any equivalent structure or equivalent process derived from the description and drawings of the present application, and any direct or indirect application in other related technical fields, are likewise included within the scope of the claims of the present application.

Claims (10)

1. A split screen method, characterized by comprising the following steps:
S10: determining or generating at least two target applications to be split-screened;
S20: in response to a first operation on the target applications, displaying the target applications in split screen according to a preset mode.
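For illustration only, the two steps of claim 1 could be read as the minimal Kotlin flow below; the `App`, `determineTargetApps`, `PresetMode`, and `onFirstOperation` names are assumptions made for the sketch and do not appear in the patent.

```kotlin
// Minimal sketch of the claim 1 flow (S10 then S20). All names are illustrative assumptions.

data class App(val packageName: String)

// S10: determine or generate at least two target applications to be split-screened.
fun determineTargetApps(candidates: List<App>, isSelected: (App) -> Boolean): List<App> {
    val targets = candidates.filter(isSelected)
    require(targets.size >= 2) { "Split screen needs at least two target applications" }
    return targets
}

// A stand-in for the "preset mode" that governs the split screen layout.
enum class PresetMode { TOP_BOTTOM, LEFT_RIGHT }

// S20: in response to a first operation on the targets, display them in split screen.
fun onFirstOperation(targets: List<App>, mode: PresetMode) {
    targets.forEachIndexed { region, app ->
        println("Showing ${app.packageName} in region $region of $mode layout")
    }
}

fun main() {
    val candidates = listOf(App("com.example.notes"), App("com.example.browser"), App("com.example.music"))
    // Pretend the first two applications were selected by the user.
    val targets = determineTargetApps(candidates) { it.packageName != "com.example.music" }
    onFirstOperation(targets, PresetMode.LEFT_RIGHT)
}
```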
2. The method according to claim 1, wherein step S10 comprises:
detecting whether a display style of an application is in a pressed state;
if the display styles of at least two applications to be split-screened are in the pressed state at the same time, determining the applications to be split-screened as the target applications;
and/or, step S20 comprises:
after the first operation of aggregating and moving each target application is detected, displaying the target applications in split screen according to the preset mode.
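As a rough, non-authoritative reading of claim 2, the sketch below models a display style that can be pressed and a first operation that aggregates and moves the pressed styles; the `DisplayStyle` and `PressSelector` types and the gesture model are hypothetical.

```kotlin
// Illustrative sketch of claim 2: simultaneously pressed display styles select the
// targets (S10); an aggregate-and-move first operation triggers split screen (S20).

data class DisplayStyle(val app: String, var pressed: Boolean = false)

class PressSelector(private val styles: List<DisplayStyle>) {
    // S10: applications whose display styles are pressed at the same time become targets.
    fun currentTargets(): List<String> = styles.filter { it.pressed }.map { it.app }

    // S20 trigger: the first operation aggregates and moves the pressed styles together.
    fun onAggregateMove(display: (List<String>) -> Unit) {
        val targets = currentTargets()
        if (targets.size >= 2) display(targets) // split-screen display per the preset mode
    }
}

fun main() {
    val styles = listOf(DisplayStyle("notes"), DisplayStyle("browser"), DisplayStyle("mail"))
    styles[0].pressed = true
    styles[1].pressed = true
    PressSelector(styles).onAggregateMove { println("Split screen: $it") }
}
```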
3. The method according to claim 1, wherein step S10 comprises:
S101: detecting whether a display style of an application is in a pressed state and in a moving state;
S102: if so, determining the application to be split-screened as a known application among the target applications;
S103: determining a target application other than the known application as an unknown application.
4. The method according to claim 3, wherein step S103 comprises:
determining the unknown application according to a second operation; or
determining the application to be split-screened that overlaps with the known application as the unknown application.
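A minimal sketch of claims 3 and 4, assuming a simple bounding-box model for "overlap": the application whose display style is both pressed and moving is treated as the known target, and the unknown target is resolved by overlap with it. The `Rect` and `Candidate` types are assumptions made for the sketch.

```kotlin
// Sketch of claims 3 and 4: pressed + moving selects the known application (S101/S102);
// the unknown application (S103) is here resolved by geometric overlap with the known one.

data class Rect(val left: Int, val top: Int, val right: Int, val bottom: Int) {
    fun overlaps(other: Rect): Boolean =
        left < other.right && other.left < right && top < other.bottom && other.top < bottom
}

data class Candidate(val app: String, val pressed: Boolean, val moving: Boolean, val bounds: Rect)

fun knownApp(candidates: List<Candidate>): Candidate? =
    candidates.firstOrNull { it.pressed && it.moving }

fun unknownAppByOverlap(candidates: List<Candidate>, known: Candidate): Candidate? =
    candidates.firstOrNull { it != known && it.bounds.overlaps(known.bounds) }

fun main() {
    val a = Candidate("notes", pressed = true, moving = true, bounds = Rect(0, 0, 100, 100))
    val b = Candidate("browser", pressed = false, moving = false, bounds = Rect(50, 50, 150, 150))
    val known = knownApp(listOf(a, b))!!
    println("known=${known.app}, unknown=${unknownAppByOverlap(listOf(a, b), known)?.app}")
}
```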
5. The method according to claim 3, wherein step S20 comprises:
determining all the target applications in response to a second operation on the target applications; and
after all the target applications are determined, displaying the target applications in split screen according to the preset mode.
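Claim 5 can be read as a confirmation step: the second operation finalizes the set of targets before the preset split-screen layout is applied. A hypothetical sketch, with `SplitScreenSession` as an assumed name:

```kotlin
// Sketch of claim 5: a second operation confirms all targets, after which they are
// displayed in split screen according to the preset mode. Names are illustrative.

class SplitScreenSession(private val preset: String) {
    private val targets = mutableListOf<String>()

    fun addKnown(app: String) { targets += app }

    // The second operation supplies the remaining target(s); display follows.
    fun onSecondOperation(vararg apps: String): List<String> {
        targets += apps
        check(targets.size >= 2) { "Need at least two targets before split screen" }
        println("Displaying $targets in split screen, preset=$preset")
        return targets
    }
}

fun main() {
    val session = SplitScreenSession(preset = "LEFT_RIGHT")
    session.addKnown("notes")
    session.onSecondOperation("browser")
}
```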
6. The method according to any one of claims 2 to 5, wherein the display style of the application comprises an application icon and/or an application component, and the application component comprises an application card, an application function shortcut entry, and/or an application window.
7. The method according to any one of claims 2 to 5, further comprising:
if a third operation of exiting the split screen is detected, exiting, based on the third operation, at least one target application that is being displayed in the split screen.
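A possible reading of claim 7, sketched with an assumed `ActiveSplitScreen` session object: the third operation removes at least one of the applications currently shown in split screen, and the split screen itself is dismissed if fewer than two remain.

```kotlin
// Sketch of claim 7: a third operation exits at least one application currently
// displayed in split screen. The session model below is an assumption.

class ActiveSplitScreen(initial: List<String>) {
    private val shown = initial.toMutableList()

    // Third operation: exit the named target; dismiss split screen if <2 apps remain.
    fun onExitOperation(app: String): List<String> {
        shown.remove(app)
        if (shown.size < 2) {
            println("Split screen dismissed; remaining app(s) shown full screen: $shown")
            shown.clear()
        }
        return shown
    }
}

fun main() {
    val split = ActiveSplitScreen(listOf("notes", "browser"))
    split.onExitOperation("browser")
}
```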
8. The method according to any one of claims 1 to 5, further comprising, after step S20:
if a fourth operation of replacing the target application is detected, determining, from split screen history applications or installed applications, a filler application for replacing the target application;
and if a fifth operation of switching the target application is detected, switching the split screen area of the target application in the current preset mode.
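Claim 8's fourth and fifth operations could be sketched as follows, assuming a two-region layout; the `SplitLayout` type, the history/installed lists, and the replacement policy (history first, then installed applications) are illustrative assumptions rather than the patented method.

```kotlin
// Sketch of claim 8: a fourth operation replaces a target with a filler application
// taken from split screen history or installed applications; a fifth operation swaps
// the split screen regions under the current preset mode.

class SplitLayout(var top: String, var bottom: String) {

    // Fourth operation: pick a replacement from history first, then installed apps.
    fun replace(old: String, history: List<String>, installed: List<String>) {
        val filler = (history + installed).firstOrNull { it != top && it != bottom } ?: return
        if (top == old) top = filler else if (bottom == old) bottom = filler
    }

    // Fifth operation: switch the regions occupied by the two targets.
    fun switchRegions() { top = bottom.also { bottom = top } }
}

fun main() {
    val layout = SplitLayout(top = "notes", bottom = "browser")
    layout.replace("browser", history = listOf("mail"), installed = listOf("music"))
    layout.switchRegions()
    println("top=${layout.top}, bottom=${layout.bottom}")
}
```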
9. An intelligent terminal, characterized in that the intelligent terminal comprises a memory and a processor, wherein the memory stores a split screen program which, when executed by the processor, implements the steps of the split screen method according to any one of claims 1 to 8.
10. A storage medium having stored thereon a computer program which, when executed by a processor, implements the steps of the split screen method according to any one of claims 1 to 8.
CN202310303592.3A 2023-03-13 2023-03-13 Split screen method, intelligent terminal and storage medium Pending CN116302285A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310303592.3A CN116302285A (en) 2023-03-13 2023-03-13 Split screen method, intelligent terminal and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202310303592.3A CN116302285A (en) 2023-03-13 2023-03-13 Split screen method, intelligent terminal and storage medium

Publications (1)

Publication Number Publication Date
CN116302285A true CN116302285A (en) 2023-06-23

Family

ID=86820400

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310303592.3A Pending CN116302285A (en) 2023-03-13 2023-03-13 Split screen method, intelligent terminal and storage medium

Country Status (1)

Country Link
CN (1) CN116302285A (en)

Similar Documents

Publication Publication Date Title
CN108037893B (en) Display control method and device of flexible screen and computer readable storage medium
CN112416223A (en) Display method, electronic device and readable storage medium
CN113407081A (en) Display method, mobile terminal and storage medium
CN112558826A (en) Shortcut operation method, mobile terminal and storage medium
CN112199141A (en) Message processing method, terminal and computer readable storage medium
CN114860674B (en) File processing method, intelligent terminal and storage medium
CN116610239A (en) Icon processing method, intelligent terminal and storage medium
CN115914719A (en) Screen projection display method, intelligent terminal and storage medium
CN115494997A (en) Information reading method, intelligent terminal and storage medium
CN114138144A (en) Control method, intelligent terminal and storage medium
CN114741361A (en) Processing method, intelligent terminal and storage medium
CN114443199A (en) Interface processing method, intelligent terminal and storage medium
CN113342246A (en) Operation method, mobile terminal and storage medium
CN113342244A (en) Interface display method, mobile terminal and storage medium
CN108008877B (en) Tab moving method, terminal equipment and computer storage medium
CN116302285A (en) Split screen method, intelligent terminal and storage medium
CN114722010B (en) Folder processing method, intelligent terminal and storage medium
WO2024045155A1 (en) Icon display control method, mobile terminal, and storage medium
CN116225279A (en) Adjustment method, intelligent terminal and storage medium
CN115718580A (en) File opening method, intelligent terminal and storage medium
CN115495419A (en) File transfer method, intelligent terminal and storage medium
CN117032544A (en) Processing method, intelligent terminal and storage medium
CN115098204A (en) Widget display method, intelligent terminal and storage medium
CN115617229A (en) Application classification method, mobile terminal and storage medium
CN115904598A (en) Data processing method, intelligent terminal and storage medium

Legal Events

Date Code Title Description
PB01 Publication