CN110442297B - Split screen display method, split screen display device and terminal equipment

Info

Publication number
CN110442297B
CN110442297B (application CN201910728751.8A)
Authority
CN
China
Prior art keywords
display
screen
split
interface
area
Prior art date
Legal status
Active
Application number
CN201910728751.8A
Other languages
Chinese (zh)
Other versions
CN110442297A (en)
Inventor
孟婉婷
Current Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd filed Critical Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN201910728751.8A priority Critical patent/CN110442297B/en
Publication of CN110442297A publication Critical patent/CN110442297A/en
Application granted granted Critical
Publication of CN110442297B publication Critical patent/CN110442297B/en

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0484: Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/04847: Interaction techniques to control parameter settings, e.g. interaction with sliders or dials
    • G06F 3/0487: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488: Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/04886: Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser, by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus

Abstract

The application relates to the technical field of terminal equipment and provides a split-screen display method, a split-screen display device, a terminal device and a computer-readable storage medium. The split-screen display method includes: when the terminal device is in a preset mode, if an operation of a user moving a first object is detected, acquiring the final position to which the first object is moved; determining the display area in which that final position lies; and if the display area contains a second object, displaying the interface of the first object and the interface of the second object in a split-screen manner. The split-screen display method can improve the efficiency of the user's split-screen operation.

Description

Split screen display method, split screen display device and terminal equipment
Technical Field
The present application belongs to the technical field of terminal devices, and in particular, to a split-screen display method, a split-screen display apparatus, a terminal device, and a computer-readable storage medium.
Background
With the increasingly complex use scenarios of various terminal devices such as mobile phones and notebook computers, a user may need to perform split-screen display on a display screen of the terminal device to view multiple pieces of information simultaneously on multiple interfaces.
Split-screen display methods known to the inventor often involve many steps: for example, the user must first enter a specific interface and then, through preset keys or operations, sequentially select the actual application to be shown on each interface after the split. Such a flow is cumbersome, makes split screen harder for the user to use, and results in low split-screen operation efficiency.
Disclosure of Invention
The embodiment of the application provides a split-screen display method, a split-screen display device, a terminal device and a computer readable storage medium, and can improve the efficiency of split-screen operation of a user.
In a first aspect, an embodiment of the present application provides a split-screen display method, including:
when the terminal equipment is in a preset mode, if the operation that a user moves a first object is detected, acquiring the final position of the movement of the first object;
determining a display area where a final position of the first object is moved;
and if the display area contains a second object, displaying the interface of the first object and the interface of the second object in a split screen mode.
In a second aspect, an embodiment of the present application provides a split-screen display device, including:
a first determining module, configured to acquire the final position to which a first object is moved when the terminal equipment is in a preset mode and an operation of a user moving the first object is detected;
a second determination module, configured to determine a display area where a final position of the first object moves;
and the split screen module is used for displaying the interface of the first object and the interface of the second object in a split screen mode if the display area contains the second object.
In a third aspect, an embodiment of the present application provides a terminal device, which includes a memory, a processor, a display, and a computer program stored in the memory and executable on the processor, where the processor executes the computer program to implement the split-screen display method according to the first aspect.
In a fourth aspect, the present application provides a computer-readable storage medium, where a computer program is stored, and when executed by a processor, the computer program implements the split-screen display method according to the first aspect.
In a fifth aspect, an embodiment of the present application provides a computer program product, which, when running on a terminal device, causes the terminal device to execute the split-screen display method described in the first aspect.
Compared with the prior art, the embodiments of the application have the following advantages. In the embodiments of the application, when the terminal equipment is in a preset mode, if an operation of a user moving a first object is detected, the final position to which the first object is moved is acquired; the display area in which that final position lies is determined; and if the display area contains a second object, the interface of the first object and the interface of the second object are displayed in a split-screen manner. By performing the operation of moving the first object and determining its final position, the user can conveniently and quickly specify the first object and the second object to be displayed in a split screen, which greatly improves split-screen efficiency. The embodiments are simple and easy to implement, efficient to operate, provide a good user experience, and are highly practical and usable.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings in the following description show only some embodiments of the present application, and other drawings can be obtained by those of ordinary skill in the art from these drawings without inventive effort.
Fig. 1 is a schematic structural diagram of a mobile phone to which a split-screen display method according to an embodiment of the present application is applied;
fig. 2 is a schematic flowchart of a split-screen display method according to an embodiment of the present application;
fig. 3 is a schematic flowchart of step S203 according to an embodiment of the present application;
fig. 4 is a schematic diagram of a display interface of a terminal device according to an embodiment of the present application;
fig. 5 is a schematic structural diagram of a split-screen display device according to an embodiment of the present application;
fig. 6 is a schematic structural diagram of a terminal device according to an embodiment of the present application.
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth, such as particular system structures, techniques, etc. in order to provide a thorough understanding of the embodiments of the present application. It will be apparent, however, to one skilled in the art that the present application may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so as not to obscure the description of the present application with unnecessary detail.
It will be understood that the terms "comprises" and/or "comprising," when used in this specification and the appended claims, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
It should also be understood that the term "and/or" as used in this specification and the appended claims refers to and includes any and all possible combinations of one or more of the associated listed items.
As used in this specification and the appended claims, the term "if" may be interpreted, depending on the context, as "when", "upon", "in response to determining" or "in response to detecting". Similarly, the phrase "if it is determined" or "if a [described condition or event] is detected" may be interpreted, depending on the context, to mean "upon determining", "in response to determining", "upon detecting [the described condition or event]" or "in response to detecting [the described condition or event]".
Reference throughout this specification to "one embodiment" or "some embodiments," or the like, means that a particular feature, structure, or characteristic described in connection with the embodiment is included in one or more embodiments of the present application. Thus, appearances of the phrases "in one embodiment," "in some embodiments," "in other embodiments," or the like, in various places throughout this specification are not necessarily all referring to the same embodiment, but rather "one or more but not all embodiments" unless specifically stated otherwise. The terms "comprising," "including," "having," and variations thereof mean "including, but not limited to," unless expressly specified otherwise.
The split-screen display method provided by the embodiment of the application can be applied to terminal devices such as a mobile phone, a tablet personal computer, a wearable device, a vehicle-mounted device, an Augmented Reality (AR)/Virtual Reality (VR) device, a notebook computer, an ultra-mobile personal computer (UMPC), a netbook, a Personal Digital Assistant (PDA) and the like, and the embodiment of the application does not limit the specific type of the terminal device at all.
Take a mobile phone as an example of the terminal device. Fig. 1 is a block diagram illustrating a partial structure of a mobile phone according to an embodiment of the present application. Referring to fig. 1, the mobile phone includes: a Radio Frequency (RF) circuit 110, a memory 120, an input unit 130, a display unit 140, a sensor 150, an audio circuit 160, a wireless fidelity (WiFi) module 170, a processor 180, and a power supply 190. Those skilled in the art will appreciate that the handset configuration shown in fig. 1 is not intended to be limiting; the handset may include more or fewer components than those shown, some components may be combined, or the components may be arranged differently.
The following describes each component of the mobile phone in detail with reference to fig. 1:
the RF circuit 110 may be used for receiving and transmitting signals during information transmission and reception or during a call, and in particular, receives downlink information of a base station and then processes the received downlink information to the processor 180; in addition, the data for designing uplink is transmitted to the base station. Typically, the RF circuitry includes, but is not limited to, an antenna, at least one Amplifier, a transceiver, a coupler, a Low Noise Amplifier (LNA), a duplexer, and the like. In addition, the RF circuitry 110 may also communicate with networks and other devices via wireless communications. The wireless communication may use any communication standard or protocol, including but not limited to Global System for Mobile communication (GSM), General Packet Radio Service (GPRS), Code Division Multiple Access (CDMA), Wideband Code Division Multiple Access (WCDMA), Long Term Evolution (LTE)), e-mail, Short Messaging Service (SMS), and the like.
The memory 120 may be used to store software programs and modules, and the processor 180 executes various functional applications and data processing of the mobile phone by running the software programs and modules stored in the memory 120. The memory 120 may mainly include a program storage area and a data storage area, where the program storage area may store an operating system, an application program required by at least one function (such as a sound playing function, an image playing function, etc.), and the like; the data storage area may store data (such as audio data, a phonebook, etc.) created according to the use of the mobile phone, and the like. Further, the memory 120 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other non-volatile solid-state storage device.
The input unit 130 may be used to receive input numeric or character information and generate key signal inputs related to user settings and function control of the mobile phone 100. Specifically, the input unit 130 may include a touch panel 131 and other input devices 132. The touch panel 131, also referred to as a touch screen, may collect touch operations of a user on or near it (e.g., operations performed by the user on or near the touch panel 131 using any suitable object or accessory such as a finger or a stylus) and drive the corresponding connection device according to a preset program. Optionally, the touch panel 131 may include two parts: a touch detection device and a touch controller. The touch detection device detects the position touched by the user, detects the signal produced by the touch operation, and transmits the signal to the touch controller; the touch controller receives touch information from the touch detection device, converts it into touch point coordinates, sends the coordinates to the processor 180, and can receive and execute commands sent by the processor 180. In addition, the touch panel 131 may be implemented in various types, such as resistive, capacitive, infrared, and surface acoustic wave. The input unit 130 may include other input devices 132 in addition to the touch panel 131. In particular, other input devices 132 may include, but are not limited to, one or more of a physical keyboard, function keys (such as volume control keys and switch keys), a trackball, a mouse, a joystick, and the like.
The display unit 140 may be used to display information input by a user or information provided to the user and various menus of the mobile phone. The Display unit 140 may include a Display panel 141, and optionally, the Display panel 141 may be configured in the form of a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED), or the like. Further, the touch panel 131 can cover the display panel 141, and when the touch panel 131 detects a touch operation on or near the touch panel 131, the touch operation is transmitted to the processor 180 to determine the type of the touch event, and then the processor 180 provides a corresponding visual output on the display panel 141 according to the type of the touch event. Although the touch panel 131 and the display panel 141 are shown as two separate components in fig. 1 to implement the input and output functions of the mobile phone, in some embodiments, the touch panel 131 and the display panel 141 may be integrated to implement the input and output functions of the mobile phone.
The handset 100 may also include at least one sensor 150, such as a light sensor, motion sensor, and other sensors. Specifically, the light sensor may include an ambient light sensor that adjusts the brightness of the display panel 141 according to the brightness of ambient light, and a proximity sensor that turns off the display panel 141 and/or the backlight when the mobile phone is moved to the ear. As one of the motion sensors, the accelerometer sensor can detect the magnitude of acceleration in each direction (generally, three axes), can detect the magnitude and direction of gravity when stationary, and can be used for applications of recognizing the posture of a mobile phone (such as horizontal and vertical screen switching, related games, magnetometer posture calibration), vibration recognition related functions (such as pedometer and tapping), and the like; as for other sensors such as a gyroscope, a barometer, a hygrometer, a thermometer, and an infrared sensor, which can be configured on the mobile phone, further description is omitted here.
The audio circuit 160, speaker 161, and microphone 162 may provide an audio interface between the user and the handset. The audio circuit 160 may transmit the electrical signal converted from the received audio data to the speaker 161, which converts it into a sound signal for output; on the other hand, the microphone 162 converts the collected sound signal into an electrical signal, which is received by the audio circuit 160 and converted into audio data; the audio data is then processed by the processor 180 and transmitted, for example, to another mobile phone via the RF circuit 110, or output to the memory 120 for further processing.
WiFi belongs to short-distance wireless transmission technology, and the mobile phone can help a user to receive and send e-mails, browse webpages, access streaming media and the like through the WiFi module 170, and provides wireless broadband Internet access for the user. Although fig. 1 shows the WiFi module 170, it is understood that it does not belong to the essential constitution of the handset 100, and can be omitted entirely as needed within the scope not changing the essence of the invention.
The processor 180 is a control center of the mobile phone, connects various parts of the entire mobile phone by using various interfaces and lines, and performs various functions of the mobile phone and processes data by operating or executing software programs and/or modules stored in the memory 120 and calling data stored in the memory 120, thereby integrally monitoring the mobile phone. Alternatively, processor 180 may include one or more processing units; preferably, the processor 180 may integrate an application processor, which mainly handles operating systems, user interfaces, application programs, etc., and a modem processor, which mainly handles wireless communications. It will be appreciated that the modem processor described above may not be integrated into the processor 180.
The handset 100 also includes a power supply 190 (e.g., a battery) for powering the various components, which may preferably be logically connected to the processor 180 via a power management system, such that the power management system may be used to manage charging, discharging, and power consumption.
Although not shown, the handset 100 may also include a camera. Optionally, the position of the camera on the mobile phone 100 may be front-located or rear-located, which is not limited in this embodiment of the application.
Optionally, the mobile phone 100 may include a single camera, a dual camera, or a triple camera, which is not limited in this embodiment.
For example, the cell phone 100 may include three cameras, one being a main camera, one being a wide camera, and one being a tele camera.
Optionally, when the mobile phone 100 includes a plurality of cameras, the plurality of cameras may be all front-mounted, all rear-mounted, or a part of the cameras front-mounted and another part of the cameras rear-mounted, which is not limited in this embodiment of the present application.
In addition, although not shown, the mobile phone 100 may further include a bluetooth module or the like, which is not described herein.
Specifically, fig. 2 shows a flowchart of a first split-screen display method provided in an embodiment of the present application, where the split-screen display method can be applied to a terminal device.
The split-screen display method comprises the following steps:
step S201, when the terminal device is in the preset mode, if an operation of the user to move the first object is detected, acquiring a final position where the first object moves.
In the embodiment of the application, the preset mode can be preset by a user or a developer, and the preset mode indicates that the user can perform operations such as screen splitting on the terminal device. Specifically, in some implementations, the terminal device may enter the preset mode after receiving a preset operation of the user, such as clicking or long-pressing one or more designated virtual keys or physical keys, or pulling down on the screen.
When the terminal device is in the preset mode, a display interface of a display screen of the terminal device may display content that is selectable by a user for split-screen display according to a certain preset manner, for example, the display interface may include a plurality of icons, each icon representing a corresponding application program, or may include a plurality of pictures or thumbnails of the pictures, documents, videos, and the like. In the preset mode, the display content of the display screen may be determined according to the actual application scene, which is not limited herein.
In an embodiment of the application, the first object may indicate a piece of content to be displayed in a split screen. The first object may be, for example, an application icon corresponding to an application program, or a thumbnail or file icon corresponding to a file, where the file may be any of various types of computer files, such as pictures, documents, and tables.
The operation of the user moving the first object may be defined according to settings in the terminal device; for example, it may be performed by touching a touch screen, or by a device such as a mouse. The operation of moving the first object may consist of a single specific operation or of several specific operations. For example, in some embodiments, when it is detected that the user has touched a first object (such as an application icon or a thumbnail) displayed on the touch screen of the mobile terminal for a preset duration, the first object may be regarded as selected; after that, if the user keeps touching the touch screen of the terminal device without interruption after selecting the first object, the movement track of this continuous touch operation is detected, the first object is moved along that track, and the operation of moving the first object is considered complete when the continuous touch operation ends.
In some embodiments, optionally, the operation of moving the first object by the user includes a continuous touch operation in which an initial touch position is located in a first image area where the first object is located, and an operation of moving the first object according to a movement trajectory when the user performs the continuous touch operation.
The continuous touch operation of which the initial touch position is located in the first image area where the first object is located may indicate that the user selects the first object, and the operation of moving the first object according to the movement track when the user performs the continuous touch operation may implement the movement of the first object on the display interface. Optionally, the continuous touch operation in which the initial touch position is located in the first image area where the first object is located may include a continuous touch operation in which a duration of the initial touch position in the first image area where the first object is located exceeds a preset duration threshold.
In addition, the final position of the first object may refer to the position of a designated feature point of the first object, or to the position of a first area corresponding to the first object, where the first area may be, for example, the display area covered by the first object. That is, the final position may indicate the position of a point or the position of an area.
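To make the flow of step S201 concrete, the following is a minimal Kotlin sketch; all names and types are hypothetical illustrations and not part of the patent, and the long-press threshold check is omitted for brevity. A gesture counts as moving the first object only if the initial touch lands inside the first object's area, and the last point of the continuous touch is taken as the final position of the move.

```kotlin
// Hypothetical sketch of step S201: record the final position of a continuous touch
// that starts inside the first object's area.
data class Point(val x: Float, val y: Float)

data class Rect(val left: Float, val top: Float, val right: Float, val bottom: Float) {
    fun contains(p: Point) = p.x in left..right && p.y in top..bottom
}

fun finalPositionOfMove(firstObjectArea: Rect, touchTrack: List<Point>): Point? {
    val start = touchTrack.firstOrNull() ?: return null
    if (!firstObjectArea.contains(start)) return null // drag did not start on the first object
    return touchTrack.last()                          // last point of the continuous touch
}

fun main() {
    val iconArea = Rect(0f, 1800f, 200f, 2000f)       // e.g. an icon in a menu bar at the bottom
    val track = listOf(Point(100f, 1900f), Point(300f, 1200f), Point(500f, 700f))
    println(finalPositionOfMove(iconArea, track))     // Point(x=500.0, y=700.0)
}
```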
Step S202, determining a display area where the final position of the first object is moved.
In the embodiment of the application, the display area is an area obtained by pre-dividing a display interface of the terminal device. The display area may be a partial area in the display interface.
In some embodiments, the range of the display area may be displayed to the user on a display interface by displaying a frame of the display area, or the like.
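A similarly hedged sketch of step S202 follows, assuming the display interface has been pre-divided into rectangular display areas; the DisplayArea type and the object identifiers are illustrative assumptions, not terms from the patent.

```kotlin
// Hypothetical sketch of step S202: hit-test the final position against the pre-divided
// display areas to find the one the first object was dropped into.
data class Point(val x: Float, val y: Float)

data class Rect(val left: Float, val top: Float, val right: Float, val bottom: Float) {
    fun contains(p: Point) = p.x in left..right && p.y in top..bottom
}

// A display area records which objects (e.g. application icons or thumbnails) it currently shows.
data class DisplayArea(val id: String, val bounds: Rect, val objectIds: List<String>)

fun areaAtFinalPosition(areas: List<DisplayArea>, finalPos: Point): DisplayArea? =
    areas.firstOrNull { it.bounds.contains(finalPos) }
```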
Step S203, if the display area includes a second object, displaying an interface of the first object and an interface of the second object in a split screen manner.
In this embodiment, if the display area includes the second object, it may be considered that the user desires to perform split-screen display on the first object and the second object. The second object may be an application icon corresponding to an application program, or may be a thumbnail image or a file icon corresponding to a file, and the type of the file may be various types of computer files, such as pictures, documents, tables, and the like. In some embodiments, the second object may be of the same type as the first object. Of course, the second object may also be of a different type than the first object.
It should be noted that the second object may include one or more sub-objects. For example, in some embodiments, the terminal device is already in a split-screen state before the split-screen display method is executed, and the already-split display interface can be split again through this method. For example, if the display interface of the terminal device has been split into two interfaces that respectively display application p and application q, then, when the terminal device is in the preset mode, the application icons corresponding to application p and application q in the two interfaces can both serve as the second object. Further, when the split-screen display method is executed and the interface of the first object and the interface of the second object are displayed in a split-screen manner, the display interface of the terminal device may include three interfaces, which respectively display application p, application q, and the first object.
Specifically, there may be many ways of displaying the interface of the first object and the interface of the second object in a split-screen manner. For example, the areas and positions of the interface of the first object and the interface of the second object may be determined according to a preset splitting direction and splitting ratio, or the positions of the two interfaces may be determined according to the relative positional relationship of the final position of the first object with respect to the display area.
In the embodiment of the application, the user can conveniently and quickly determine the first object and the second object to be displayed in a split screen mode by executing the operation of moving the first object and determining the final position of the first object, so that the split screen efficiency is greatly improved. The embodiment is simple and easy to implement, high in operation efficiency, good in user experience, and high in practicability and usability.
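Putting steps S201 to S203 together, one purely illustrative top-level flow could look as follows; it reuses the hypothetical Point, Rect and DisplayArea types from the sketches above, and the splitScreen callback stands in for whatever window-management call the terminal device actually uses.

```kotlin
// Hypothetical sketch of the overall flow: once the move of the first object ends,
// determine the target display area (step S202) and, if it already contains a second
// object, request split-screen display of the two interfaces (step S203).
fun onFirstObjectMoveFinished(
    firstObjectId: String,
    finalPos: Point,
    areas: List<DisplayArea>,
    splitScreen: (firstId: String, secondId: String, finalPos: Point, area: DisplayArea) -> Unit
) {
    val area = areas.firstOrNull { it.bounds.contains(finalPos) } ?: return // step S202
    val secondId = area.objectIds.firstOrNull() ?: return                   // no second object: nothing to split
    splitScreen(firstObjectId, secondId, finalPos, area)                    // step S203
}
```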
In some embodiments, the step S203 may include the steps of:
step S301, determining a first relative positional relationship between the final position and the display area.
For example, the relative position of the final position with respect to a specified feature point (e.g., a center point, a frame coordinate, etc.) of the display area may be determined, or the first relative positional relationship may also be determined by determining a sub-area (e.g., a lower half area, an upper half area, etc.) of the final position in the display area.
Step S302, according to the first relative position relationship, determining a first display position of the interface of the first object after being split and a second display position of the interface of the second object after being split, and according to the first display position and the second display position, displaying the interface of the first object and the interface of the second object in a split manner.
For example, in some embodiments, if the first relative positional relationship indicates that the final position of the first object lies in the lower half of the display area, the first display position of the interface of the first object after splitting is the lower half of the split-screen display interface, and the second display position of the interface of the second object after splitting is the upper half of the split-screen display interface; if the first relative positional relationship indicates that the final position of the first object lies in the upper half of the display area, the first display position of the interface of the first object after splitting is the upper half of the split-screen display interface, and the second display position of the interface of the second object after splitting is the lower half of the split-screen display interface. Of course, the first relative positional relationship may be defined in other ways, which is not limited here.
In the embodiment of the application, the first relative position relation is determined, so that a user can quickly and conveniently send an instruction for indicating the split-screen display mode of the first object and the second object, tedious multiple selection operation is avoided, and the operation efficiency is improved.
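The first relative positional relationship can be reduced to a very small decision. The sketch below is a hypothetical illustration, not the patent's implementation, and uses the common screen convention that the y coordinate grows downwards.

```kotlin
// Hypothetical sketch of steps S301/S302: the half of the display area in which the
// final position lands decides which half of the split layout each interface occupies.
data class Point(val x: Float, val y: Float)
data class Rect(val left: Float, val top: Float, val right: Float, val bottom: Float)

enum class SplitSlot { UPPER_HALF, LOWER_HALF }

// Returns (slot of the first object's interface, slot of the second object's interface).
fun assignSplitSlots(finalPos: Point, area: Rect): Pair<SplitSlot, SplitSlot> {
    val midY = (area.top + area.bottom) / 2f
    return if (finalPos.y > midY)                      // screen y grows downwards
        SplitSlot.LOWER_HALF to SplitSlot.UPPER_HALF   // dropped in the lower half
    else
        SplitSlot.UPPER_HALF to SplitSlot.LOWER_HALF   // dropped in the upper half
}
```

Under this convention, dropping the first object into the lower half of the display area places its interface below and the second object's interface above, which matches the example described above and the picture example discussed later with reference to fig. 4.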
Optionally, the second object comprises a plurality of sub-objects;
the determining a first relative positional relationship of the final position to the display area includes:
determining a second relative position relation between the final position and each sub-object in the display area;
the determining, according to the first relative position relationship, a first display position of the interface of the first object after being split and a second display position of the interface of the second object after being split, and displaying, according to the first display position and the second display position, the interface of the first object and the interface of the second object in a split manner, includes:
and determining a first display position of the interface of the first object after screen division and a third display position corresponding to the interface of each sub-object after screen division according to the second relative position relationship, and displaying the interface of the first object and the interface of each sub-object in a screen division manner according to the first display position and the third display position.
In some embodiments, the terminal device is already in a split-screen state before the split-screen display method of this embodiment is executed, and the already-split display interface can be split again through the method. In this case, the content of each previously split interface can be indicated by a second object comprising a plurality of sub-objects, and the relative positional relationship between the sub-objects can correspond to the relative positional relationship of the interfaces in the previously split display interface. Of course, the sub-objects of the second object may also apply to other scenarios; for example, the final position of the first object may lie in several sub-display areas at the same time, and the objects corresponding to those sub-display areas may be regarded as the sub-objects of the second object. The specific way of setting the sub-objects and the application scenario are not limited here.
In the embodiment of the application, the user can quickly and conveniently issue an instruction indicating the split-screen display mode of the first object and the second object through the first relative positional relationship. Moreover, with one continuous touch operation the user can complete at once a series of instructions, including selecting the first object and the second object and determining the split-screen display mode, which reduces the number of steps in the split-screen display flow and improves the efficiency of the user's split-screen operation. In addition, the number of split screens is not limited in this embodiment: split-screen display of multiple screens can be realized, with even higher operation efficiency.
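For the case where the screen is already split and the second object therefore has several sub-objects, one illustrative way to re-split (all names hypothetical, equal vertical shares assumed purely for the sketch) is to keep one slot per existing sub-object and insert one more slot for the first object:

```kotlin
// Hypothetical sketch of re-splitting an already split screen: the new layout contains
// one interface per existing sub-object plus one for the first object, stacked
// vertically with equal shares of the display interface.
data class Slot(val objectId: String, val topFraction: Float, val bottomFraction: Float)

fun resplit(firstObjectId: String, subObjectsTopToBottom: List<String>, insertIndex: Int): List<Slot> {
    val order = subObjectsTopToBottom.toMutableList().apply { add(insertIndex, firstObjectId) }
    val share = 1f / order.size // equal vertical shares; a real device might use other ratios
    return order.mapIndexed { i, id -> Slot(id, i * share, (i + 1) * share) }
}

fun main() {
    // Screen already split between applications "p" (top) and "q" (bottom); dropping the
    // first object "r" between them yields three stacked interfaces: p, r, q.
    resplit("r", listOf("p", "q"), insertIndex = 1).forEach(::println)
}
```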
Optionally, in some embodiments, the split-screen display method further includes:
if the terminal equipment is in a preset mode, displaying a foreground display object area and a menu bar area of a foreground display object on a screen of the terminal equipment, wherein the foreground display object area comprises the foreground display object, the first object is located in the foreground display object area or located in the menu bar area, and the second object is located in the foreground display object area or located in the menu bar area.
In this embodiment, the foreground display object may refer to an application running in the foreground of the terminal device, a file (such as a picture, document, table, or video) displayed in the foreground, and the like. The menu bar area is determined according to the system settings and the like of the terminal device. It should be noted that there may be more than one foreground display object; for example, in a terminal device whose screen is already split, the foreground display objects may include the applications displayed on the respective interfaces. The areas where the first object and the second object are located are not limited, that is, the user may select content displayed in the foreground display object area and/or the menu bar area for split-screen display.
In some embodiments, the foreground display object region may be laid out in a different manner than the menu bar region, so that a user can clearly distinguish the foreground display object region from the menu bar region.
Optionally, after the foreground display object area and the menu bar area of the foreground display object are displayed on the screen of the terminal device, the method further includes:
and sequentially displaying one or more preset objects in the menu bar area in a preset order according to the sliding direction of the user's sliding operation in the menu bar area, wherein the preset order is determined according to the most recent use time of each preset object or its use frequency within a preset time period.
In some embodiments, the sliding direction of the sliding operation may be set according to an actual scene, for example, the sliding operation may be sliding left, sliding right, sliding up or sliding down on the screen, or the like.
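As a small illustration of the menu bar ordering (the function and field names below are assumptions, not terms from the patent), the preset objects could simply be sorted by most recent use time or by use count within the preset period, and the sliding operation would then page through this ordered list.

```kotlin
// Hypothetical sketch: order the preset objects shown in the menu bar either by the
// most recent use time or by how often they were used within a preset period.
data class PresetObject(val id: String, val lastUsedMs: Long, val usesInPeriod: Int)

fun menuBarOrder(objects: List<PresetObject>, orderByRecency: Boolean): List<PresetObject> =
    if (orderByRecency) objects.sortedByDescending { it.lastUsedMs }
    else objects.sortedByDescending { it.usesInPeriod }
```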
Fig. 4 is a schematic diagram of a display interface of the terminal device according to some embodiments, in which A, B, C, D, and E indicate icons of five pictures in an album. The first object may be C: the user may move C, through a long-press drag gesture, from position 1 in a menu bar area below the display interface to position 2 in the area where A is located. At this point, when the long-press drag operation on C ends, A and C may be displayed in a split-screen manner; and since C lies in the lower half of the area where A is located when the long-press drag ends, the interface of A can be displayed on the upper half of the display interface of the terminal device and the interface of C on the lower half.
In this embodiment, the foreground display object area and the menu bar area allow the user to conveniently and intuitively select the content to be displayed in a split screen, without repeatedly opening interfaces to select, one by one, the content to be shown on each interface after the split, which improves the efficiency of the split-screen operation.
It should be understood that, the sequence numbers of the steps in the foregoing embodiments do not imply an execution sequence, and the execution sequence of each process should be determined by its function and inherent logic, and should not constitute any limitation to the implementation process of the embodiments of the present application.
Fig. 5 shows a block diagram of a split-screen display device provided in the embodiment of the present application, which corresponds to the split-screen display method described in the foregoing embodiment, and for convenience of description, only the parts related to the embodiment of the present application are shown.
Referring to fig. 5, the split-screen display device 5 includes: a first determination module 501, a second determination module 502, and a split screen module 503. Wherein:
a first determining module 501, configured to, when a terminal device is in a preset mode, if an operation of a user moving a first object is detected, obtain a final position where the first object moves;
a second determining module 502, configured to determine a display area where a final position of the first object moves;
a split screen module 503, configured to display an interface of the first object and an interface of the second object in a split screen manner if the display area includes the second object.
Optionally, the split screen module 503 specifically includes:
a first determination unit configured to determine a first relative positional relationship between the final position and the display area;
and the screen splitting unit is used for determining a first display position of the interface of the first object after screen splitting and a second display position of the interface of the second object after screen splitting according to the first relative position relationship, and displaying the interface of the first object and the interface of the second object in a screen splitting manner according to the first display position and the second display position.
Optionally, the second object comprises a plurality of sub-objects;
the first determining unit is specifically configured to:
determining a second relative position relation between the final position and each sub-object in the display area;
the split screen unit is specifically configured to:
and determining a first display position of the interface of the first object after screen division and a third display position corresponding to the interface of each sub-object after screen division according to the second relative position relationship, and displaying the interface of the first object and the interface of each sub-object in a screen division manner according to the first display position and the third display position.
Optionally, the operation of moving the first object by the user includes a continuous touch operation in which an initial touch position is located in a first image area where the first object is located, and an operation of moving the first object according to a movement trajectory when the user performs the continuous touch operation.
Optionally, the split-screen display device 5 further includes:
the first display module is used for displaying a foreground display object area and a menu bar area of a foreground display object on a screen of the terminal equipment if the terminal equipment is in a preset mode, wherein the foreground display object area contains the foreground display object, the first object is located in the foreground display object area or located in the menu bar area, and the second object is located in the foreground display object area or located in the menu bar area.
Optionally, the split-screen display device 5 further includes:
and the second display module is used for sequentially displaying one or more preset objects in the menu bar area in a preset order according to the sliding direction of the user's sliding operation in the menu bar area, wherein the preset order is determined according to the most recent use time of each preset object or its use frequency within a preset time period.
Through the embodiment of the application, the user can conveniently and quickly determine the first object and the second object to be displayed in a split screen mode by executing the operation of moving the first object and determining the final position of the first object, and therefore the split screen efficiency is greatly improved. The embodiment is simple and easy to implement, high in operation efficiency, good in user experience, and high in practicability and usability.
It should be noted that, for the information interaction, execution process, and other contents between the above-mentioned devices/units, the specific functions and technical effects thereof are based on the same concept as those of the embodiment of the method of the present application, and specific reference may be made to the part of the embodiment of the method, which is not described herein again.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above-mentioned division of the functional units and modules is illustrated, and in practical applications, the above-mentioned function distribution may be performed by different functional units and modules according to needs, that is, the internal structure of the apparatus is divided into different functional units or modules to perform all or part of the above-mentioned functions. Each functional unit and module in the embodiments may be integrated in one processing unit, or each unit may exist alone physically, or two or more units are integrated in one unit, and the integrated unit may be implemented in a form of hardware, or in a form of software functional unit. In addition, specific names of the functional units and modules are only for convenience of distinguishing from each other, and are not used for limiting the protection scope of the present application. The specific working processes of the units and modules in the system may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
Fig. 6 is a schematic structural diagram of a terminal device according to an embodiment of the present application. As shown in fig. 6, the terminal device 6 of this embodiment includes: at least one processor 60 (only one shown in fig. 6), a memory 61, and a computer program 62 stored in the memory 61 and executable on the at least one processor 60, wherein the processor 60 executes the computer program 62 to implement the steps of any of the various split-screen display method embodiments described above.
The terminal device 6 may be a desktop computer, a notebook, a palm computer, a cloud server, or another computing device. The terminal device may include, but is not limited to, the processor 60 and the memory 61. Those skilled in the art will appreciate that fig. 6 is only an example of the terminal device 6 and does not constitute a limitation on the terminal device 6, which may include more or fewer components than those shown, or combine some components, or have different components, such as an input/output device, a network access device, and the like.
The processor 60 may be a Central Processing Unit (CPU), or another general-purpose processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor or the like.
The memory 61 may in some embodiments be an internal storage unit of the terminal device 6, such as a hard disk or a memory of the terminal device 6. The memory 61 may also be an external storage device of the terminal device 6 in other embodiments, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) Card, a Flash memory Card (Flash Card), and the like, which are equipped on the terminal device 6. Further, the memory 61 may also include both an internal storage unit and an external storage device of the terminal device 6. The memory 61 is used for storing an operating system, an application program, a BootLoader (BootLoader), data, and other programs, such as program codes of the computer program. The memory 61 may also be used to temporarily store data that has been output or is to be output.
The embodiments of the present application further provide a computer-readable storage medium, where a computer program is stored, and when the computer program is executed by a processor, the computer program implements the steps in the above-mentioned method embodiments.
The embodiments of the present application provide a computer program product which, when run on a mobile terminal, causes the mobile terminal to implement the steps in the above method embodiments.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on such understanding, all or part of the processes in the methods of the embodiments described above can be implemented by a computer program, which can be stored in a computer-readable storage medium and, when executed by a processor, implements the steps of the method embodiments described above. The computer program comprises computer program code, which may be in the form of source code, object code, an executable file, some intermediate form, or the like. The computer-readable medium may include at least: any entity or device capable of carrying the computer program code to a photographing apparatus/terminal apparatus, a recording medium, computer memory, Read-Only Memory (ROM), Random Access Memory (RAM), an electrical carrier signal, a telecommunications signal, and a software distribution medium, for example a USB flash disk, a removable hard disk, a magnetic disk, or an optical disk. In certain jurisdictions, computer-readable media may not include electrical carrier signals or telecommunications signals, in accordance with legislation and patent practice.
In the above embodiments, the descriptions of the respective embodiments have respective emphasis, and reference may be made to the related descriptions of other embodiments for parts that are not described or illustrated in a certain embodiment.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus/network device and method may be implemented in other ways. For example, the above-described apparatus/network device embodiments are merely illustrative, and for example, the division of the modules or units is only one logical division, and there may be other divisions when actually implementing, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not implemented. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
The above-mentioned embodiments are only used for illustrating the technical solutions of the present application, and not for limiting the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; such modifications and substitutions do not substantially depart from the spirit and scope of the embodiments of the present application and are intended to be included within the scope of the present application.

Claims (10)

1. A split-screen display method, comprising:
when the terminal equipment is in a preset mode, if the operation that a user moves a first object is detected, acquiring the final position of the movement of the first object, wherein the first object is an application icon corresponding to an application program, or a thumbnail or a file icon corresponding to a file;
determining a display area where a final position of the first object is moved;
if the display area contains a second object, displaying an interface of the first object and an interface of the second object in a split screen mode, wherein the second object comprises a plurality of sub-objects, the number of the corresponding interfaces of the second object corresponds to the number of the sub-objects, and the second object is an application icon corresponding to an application program or a thumbnail or a file icon corresponding to a file;
before the split-screen display method is executed, the terminal equipment is already in a split-screen state, each interface corresponds to one sub-object, and the split-screen display interface is split again through the split-screen display method.
2. The split-screen display method of claim 1, wherein if the display area contains a second object, the split-screen displaying an interface of the first object and an interface of the second object comprises:
determining a first relative positional relationship of the final position to the display area;
and determining a first display position of the interface of the first object after screen division and a second display position of the interface of the second object after screen division according to the first relative position relationship, and displaying the interface of the first object and the interface of the second object in a screen division manner according to the first display position and the second display position.
3. The split-screen display method as claimed in claim 2, wherein the determining of the first relative positional relationship of the final position to the display area comprises:
determining a second relative position relation between the final position and each sub-object in the display area;
the determining, according to the first relative position relationship, a first display position of the interface of the first object after being split and a second display position of the interface of the second object after being split, and displaying, according to the first display position and the second display position, the interface of the first object and the interface of the second object in a split manner, includes:
and determining a first display position of the interface of the first object after screen division and a third display position corresponding to the interface of each sub-object after screen division according to the second relative position relationship, and displaying the interface of the first object and the interface of each sub-object in a screen division manner according to the first display position and the third display position.
4. The screen-division display method of claim 1, wherein the operation of the user moving the first object comprises a continuous touch operation in which an initial touch position is located in a first image area where the first object is located, and an operation of moving the first object according to a movement trajectory when the user performs the continuous touch operation.
5. The split-screen display method according to any one of claims 1 to 4, further comprising:
when the terminal device is in the preset mode, displaying a foreground display object area and a menu bar area of a foreground display object on a screen of the terminal device, wherein the foreground display object area contains the foreground display object, the first object is located in the menu bar area, and the second object is located in the foreground display object area.
6. The split-screen display method of claim 5, wherein, after the screen of the terminal device displays the foreground display object area and the menu bar area of the foreground display object, the method further comprises:
sequentially displaying one or more preset objects in the menu bar area in a preset order according to the sliding direction of a sliding operation performed by the user in the menu bar area, wherein the preset order is determined according to the most recent use time of each preset object or its use frequency within a preset time period.
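Illustrative note (not part of the claims): a sketch of ordering the preset objects shown in the menu bar of claim 6 by most recent use time or by use frequency within a preset window, and advancing the visible items according to the slide direction. PresetObject and its field names are hypothetical.

    data class PresetObject(
        val obj: ScreenObject,
        val lastUsedEpochMillis: Long,   // most recent use time
        val usesInWindow: Int            // use frequency within the preset time period
    )

    enum class SlideDirection { FORWARD, BACKWARD }

    // The preset order: by recency or by frequency, most relevant first.
    fun orderedForMenuBar(presets: List<PresetObject>, byRecency: Boolean): List<PresetObject> =
        if (byRecency) presets.sortedByDescending { it.lastUsedEpochMillis }
        else presets.sortedByDescending { it.usesInWindow }

    // Moves the window of visible menu-bar items one page in the slide direction.
    fun nextVisibleWindow(
        ordered: List<PresetObject>,
        currentStart: Int,
        visibleCount: Int,
        direction: SlideDirection
    ): List<PresetObject> {
        val maxStart = (ordered.size - visibleCount).coerceAtLeast(0)
        val start = when (direction) {
            SlideDirection.FORWARD -> (currentStart + visibleCount).coerceAtMost(maxStart)
            SlideDirection.BACKWARD -> (currentStart - visibleCount).coerceAtLeast(0)
        }
        return ordered.drop(start).take(visibleCount)
    }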
7. A split-screen display device, comprising:
a first determining module, configured to acquire, when a terminal device is in a preset mode and an operation of a user moving a first object is detected, a final position to which the first object is moved, wherein the first object is an application icon corresponding to an application program, or a thumbnail or a file icon corresponding to a file;
a second determining module, configured to determine a display area in which the final position of the first object is located;
a split-screen module, configured to display an interface of the first object and an interface of the second object in a split-screen manner if the display area contains a second object, wherein the second object comprises a plurality of sub-objects, the number of interfaces corresponding to the second object corresponds to the number of the sub-objects, and the second object is an application icon corresponding to an application program, or a thumbnail or a file icon corresponding to a file;
wherein, before the first determining module, the second determining module and the split-screen module operate, the terminal device is already in a split-screen state, each displayed interface corresponds to one sub-object, and the split-screen display interface is split again by the first determining module, the second determining module and the split-screen module.
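Illustrative note (not part of the claims): a sketch of the module decomposition in claim 7, with one hypothetical interface per claimed module; the types reuse those from the earlier notes and each module mirrors one step of the method of claim 1.

    interface FirstDeterminingModule {
        // Detects the move operation in the preset mode and yields the final position, if any.
        fun finalPositionOfMovedObject(): Point?
    }

    interface SecondDeterminingModule {
        // Locates the display area in which the final position lies.
        fun displayAreaAt(position: Point): DisplayArea?
    }

    interface SplitScreenModule {
        // Splits the area again and shows the first object's interface beside the occupant's.
        fun splitAndShow(area: DisplayArea, first: ScreenObject)
    }

    class SplitScreenDisplayDevice(
        private val detect: FirstDeterminingModule,
        private val locate: SecondDeterminingModule,
        private val split: SplitScreenModule
    ) {
        fun onUserMove(first: ScreenObject) {
            val position = detect.finalPositionOfMovedObject() ?: return
            val area = locate.displayAreaAt(position) ?: return
            if (area.occupant != null) split.splitAndShow(area, first)
        }
    }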
8. The split-screen display device of claim 7, wherein the split-screen module comprises:
a first determining unit, configured to determine a first relative positional relationship between the final position and the display area;
and a split-screen unit, configured to determine, according to the first relative positional relationship, a first display position of the interface of the first object after the screen is split and a second display position of the interface of the second object after the screen is split, and to display the interface of the first object and the interface of the second object in a split-screen manner according to the first display position and the second display position.
9. A terminal device comprising a memory, a processor, a display, and a computer program stored in the memory and executable on the processor, wherein the processor implements the split-screen display method according to any one of claims 1 to 6 when executing the computer program.
10. A computer-readable storage medium, in which a computer program is stored, wherein the computer program, when executed by a processor, implements the split-screen display method according to any one of claims 1 to 6.
CN201910728751.8A 2019-08-08 2019-08-08 Split screen display method, split screen display device and terminal equipment Active CN110442297B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910728751.8A CN110442297B (en) 2019-08-08 2019-08-08 Split screen display method, split screen display device and terminal equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910728751.8A CN110442297B (en) 2019-08-08 2019-08-08 Split screen display method, split screen display device and terminal equipment

Publications (2)

Publication Number Publication Date
CN110442297A CN110442297A (en) 2019-11-12
CN110442297B (en) 2021-08-27

Family

ID=68433840

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910728751.8A Active CN110442297B (en) 2019-08-08 2019-08-08 Split screen display method, split screen display device and terminal equipment

Country Status (1)

Country Link
CN (1) CN110442297B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110941340B (en) * 2019-11-28 2022-06-17 维沃移动通信有限公司 Split screen display method and terminal equipment
CN112199017A (en) * 2020-09-30 2021-01-08 京东方科技集团股份有限公司 Split-screen interaction method and device, electronic equipment and readable storage medium
CN113703903A (en) * 2021-09-10 2021-11-26 广州朗国电子科技股份有限公司 Split screen display method and device
CN113703902A (en) * 2021-09-10 2021-11-26 广州朗国电子科技股份有限公司 Menu bar construction method and device for split screen display

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103885711A (en) * 2014-03-21 2014-06-25 深圳市东方拓宇科技有限公司 Method and system for controlling screen splitting of electronic device
CN104978123A (en) * 2015-06-29 2015-10-14 努比亚技术有限公司 Screen division method and apparatus
EP3493042A1 (en) * 2012-09-24 2019-06-05 Samsung Electronics Co., Ltd. Method and apparatus for executing applications in a touch device

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10585553B2 (en) * 2012-12-06 2020-03-10 Samsung Electronics Co., Ltd. Display device and method of controlling the same
EP2767896B1 (en) * 2013-02-14 2019-01-16 LG Electronics Inc. Mobile terminal and method of controlling the mobile terminal
CN103324435B (en) * 2013-05-24 2017-02-08 华为技术有限公司 Multi-screen display method and device and electronic device thereof
CN106648314A (en) * 2016-12-09 2017-05-10 珠海市魅族科技有限公司 Method and device for splitting screen
CN107193516A (en) * 2017-04-21 2017-09-22 北京安云世纪科技有限公司 Display methods, device and the mobile terminal of content are managed in sidebar
CN108595100B (en) * 2018-04-19 2020-05-12 Oppo广东移动通信有限公司 Split screen display method and device, storage medium and electronic equipment

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3493042A1 (en) * 2012-09-24 2019-06-05 Samsung Electronics Co., Ltd. Method and apparatus for executing applications in a touch device
CN103885711A (en) * 2014-03-21 2014-06-25 深圳市东方拓宇科技有限公司 Method and system for controlling screen splitting of electronic device
CN104978123A (en) * 2015-06-29 2015-10-14 努比亚技术有限公司 Screen division method and apparatus

Also Published As

Publication number Publication date
CN110442297A (en) 2019-11-12

Similar Documents

Publication Publication Date Title
CN110442297B (en) Split screen display method, split screen display device and terminal equipment
US20200183574A1 (en) Multi-Task Operation Method and Electronic Device
US10725646B2 (en) Method and apparatus for switching screen interface and terminal
WO2020063091A1 (en) Picture processing method and terminal device
CN108446058B (en) Mobile terminal operation method and mobile terminal
CN110007835B (en) Object management method and mobile terminal
CN110007996B (en) Application program management method and terminal
CN105518605A (en) Touch operation method and apparatus for terminal
CN110032309B (en) Screen splitting method and terminal equipment
CN110196668B (en) Information processing method and terminal equipment
CN108920069B (en) Touch operation method and device, mobile terminal and storage medium
CN108228902B (en) File display method and mobile terminal
EP3699743B1 (en) Image viewing method and mobile terminal
CN110673770B (en) Message display method and terminal equipment
CN110752981B (en) Information control method and electronic equipment
CN107193451B (en) Information display method and device, computer equipment and computer readable storage medium
CN106445340B (en) Method and device for displaying stereoscopic image by double-screen terminal
CN111026299A (en) Information sharing method and electronic equipment
CN110865745A (en) Screen capturing method and terminal equipment
CN113552986A (en) Multi-window screen capturing method and device and terminal equipment
CN110795189A (en) Application starting method and electronic equipment
CN108762613B (en) State icon display method and mobile terminal
CN110768804A (en) Group creation method and terminal device
CN108170329B (en) Display control method and terminal equipment
CN110705497A (en) Image frame processing method and device, terminal equipment and computer readable storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant