CN109710130B - Display method and terminal - Google Patents

Display method and terminal

Info

Publication number
CN109710130B
CN109710130B
Authority
CN
China
Prior art keywords
screen
input
target object
movement
terminal
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201811613851.8A
Other languages
Chinese (zh)
Other versions
CN109710130A (en)
Inventor
唐俊坤
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Vivo Mobile Communication Co Ltd
Original Assignee
Vivo Mobile Communication Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Vivo Mobile Communication Co Ltd filed Critical Vivo Mobile Communication Co Ltd
Priority to CN201811613851.8A
Publication of CN109710130A
Application granted granted Critical
Publication of CN109710130B

Landscapes

  • User Interface Of Digital Computer (AREA)
  • Telephone Function (AREA)

Abstract

The embodiment of the invention provides a display method and a terminal. The display method is applied to the terminal, the terminal comprises a first screen and a second screen, and the display method comprises the following steps: receiving a user's movement input on a target object displayed on the first screen; and placing the target object at a target position of the second screen according to parameters of the movement input. In this way, different screens of a multi-screen terminal can interact with each other, the user can transfer content from one display screen to another more conveniently, and using the multi-screen terminal becomes more convenient and versatile.

Description

Display method and terminal
Technical Field
The embodiment of the invention relates to the technical field of display, in particular to a display method and a terminal.
Background
With the development of science and technology, dual-screen technology has matured and dual-screen terminals have become a popular trend. The two screens allow the terminal's display to be used more creatively; for example, the user can operate a dual-screen terminal in different display modes, such as a single-screen mode, a dual-screen mode, or a split-screen mode.
However, in the prior art, although the different display modes of a dual-screen terminal can handle different usage scenarios, the two screens cannot interact with each other. For example, when the user wants to move content displayed on one screen of the dual-screen terminal to the other screen, the user's finger has to travel a large distance across both screens, so the move is slow and the dual-screen terminal is not convenient enough.
Disclosure of Invention
The embodiments of the invention provide a display method and a terminal, and aim to solve the problems that existing dual screens cannot interact with each other, and that moving the display content of one screen to the other screen is slow and inconvenient.
In order to solve the technical problem, the invention is realized as follows:
In a first aspect, an embodiment of the present invention provides a display method, which is applied to a terminal, where the terminal includes a first screen and a second screen, and the display method includes:
receiving a user's movement input on a target object displayed on the first screen;
and placing the target object at a target position of the second screen according to parameters of the movement input.
In a second aspect, an embodiment of the present invention further provides a terminal, including a first screen and a second screen, where the terminal further includes:
a first receiving module, configured to receive a user's movement input on a target object displayed on the first screen;
and a placing module, configured to place the target object at a target position of the second screen according to parameters of the movement input.
In a third aspect, an embodiment of the present invention further provides a terminal, which includes a first screen, a second screen, a processor, a memory, and a computer program stored in the memory and executable on the processor, where the computer program, when executed by the processor, implements the steps of the display method described above.
In a fourth aspect, the embodiment of the present invention further provides a computer-readable storage medium, where a computer program is stored on the computer-readable storage medium, and when the computer program is executed by a processor, the steps of the display method described above are implemented.
In the embodiments of the invention, by receiving a user's movement input on a target object displayed on the first screen, the target object can be placed at a target position of the second screen. Different screens of a multi-screen terminal can thus interact with each other, the user can transfer content from one display screen to another more conveniently, and the multi-screen terminal becomes more convenient and versatile to use.
Drawings
FIG. 1 is a schematic flow chart of a display method according to a first embodiment of the present invention;
FIG. 2 is a schematic structural diagram of a terminal according to a second embodiment of the present invention;
FIG. 3 is a schematic structural diagram of a terminal according to a third embodiment of the present invention;
FIG. 4 is a schematic diagram of a hardware structure of a terminal implementing various embodiments of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions of the embodiments of the present invention will be clearly and completely described below with reference to the drawings of the embodiments of the present invention. It is to be understood that the embodiments described are only a few embodiments of the present invention, and not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the described embodiments of the invention, are within the scope of the invention.
Referring to FIG. 1, FIG. 1 is a schematic flowchart of a display method according to a first embodiment of the present invention. The method is applied to a terminal that includes a first screen and a second screen, and includes:
step 11: receiving a user's movement input of a target object displayed on the first screen;
step 12: placing the target object at a target position of the second screen according to parameters of the movement input.
By adopting the display method provided by the embodiment of the invention, different screens of the multi-screen terminal can interact, the user can transfer content from one display screen to another more conveniently, and the multi-screen terminal becomes more convenient and versatile to use.
In the above embodiment, the terminal may be a terminal whose first side is provided with the first screen and whose second side is provided with the second screen (for example, the front carries the first screen and the back carries the second screen, or vice versa); it may also be a terminal with a foldable display screen that, when folded, is divided into at least two display screens, including the first screen and the second screen. The invention is not limited in this respect.
In the above embodiment, the movement input is a drag input or a slide input.
In the above embodiment, the target object is an application icon or an application window.
Specifically, when the screens interact, the target object of the interaction is an application icon on the desktop of the terminal's first screen, or an application window displayed on the first screen (for example, an application window playing a video).
In the embodiment of the present invention, step 12 includes:
when the terminal is in a multi-screen interaction mode, placing the target object at the target position of the second screen according to the parameters of the movement input.
That is to say, when the terminal is in the multi-screen interaction mode, interaction is performed among multiple screens of the terminal.
There are various situations in which the control terminal enters the multi-screen interaction mode, which are described below.
As one optional embodiment: the terminal is controlled to enter a multi-screen interaction mode according to the received movement input of the user on the target object displayed on the first screen.
Specifically, the movement input satisfies at least one of the following conditions:
the movement rate of the movement input exceeds a preset rate;
the moving distance of the moving input exceeds a preset distance;
the termination position of the movement input is located at an edge of the first screen.
That is to say, when the user's movement input on the first screen meets at least one of the above conditions, the terminal enters the multi-screen interaction mode, so that the target object on the first screen can be placed on the second screen automatically; the user's operation is simpler and the degree of intelligence is higher.
Of course, the condition that a movement input must satisfy to trigger the terminal into the multi-screen interaction mode can be set according to the user's requirements. For example, the user may set that the terminal enters the multi-screen interaction mode when the movement acceleration of the movement input exceeds a preset acceleration threshold, and/or when the movement speed of the movement input increases as the movement distance increases.
In short, whether the terminal enters the multi-screen interaction mode is determined from the movement input itself, which simplifies the user's operation and improves convenience. A minimal sketch of such a gating check is given below.
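The rate, distance, and end-position conditions can be pictured with a small Kotlin sketch. All names and threshold values below (MoveInput, PRESET_RATE_PX_PER_MS, and so on) are illustrative assumptions, not values taken from the patent:

```kotlin
// Hypothetical model of a movement input; field names are illustrative only.
data class MoveInput(
    val distancePx: Float,      // total distance travelled by the gesture
    val durationMs: Long,       // gesture duration
    val endX: Float,
    val endY: Float,
    val screenWidth: Float,
    val screenHeight: Float
)

// Assumed threshold values; the patent leaves them configurable.
const val PRESET_RATE_PX_PER_MS = 1.5f
const val PRESET_DISTANCE_PX = 400f
const val EDGE_MARGIN_PX = 24f

// Enter the multi-screen interaction mode if at least one condition holds:
// the movement rate exceeds a preset rate, the movement distance exceeds a
// preset distance, or the gesture ends at an edge of the first screen.
fun shouldEnterMultiScreenMode(input: MoveInput): Boolean {
    val rate = if (input.durationMs > 0) input.distancePx / input.durationMs else 0f
    val endsAtEdge = input.endX <= EDGE_MARGIN_PX ||
            input.endX >= input.screenWidth - EDGE_MARGIN_PX ||
            input.endY <= EDGE_MARGIN_PX ||
            input.endY >= input.screenHeight - EDGE_MARGIN_PX
    return rate > PRESET_RATE_PX_PER_MS ||
            input.distancePx > PRESET_DISTANCE_PX ||
            endsAtEdge
}
```

Any one of the three conditions being true is enough to switch the terminal into the multi-screen interaction mode, matching the "at least one of" wording above.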
As another alternative embodiment: the terminal is controlled to enter a multi-screen interaction mode according to a received press input by the user in a preset display area of the first screen.
Specifically, before the step of receiving a movement input of the target object displayed on the first screen by the user, the method further includes:
receiving a pressing input pressed by a user in a preset display area of the first screen;
acquiring a pressure value corresponding to the press input;
and under the condition that the pressure value is greater than or equal to a preset pressure threshold value, executing the step of receiving the movement input of the user to the target object displayed on the first screen.
In other words, the terminal is controlled to enter the multi-screen interaction mode when the pressure value is greater than or equal to the preset pressure threshold;
and, in the multi-screen interaction mode, the step of receiving the user's movement input on the target object displayed on the first screen is executed.
Specifically, the user presses a preset display area of the first screen (which may be any area of the first screen or a specific area), and whether the trigger threshold of multi-screen interaction (i.e., the preset pressure threshold) has been reached is determined from the user's pressing force. If the pressure value does not reach the preset pressure threshold, the press is treated as a normal screen operation; when the pressure value is greater than or equal to the preset pressure threshold, the terminal is controlled to enter the multi-screen interaction mode, and in this mode the user's movement input on a target object on the first screen is received.
In short, when the user presses firmly in the preset display area of the first screen, the terminal enters the multi-screen interaction mode; in this mode the user's movement input on a target object on the first screen is received, and the target object can be placed on the second screen automatically, which is both intelligent and convenient.
Optionally, the preset pressure threshold is different from a target pressure threshold for triggering the 3D touch function to be started.
Specifically, if the first screen of the terminal supports the 3D touch function, the trigger threshold of the multi-screen interaction mode (the preset pressure threshold) is different from the trigger threshold of the 3D touch function (the target pressure threshold). With different trigger thresholds, when the user presses the first screen firmly, the terminal can distinguish between entering the multi-screen interaction mode and starting the 3D touch function. A minimal sketch of this distinction is shown below.
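A minimal sketch of telling the two pressure thresholds apart, assuming, for illustration only, hypothetical threshold values and that the multi-screen trigger sits above the 3D-touch trigger (the patent only requires that the two thresholds differ):

```kotlin
// Hypothetical pressure thresholds; values and ordering are assumptions.
const val TOUCH_3D_PRESSURE_THRESHOLD = 0.55f     // assumed 3D-touch trigger
const val MULTI_SCREEN_PRESSURE_THRESHOLD = 0.80f // assumed multi-screen trigger

enum class PressAction { NORMAL_OPERATION, TRIGGER_3D_TOUCH, ENTER_MULTI_SCREEN_MODE }

// Classify a press in the preset display area by its pressure value:
// below both thresholds it is a normal screen operation, otherwise it
// triggers either the 3D touch function or the multi-screen interaction mode.
fun classifyPress(pressure: Float): PressAction = when {
    pressure >= MULTI_SCREEN_PRESSURE_THRESHOLD -> PressAction.ENTER_MULTI_SCREEN_MODE
    pressure >= TOUCH_3D_PRESSURE_THRESHOLD -> PressAction.TRIGGER_3D_TOUCH
    else -> PressAction.NORMAL_OPERATION
}
```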
Preferably, after the step of receiving a pressing input pressed by a user in a preset display area of the first screen and acquiring a pressure value corresponding to the pressing input, the method further includes:
and outputting prompt information under the condition that the pressure value is greater than or equal to the preset pressure threshold value.
That is to say, when the user's pressing force reaches the preset pressure threshold that triggers multi-screen interaction, prompt information is output so that the user knows the terminal has entered the multi-screen interaction mode.
Further, the output prompt information adopts at least one of the following modes:
highlighting an area associated with the press input;
sending out vibration;
a ring tone is emitted.
For example, if a large application window and 3 small application windows are displayed on the first screen, pressing the large application window may highlight it (e.g., brighten or jitter it), and pressing one of the small application windows may highlight that window instead.
A feedback prompt such as a vibration and/or a ring tone can also be emitted to make the user aware that the terminal has entered the multi-screen interaction mode.
Of course, in some other embodiments of the present invention, the terminal may be controlled to enter the multi-screen interaction mode when a voice message such as "enter multi-screen interaction mode" is received. Alternatively, the user can set how the terminal is triggered into the multi-screen interaction mode according to actual requirements, which is more flexible and convenient; the invention is not limited in this respect.
In some preferred embodiments of the present invention, the parameter of the movement input includes a movement direction of the movement input, and the step of placing the target object on the second screen according to the parameter of the movement input includes:
determining the target position according to the movement direction of the movement input, and placing the target object at the target position.
Take the case where the user operates an application icon on the desktop of the first screen. If, in the dual-screen interaction mode, the user moves an application icon to the left, the second screen searches for a suitable position starting from the left side of its desktop as the target position to hold the application icon moved from the first screen. Alternatively, the user may quickly fling an application icon to the right (satisfying the condition that the movement rate of the movement input exceeds the preset rate), the terminal enters the dual-screen interaction mode, and the second screen searches for a suitable position starting from the right side of its desktop as the target position to hold the application icon moved from the first screen.
Take the case where the user operates an application window in use on the first screen. In the dual-screen interaction mode, if the user moves an application window upwards, the second screen searches for a suitable position starting from the upper side of its desktop as the target position to hold the application window moved from the first screen. Or, if the user moves an application window downwards (satisfying the condition that the movement distance of the movement input exceeds the preset distance), the terminal enters the dual-screen interaction mode and the second screen searches for a suitable position starting from the lower side of its desktop as the target position to hold the application window moved from the first screen.
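One way to derive the coarse movement direction from the gesture is sketched below; the displacement-based rule and the names are assumptions for illustration, not the patent's method:

```kotlin
import kotlin.math.abs

enum class Direction { LEFT, RIGHT, UP, DOWN }

// dx, dy: displacement of the movement input from its start point to its end
// point, in screen coordinates (y grows downward). The dominant axis decides
// whether the move is horizontal or vertical.
fun movementDirection(dx: Float, dy: Float): Direction =
    if (abs(dx) >= abs(dy)) {
        if (dx < 0) Direction.LEFT else Direction.RIGHT
    } else {
        if (dy < 0) Direction.UP else Direction.DOWN
    }
```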
Optionally, the second screen includes a plurality of desktop surfaces;
the step of determining the target position according to the motion direction of the movement input comprises:
determining a target desktop from the plurality of desktops according to the movement direction of the movement input, and determining a target position for placing the target object on the target desktop.
For example, if, in the dual-screen interaction mode, the user moves an application icon toward the left side of the first screen and the second screen includes 3 desktops, the leftmost of the 3 desktops is determined to be the target desktop, and the application icon is placed at a target position on that desktop, such as a blank position on its left side. A minimal sketch of this selection is given below.
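A minimal sketch of picking the target desktop, assuming the second screen exposes its desktop pages as a list ordered left to right; the names used here are illustrative, not from the patent:

```kotlin
// Pick the desktop page that matches the movement direction: a leftward move
// lands on the leftmost page, otherwise on the rightmost page.
fun chooseTargetDesktop(desktops: List<String>, movesLeft: Boolean): String =
    if (movesLeft) desktops.first() else desktops.last()

fun main() {
    val desktops = listOf("desktop-1", "desktop-2", "desktop-3")
    // An icon dragged toward the left lands on the leftmost desktop,
    // as in the example above.
    println(chooseTargetDesktop(desktops, movesLeft = true))   // prints desktop-1
}
```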
Optionally, the step of determining the target position according to the movement direction of the movement input and placing the target object at the target position includes:
if the target object is an application window and there are at least 2 application windows, then, when the movement directions of the user's movement inputs for the respective application windows are different, dividing the second screen into at least 2 display areas, and placing different application windows in different display areas of the at least 2 display areas according to the movement direction of each movement input.
For example, when the user moves application windows toward the upper left, upper right, lower left, and lower right of the first screen, i.e., in 4 directions, the second screen is divided into 4 display regions and one application window is displayed in each display region. The application windows are thus displayed on the second screen in a split-screen fashion, which is faster, more convenient, and more engaging. A minimal sketch of this quadrant mapping follows.
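A minimal sketch of the quadrant mapping, with hypothetical names; the patent does not prescribe how the display areas are computed:

```kotlin
enum class Corner { TOP_LEFT, TOP_RIGHT, BOTTOM_LEFT, BOTTOM_RIGHT }

data class Region(val left: Float, val top: Float, val right: Float, val bottom: Float)

// Map each diagonal movement direction to one quadrant of the second screen
// (width w, height h). With four windows moved in four different directions,
// each window ends up in its own display area.
fun regionFor(corner: Corner, w: Float, h: Float): Region = when (corner) {
    Corner.TOP_LEFT -> Region(0f, 0f, w / 2, h / 2)
    Corner.TOP_RIGHT -> Region(w / 2, 0f, w, h / 2)
    Corner.BOTTOM_LEFT -> Region(0f, h / 2, w / 2, h)
    Corner.BOTTOM_RIGHT -> Region(w / 2, h / 2, w, h)
}
```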
In some other preferred embodiments of the present invention, when the target object is an application window, the application window may also be displayed directly in full screen on the second screen, regardless of the movement direction of the user's movement input on the application window; the invention is not limited in this respect.
In some preferred embodiments of the present invention, the target position comprises: desktop, hidden location, or destination folder.
Specifically, the target object may be directly placed on the desktop of the second screen, or may be hidden in the second screen, or may be placed in a target folder of the second screen for storage.
Preferably, the target location comprises the hidden location;
the step of placing the target object at the target position of the second screen comprises:
hiding the target object on the second screen;
after the step of placing the target object at the target position of the second screen, the method further includes:
and if a triggering operation for triggering the target object to be displayed is received, displaying the target object on the second screen.
That is, the user moves the target object on the first screen and the target object is hidden on the second screen; when a trigger operation for triggering display of the target object is received (for example, a three-finger slide input on the second screen, or voice information whose content is "display hidden content"), the hidden target object is displayed on the second screen. A minimal sketch of this hide-and-reveal behaviour is shown below.
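A minimal sketch using a hypothetical SecondScreen model; hide() stands for placing the target object at the hidden location, and onRevealTrigger() for receiving the trigger operation (for example, a three-finger slide or a "display hidden content" voice command). None of these names come from the patent:

```kotlin
class SecondScreen {
    private val hidden = mutableListOf<String>()
    private val visible = mutableListOf<String>()

    // Place the target object at the hidden location of the second screen.
    fun hide(targetObject: String) {
        hidden += targetObject
    }

    // On receiving the trigger operation, display all hidden target objects.
    fun onRevealTrigger() {
        visible += hidden
        hidden.clear()
    }

    fun displayedObjects(): List<String> = visible
}
```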
Referring to fig. 2, fig. 2 is a schematic structural diagram of a terminal according to a second embodiment of the present invention, where the terminal 20 includes a first screen (not shown) and a second screen (not shown), and further includes:
a first receiving module 21, configured to receive a movement input of a target object displayed on the first screen from a user;
and a placing module 22, configured to place the target object at the target position of the second screen according to parameters of the movement input.
According to the terminal provided by the embodiment of the invention, different screens can interact with each other, the user can transfer content from one display screen to another more conveniently, and the multi-screen terminal is more convenient and versatile to use.
Preferably, the movement input satisfies at least one of the following conditions:
the movement rate of the movement input exceeds a preset rate;
the moving distance of the moving input exceeds a preset distance;
the termination position of the movement input is located at an edge of the first screen.
Preferably, the terminal 20 further includes:
the second receiving module is used for receiving a pressing input pressed by a user in a preset display area of the first screen;
the control module is used for acquiring a pressure value corresponding to the pressing input; and under the condition that the pressure value is greater than or equal to a preset pressure threshold value, controlling the first receiving module to execute the step of receiving the movement input of the user to the target object displayed on the first screen.
Preferably, the terminal 20 further includes:
and the output module is used for outputting prompt information under the condition that the pressure value is greater than or equal to the preset pressure threshold value.
Preferably, the parameter of the movement input includes a movement direction of the movement input, and the placing module 22 is configured to determine the target position according to the movement direction of the movement input, and place the target object at the target position.
Preferably, the second screen comprises a plurality of desktop surfaces;
the placing module 22 is configured to determine a target desktop from the multiple desktops according to the motion direction of the movement input, and determine a target position where the target object is placed on the target desktop.
Preferably, the placing module 22 is configured to, if the target object is an application window and the number of the application windows is at least 2, divide the second screen into at least 2 display areas when the movement directions of the user for the movement inputs of each of the application windows are different, and place different application windows in different display areas of the at least 2 display areas according to the movement direction of each of the movement inputs.
Preferably, the movement input is a drag input or a slide input.
Preferably, the target position includes: desktop, hidden location, or destination folder.
Preferably, the target location comprises the hidden location;
the placing module 22 is configured to hide the target object on the second screen;
the terminal 20 further includes:
and the display module is used for displaying the target object on the second screen if receiving a triggering operation for triggering the display of the target object.
The terminal provided by the embodiment of the present invention can implement each process in the method embodiment corresponding to fig. 1, and is not described herein again to avoid repetition.
Referring to fig. 3, fig. 3 is a schematic structural diagram of a terminal according to a third embodiment of the present invention, where the terminal 30 includes a first screen (not shown) and a second screen (not shown), and further includes a processor 31, a memory 32, and a computer program stored in the memory 32 and operable on the processor 31, where the computer program implements the following steps when executed by the processor 31:
receiving a user's movement input of a target object displayed on the first screen;
and placing the target object at a target position of the second screen according to parameters of the movement input.
According to the terminal provided by the embodiment of the invention, different screens can interact with each other, the user can transfer content from one display screen to another more conveniently, and the multi-screen terminal is more convenient and versatile to use.
Preferably, the movement input satisfies at least one of the following conditions:
the movement rate of the movement input exceeds a preset rate;
the moving distance of the moving input exceeds a preset distance;
the termination position of the movement input is located at an edge of the first screen.
Preferably, the computer program when executed by the processor 31 further implements the steps of:
before the step of receiving the user's movement input for the target object displayed on the first screen, the method further includes:
receiving a pressing input pressed by a user in a preset display area of the first screen;
acquiring a pressure value corresponding to the press input;
and under the condition that the pressure value is greater than or equal to a preset pressure threshold value, executing the step of receiving the movement input of the user to the target object displayed on the first screen.
Preferably, the computer program when executed by the processor 31 further implements the steps of:
after the step of receiving a pressing input pressed by a user in a preset display area of the first screen and acquiring a pressure value corresponding to the pressing input, the method further includes:
and outputting prompt information under the condition that the pressure value is greater than or equal to the preset pressure threshold value.
Preferably, the parameter of the movement input comprises a direction of movement of the movement input, and the computer program, when executed by the processor 31, further implements the steps of:
the step of placing the target object at the target position of the second screen according to the parameters of the movement input includes the following steps:
determining the target position according to the movement direction of the movement input, and placing the target object at the target position.
Preferably, the second screen comprises a plurality of desktop surfaces;
the computer program when executed by the processor 31 may further implement the steps of:
the step of determining the target position according to the motion direction of the movement input comprises:
determining a target desktop from the plurality of desktops according to the movement direction of the movement input, and determining a target position for placing the target object on the target desktop.
Preferably, the computer program when executed by the processor 31 further implements the steps of:
the step of determining the target position according to the movement direction of the movement input and placing the target object at the target position includes:
if the target object is an application window and there are at least 2 application windows, then, when the movement directions of the user's movement inputs for the respective application windows are different, dividing the second screen into at least 2 display areas, and placing different application windows in different display areas of the at least 2 display areas according to the movement direction of each movement input.
Preferably, the movement input is a drag input or a slide input.
Preferably, the target position includes: desktop, hidden location, or destination folder.
Preferably, the target location comprises the hidden location;
the computer program when executed by the processor 31 may further implement the steps of:
the step of placing the target object at the target position of the second screen comprises:
hiding the target object on the second screen;
after the step of placing the target object at the target position of the second screen, the method further includes:
and if a triggering operation for triggering the target object to be displayed is received, displaying the target object on the second screen.
The terminal can realize each process of the display method embodiment, and can achieve the same technical effect, and for avoiding repetition, the details are not repeated here.
Fig. 4 is a schematic diagram of a hardware structure of a terminal for implementing various embodiments of the present invention, where the terminal 100 includes, but is not limited to: radio frequency unit 101, network module 102, audio output unit 103, input unit 104, sensor 105, display unit 106, user input unit 107, interface unit 108, memory 109, processor 110, and power supply 111. Those skilled in the art will appreciate that the terminal configuration shown in fig. 4 is not intended to be limiting, and that the terminal may include more or fewer components than those shown, or some components may be combined, or a different arrangement of components. In the embodiment of the present invention, the terminal includes, but is not limited to, a mobile phone, a tablet computer, a notebook computer, a palm computer, a vehicle-mounted terminal, a wearable device, a pedometer, and the like.
The user input unit 107 is used for receiving the movement input of a user to the target object displayed on the first screen;
and the processor 110 is configured to place the target object at the target position of the second screen according to parameters of the movement input.
According to the terminal provided by the embodiment of the invention, different screens can interact with each other, the user can transfer content from one display screen to another more conveniently, and the multi-screen terminal is more convenient and versatile to use.
It should be understood that, in the embodiment of the present invention, the radio frequency unit 101 may be used for receiving and sending signals during a message transmission or call process, and specifically, after receiving downlink data from a base station, the downlink data is processed by the processor 110; in addition, the uplink data is transmitted to the base station. Typically, radio frequency unit 101 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like. In addition, the radio frequency unit 101 can also communicate with a network and other devices through a wireless communication system.
The terminal provides wireless broadband internet access to the user through the network module 102, such as helping the user send and receive e-mails, browse web pages, access streaming media, and the like.
The audio output unit 103 may convert audio data received by the radio frequency unit 101 or the network module 102 or stored in the memory 109 into an audio signal and output as sound. Also, the audio output unit 103 may also provide audio output related to a specific function performed by the terminal 100 (e.g., a call signal reception sound, a message reception sound, etc.). The audio output unit 103 includes a speaker, a buzzer, a receiver, and the like.
The input unit 104 is used to receive an audio or video signal. The input Unit 104 may include a Graphics Processing Unit (GPU) 1041 and a microphone 1042, and the Graphics processor 1041 processes image data of a still picture or video obtained by an image capturing device (e.g., a camera) in a video capturing mode or an image capturing mode. The processed image frames may be displayed on the display unit 106. The image frames processed by the graphic processor 1041 may be stored in the memory 109 (or other storage medium) or transmitted via the radio frequency unit 101 or the network module 102. The microphone 1042 may receive sound and may be capable of processing such sound into audio data. The processed audio data may be converted into a format output transmittable to a mobile communication base station via the radio frequency unit 101 in case of a phone call mode.
The terminal 100 also includes at least one sensor 105, such as a light sensor, motion sensor, and other sensors. Specifically, the light sensor includes an ambient light sensor that can adjust the brightness of the display panel 1061 according to the brightness of ambient light, and a proximity sensor that can turn off the display panel 1061 and/or a backlight when the terminal 100 is moved to the ear. As one of the motion sensors, the accelerometer sensor can detect the magnitude of acceleration in each direction (generally three axes), detect the magnitude and direction of gravity when stationary, and can be used to identify the terminal posture (such as horizontal and vertical screen switching, related games, magnetometer posture calibration), vibration identification related functions (such as pedometer, tapping), and the like; the sensors 105 may also include fingerprint sensors, pressure sensors, iris sensors, molecular sensors, gyroscopes, barometers, hygrometers, thermometers, infrared sensors, etc., which are not described in detail herein.
The display unit 106 is used to display information input by a user or information provided to the user. The Display unit 106 may include a Display panel 1061, and the Display panel 1061 may be configured in the form of a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED), or the like.
The user input unit 107 may be used to receive input numeric or character information and generate key signal inputs related to user settings and function control of the terminal. Specifically, the user input unit 107 includes a touch panel 1071 and other input devices 1072. Touch panel 1071, also referred to as a touch screen, may collect touch operations by a user on or near the touch panel 1071 (e.g., operations by a user on or near touch panel 1071 using a finger, stylus, or any suitable object or attachment). The touch panel 1071 may include two parts of a touch detection device and a touch controller. The touch detection device detects the touch direction of a user, detects a signal brought by touch operation and transmits the signal to the touch controller; the touch controller receives touch information from the touch sensing device, converts the touch information into touch point coordinates, sends the touch point coordinates to the processor 110, and receives and executes commands sent by the processor 110. In addition, the touch panel 1071 may be implemented in various types, such as a resistive type, a capacitive type, an infrared ray, and a surface acoustic wave. In addition to the touch panel 1071, the user input unit 107 may include other input devices 1072. Specifically, other input devices 1072 may include, but are not limited to, a physical keyboard, function keys (e.g., volume control keys, switch keys, etc.), a trackball, a mouse, and a joystick, which are not described in detail herein.
Further, the touch panel 1071 may be overlaid on the display panel 1061, and when the touch panel 1071 detects a touch operation thereon or nearby, the touch panel 1071 transmits the touch operation to the processor 110 to determine the type of the touch event, and then the processor 110 provides a corresponding visual output on the display panel 1061 according to the type of the touch event. Although in fig. 4, the touch panel 1071 and the display panel 1061 are two independent components to implement the input and output functions of the terminal, in some embodiments, the touch panel 1071 and the display panel 1061 may be integrated to implement the input and output functions of the terminal, and is not limited herein.
The interface unit 108 is an interface for connecting an external device to the terminal 100. For example, the external device may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like. The interface unit 108 may be used to receive input (e.g., data information, power, etc.) from an external device and transmit the received input to one or more elements within the terminal 100 or may be used to transmit data between the terminal 100 and the external device.
The memory 109 may be used to store software programs as well as various data. The memory 109 may mainly include a storage program area and a storage data area, wherein the storage program area may store an operating system, an application program required by at least one function (such as a sound playing function, an image playing function, etc.), and the like; the storage data area may store data (such as audio data, a phonebook, etc.) created according to the use of the cellular phone, and the like. Further, the memory 109 may include high speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other volatile solid state storage device.
The processor 110 is a control center of the terminal, connects various parts of the entire terminal using various interfaces and lines, and performs various functions of the terminal and processes data by operating or executing software programs and/or modules stored in the memory 109 and calling data stored in the memory 109, thereby performing overall monitoring of the terminal. Processor 110 may include one or more processing units; preferably, the processor 110 may integrate an application processor, which mainly handles operating systems, user interfaces, application programs, etc., and a modem processor, which mainly handles wireless communications. It will be appreciated that the modem processor described above may not be integrated into the processor 110.
The terminal 100 may further include a power supply 111 (e.g., a battery) for supplying power to various components, and preferably, the power supply 111 may be logically connected to the processor 110 through a power management system, so as to manage charging, discharging, and power consumption management functions through the power management system.
In addition, the terminal 100 includes some functional modules that are not shown, and thus, the detailed description thereof is omitted.
The embodiment of the present invention further provides a computer-readable storage medium, where a computer program is stored on the computer-readable storage medium, and when the computer program is executed by a processor, the computer program implements each process of the display method embodiment, and can achieve the same technical effect, and in order to avoid repetition, details are not repeated here. The computer-readable storage medium may be a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solutions of the present invention may be embodied in the form of a software product, which is stored in a storage medium (such as ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal (such as a mobile phone, a computer, a server, an air conditioner, or a network device) to execute the method according to the embodiments of the present invention.
While the present invention has been described with reference to the embodiments shown in the drawings, the present invention is not limited to the embodiments, which are illustrative and not restrictive, and it will be apparent to those skilled in the art that various changes and modifications can be made therein without departing from the spirit and scope of the invention as defined in the appended claims.

Claims (13)

1. A display method is applied to a terminal, the terminal comprises a first screen and a second screen, and the display method is characterized by comprising the following steps:
receiving a user's movement input of a target object displayed on the first screen;
placing the target object at a target position of the second screen according to parameters of the movement input;
wherein the parameters of the movement input comprise a movement direction of the movement input, and the step of placing the target object at the target position of the second screen according to the parameters of the movement input comprises:
determining the target position according to the movement direction of the movement input, and placing the target object at the target position;
wherein the step of determining the target position according to the movement direction of the movement input and placing the target object at the target position comprises:
if the target object is an application window and there are at least 2 application windows, then, when the movement directions of the user's movement inputs for the respective application windows are different, dividing the second screen into at least 2 display areas, and placing different application windows in different display areas of the at least 2 display areas according to the movement direction of each movement input.
2. The display method according to claim 1, wherein the movement input satisfies at least one of the following conditions:
the movement rate of the movement input exceeds a preset rate;
the moving distance of the moving input exceeds a preset distance;
the termination position of the movement input is located at an edge of the first screen.
3. The display method according to claim 1,
before the step of receiving the user's movement input for the target object displayed on the first screen, the method further includes:
receiving a pressing input pressed by a user in a preset display area of the first screen;
acquiring a pressure value corresponding to the press input;
and under the condition that the pressure value is greater than or equal to a preset pressure threshold value, executing the step of receiving the movement input of the user to the target object displayed on the first screen.
4. The display method according to claim 1,
the second screen comprises a plurality of desktop surfaces;
the step of determining the target position according to the motion direction of the movement input comprises:
determining a target desktop from the plurality of desktops according to the movement direction of the movement input, and determining a target position for placing the target object on the target desktop.
5. The display method according to claim 1, wherein the target position includes: desktop, hidden location, or destination folder.
6. The display method according to claim 5,
the target location comprises the hidden location;
the step of placing the target object at the target position of the second screen comprises:
hiding the target object on the second screen;
after the step of placing the target object at the target position of the second screen, the method further includes:
and if a triggering operation for triggering the target object to be displayed is received, displaying the target object on the second screen.
7. A terminal, includes first screen and second screen, its characterized in that still includes:
the first receiving module is used for receiving the movement input of a user to the target object displayed on the first screen;
a placing module, configured to place the target object at a target position of the second screen according to parameters of the movement input;
wherein the parameters of the movement input comprise a movement direction of the movement input, and the placing module is configured to determine the target position according to the movement direction of the movement input and place the target object at the target position;
the placing module is configured to, if the target object is an application window and the number of the application windows is at least 2, divide the second screen into at least 2 display areas if the movement directions of the user for the movement inputs of each application window are different, and place different application windows in different display areas of the at least 2 display areas according to the movement direction of each movement input.
8. The terminal of claim 7, wherein the movement input satisfies at least one of the following conditions:
the movement rate of the movement input exceeds a preset rate;
the moving distance of the moving input exceeds a preset distance;
the termination position of the movement input is located at an edge of the first screen.
9. The terminal of claim 7, further comprising:
the second receiving module is used for receiving a pressing input pressed by a user in a preset display area of the first screen;
the control module is used for acquiring a pressure value corresponding to the pressing input; and under the condition that the pressure value is greater than or equal to a preset pressure threshold value, controlling the first receiving module to execute the step of receiving the movement input of the user to the target object displayed on the first screen.
10. The terminal of claim 7,
the second screen comprises a plurality of desktop surfaces;
the placing module is further configured to determine a target desktop from the plurality of desktops according to the movement direction of the movement input, and determine a target position where the target object is placed on the target desktop.
11. The terminal of claim 7, wherein the target location comprises: desktop, hidden location, or destination folder.
12. The terminal of claim 11,
the target location comprises the hidden location;
the placing module is used for hiding the target object on the second screen;
the terminal further comprises:
and the display module is used for displaying the target object on the second screen if receiving a triggering operation for triggering the display of the target object.
13. A terminal comprising a first screen and a second screen, characterized by comprising a processor, a memory and a computer program stored on the memory and executable on the processor, the computer program, when executed by the processor, implementing the steps of the display method according to any one of claims 1 to 6.
CN201811613851.8A 2018-12-27 2018-12-27 Display method and terminal Active CN109710130B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811613851.8A CN109710130B (en) 2018-12-27 2018-12-27 Display method and terminal

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201811613851.8A CN109710130B (en) 2018-12-27 2018-12-27 Display method and terminal

Publications (2)

Publication Number Publication Date
CN109710130A CN109710130A (en) 2019-05-03
CN109710130B true CN109710130B (en) 2020-11-17

Family

ID=66258687

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811613851.8A Active CN109710130B (en) 2018-12-27 2018-12-27 Display method and terminal

Country Status (1)

Country Link
CN (1) CN109710130B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110888581A (en) * 2019-10-11 2020-03-17 广州视源电子科技股份有限公司 Element transfer method, device, equipment and storage medium
CN110837331B (en) * 2019-11-04 2021-07-09 网易(杭州)网络有限公司 Method, system and medium for moving operation object based on multiple display screens
CN111078091A (en) * 2019-11-29 2020-04-28 华为技术有限公司 Split screen display processing method and device and electronic equipment
CN112416230B (en) * 2020-11-26 2022-04-15 维沃移动通信有限公司 Object processing method and device

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120225694A1 (en) * 2010-10-01 2012-09-06 Sanjiv Sirpal Windows position control for phone applications

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105867531A (en) * 2011-02-10 2016-08-17 三星电子株式会社 Portable device comprising touch-screen display, and method for controlling same
CN106843730A (en) * 2017-01-19 2017-06-13 宇龙计算机通信科技(深圳)有限公司 A kind of mobile terminal double screen changing method and mobile terminal
CN107340948A (en) * 2017-06-27 2017-11-10 维沃移动通信有限公司 A kind of video playing control method and mobile terminal
CN107678664A (en) * 2017-08-28 2018-02-09 中兴通讯股份有限公司 A kind of terminal interface switching, the method, apparatus and terminal of gesture processing

Also Published As

Publication number Publication date
CN109710130A (en) 2019-05-03


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant