CN109491631B - Display control method and terminal - Google Patents


Info

Publication number: CN109491631B
Application number: CN201811278017.8A
Authority: CN (China)
Legal status: Active (granted)
Prior art keywords: input, screen, display, target object, touch
Original language: Chinese (zh)
Other versions: CN109491631A (application publication)
Inventor: 史玲玲
Original and current assignee: Vivo Mobile Communication Co Ltd
Application filed by Vivo Mobile Communication Co Ltd
Priority to CN201811278017.8A; publication of CN109491631A; application granted; publication of CN109491631B

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/14 Digital output to display device; Cooperation and interconnection of the display device with other functional units
    • G06F 3/1423 Digital output to display device; Cooperation and interconnection of the display device with other functional units controlling a plurality of local displays, e.g. CRT and flat panel display
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention provides a display control method and a terminal. The method includes: receiving a user's first input on a target object displayed in a first screen; selecting the target object in response to the first input; receiving the user's second input on a second screen; and displaying the target object in a target area of the second screen in response to the second input, where the target area is part or all of the display area of the second screen. With the display control method provided by the embodiment of the invention, displaying an object from one screen on another screen is quick and convenient, saving the user time.

Description

Display control method and terminal
Technical Field
The embodiment of the invention relates to the technical field of communication, in particular to a display control method and a terminal.
Background
With the rapid development of electronic technology, terminals such as smartphones and tablet computers have become increasingly popular and are now indispensable tools in daily life. To meet users' demands on terminal display functions, multi-screen terminals comprising at least two screens have emerged; the different screens of a multi-screen terminal can display the same or different content, improving the display effect and the user experience of the terminal.
However, on current multi-screen terminals, displaying an object shown in one screen (such as an icon, a picture, or a display page of an application) in another screen is cumbersome. For example, if a user wants to display in the second screen an application page currently shown in the first screen, the user has to find the corresponding application icon among the desktop icons on the second screen and tap it to run the application, so that its page is displayed in the second screen. This is complicated and time-consuming, especially when there are many desktop icons.
Therefore, on a conventional multi-screen terminal, displaying an object shown in one screen in another screen is cumbersome and time-consuming.
Disclosure of Invention
The embodiment of the invention provides a display control method and a terminal, aiming to solve the problem that, on a conventional multi-screen terminal, displaying an object shown in one screen in another screen is cumbersome and time-consuming.
In order to solve the technical problem, the invention is realized as follows:
in a first aspect, an embodiment of the present invention provides a display control method, which is applied to a terminal including a first screen and a second screen, and the method includes:
receiving a first input of a user to a target object displayed in the first screen;
selecting the target object in response to the first input;
receiving a second input of the user on the second screen;
displaying the target object in a target area in the second screen in response to the second input;
where the target area is part or all of the display area of the second screen.
In a second aspect, an embodiment of the present invention further provides a terminal including a first screen and a second screen, the terminal comprising:
a first input module, configured to receive a first input of a user on a target object displayed in the first screen;
the selecting module is used for responding to the first input received by the first input module and selecting the target object;
the second input module is used for receiving a second input of the user on the second screen;
a display module, configured to display the target object in a target area in the second screen in response to a second input received by the second input module;
the target area is a part of or the whole display area of the second screen.
In a third aspect, an embodiment of the present invention further provides a terminal, including a processor, a memory, and a computer program stored in the memory and capable of running on the processor, where the computer program, when executed by the processor, implements the steps of the display control method.
In a fourth aspect, an embodiment of the present invention further provides a computer-readable storage medium, where a computer program is stored, and when the computer program is executed by a processor, the computer program implements the steps of the display control method.
In the embodiment of the invention, a first input of a user on a target object displayed in a first screen is received; the target object is selected in response to the first input; a second input of the user on a second screen is received; and the target object is displayed in a target area of the second screen in response to the second input, the target area being part or all of the display area of the second screen. In this way, when the terminal displays an object from one screen on another screen, the operation is quick and convenient, saving the user time.
Drawings
Fig. 1 is a schematic flowchart of a display control method according to an embodiment of the present invention;
fig. 2 is one of schematic diagrams of a display interface of a terminal according to an embodiment of the present invention;
fig. 3 is a second schematic diagram of a display interface of the terminal according to the embodiment of the present invention;
fig. 4 is a third schematic diagram of a display interface of the terminal according to the embodiment of the present invention;
fig. 5 is a fourth schematic diagram of a display interface of a terminal according to an embodiment of the present invention;
fig. 6 is a fifth schematic view of a display interface of a terminal according to an embodiment of the present invention;
fig. 7 is a sixth schematic view of a display interface of a terminal according to an embodiment of the present invention;
fig. 8 is a seventh schematic diagram of a display interface of a terminal according to an embodiment of the present invention;
fig. 9 is a schematic structural diagram of a terminal according to an embodiment of the present invention;
fig. 10 is a schematic diagram of a hardware structure of a terminal according to an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, not all, embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Referring to fig. 1, fig. 1 is a schematic flowchart of a display control method according to an embodiment of the present invention, where the display control method is applied to a terminal including a first screen and a second screen, and as shown in fig. 1, the method includes the following steps:
step 101, receiving a first input of a user to a target object displayed in a first screen;
step 102, responding to a first input, and selecting the target object;
step 103, receiving a second input of the user on the second screen;
step 104, responding to a second input, and displaying a target object in a target area in a second screen;
the target area is a part of or the whole display area of the second screen.
In step 101, when a target object is displayed in the first screen of the terminal and the user needs to display that object in the second screen, the user may perform a first input for selecting the target object, and the terminal receives this first input.
The target object may be any object that can be displayed in the first screen and the second screen, and the target object includes at least one of an icon, a control, an image, a display page of an application program, and the like.
In addition, the first input may be any input for selecting the target object, and may be at least one of a voice input, a touch input, and a motion sensing input. The touch input may include, but is not limited to: a press operation, a slide operation, a click operation, a double-click operation, a multi-click operation, and the like.
For example: in the case that the display page of the application program 1 is displayed in the first screen, the first input may be a pressing operation input by the user on the display page of the application program 1 in the first screen, where the pressing force or pressing duration exceeds a preset threshold; or a sliding operation in which the user slides the display page of the application program 1 from the first screen onto the second screen; or a voice input of "select the display page of the application 1".
It should be noted that the first input may be a touch input by one finger of the user, or may be a touch input performed by a plurality of fingers simultaneously, for example: the first input includes a touch sub-input of a first finger and a touch sub-input of a second finger, as shown in fig. 2, the terminal includes a screen 21 and a screen 22, and the first input is: the user's finger 23 and finger 24 touch different positions in the screen 21, respectively, and the finger 23 and the finger 24 slide toward each other, i.e., the first input is a pinch operation of the finger 23 and the finger 24.
It should be noted that, before the step 101, only one object may be displayed in the first screen of the terminal, for example, a display page of one application program is displayed in a full screen; alternatively, a plurality of objects may be simultaneously displayed in the first screen, such as: the first screen is divided into a plurality of sub-screens by a screen division technology, and each sub-screen displays a display page of an application program; or the first screen is in an image browsing mode, that is, thumbnails of a plurality of images can be displayed in the first screen at the same time; alternatively, a plurality of icons are displayed in a plurality of display areas in the first screen, and so on.
Of course, in the case where a plurality of objects are simultaneously displayed in the first screen, the terminal may select a target object among the plurality of objects according to the above-described first input of the user.
In step 102, after the terminal receives the first input to the target object, the terminal may select the target object in response to the first input.
The selecting the target object in response to the first input may be that the terminal records information of the selected target object in a background process.
Or, optionally, the step 102 includes:
displaying the selected identification on the target object;
adjusting the target object to a preset display color;
adjusting the display position of the target object or the area of the display area;
and displaying the preset masking layer on the target object.
In this manner, when the terminal selects the target object in response to the first input, the selection state of the target object is shown to the user visually, so that the user knows in time whether the target object has been successfully selected.
For example: as shown in fig. 2, in the case where the application interface of the application program 1 is displayed full screen in the screen 21, when the terminal receives a pinch-in input by the user in the screen 21, the terminal may set a mask layer on the display page of the application program 1 so that the user knows that the display page of the application program 1 has been selected, as shown in fig. 3.
It should be noted that, after the step 102 is performed, that is, after the terminal selects the target object in the first screen, the display of the target object may be continuously maintained in the first screen, or the terminal exits the target object in the first screen, that is, returns to the desktop of the first screen.
Optionally, after the step 102, the method further includes:
under the condition that a third input of the user to the target control is received, or under the condition that the input of the user on the first screen is not received within a preset time length after the second input is received, quitting the display of the target object on the first screen;
the target control is a control displayed in the first screen after the target object is selected by the terminal.
Here, after selecting the target object in the first screen, the terminal may exit the display of the target object in the first screen in response to a third input of the user on the target control, or when the user has not provided any input on the first screen for a long time. For example: the first screen may change to display the desktop, or the display of the first screen may be turned off, thereby reducing the power consumed by continuously displaying the target object in the first screen, saving power, and improving the battery endurance of the terminal.
For example: as shown in fig. 3, in the case that the terminal sets a mask layer on the display page of the application program 1, the terminal may further display an "exit screen" control 31 and a "keep screen" control 32 over the display page of the application program 1 in a pop-up frame. If the terminal receives an operation of the user clicking the "exit screen" control 31, the terminal exits the display of the display page of the application program 1 on the first screen and returns to the desktop of the first screen; if the terminal receives an operation of the user clicking the "keep screen" control 32, the terminal continues to display the display page of the application 1 in the first screen.
In step 103, after the terminal selects the target object in the first screen, when the user inputs a second input indicating that the selected target object is displayed on the second screen, the terminal may receive the second input.
The second input may be any input for instructing the selected target object to be displayed on the second screen, and may be at least one of a voice input, a touch input, and a motion sensing input. The touch input may include, but is not limited to: a press operation, a slide operation, a click operation, a double-click operation, a multi-click operation, and the like.
For example: the second input may be a pressing operation on the second screen whose pressing force or duration exceeds a preset threshold; or a sliding input on the second screen whose sliding track matches a preset track; or the first finger and the second finger of the user touching the second screen at the same time, as shown in fig. 4, and so on.
It should be noted that, after the target object in the first screen is selected in step 102, the display control method may further include:
If the second input is not received within a preset time after the target object is selected, or if a fourth input of the user on the target object is received, the selection of the target object is canceled, so that a long-standing selection does not interfere with the user's other operations.
The fourth input may be any input for canceling the selected target object, and may be a voice input, a touch input, or a gesture input, and the like, which are not described herein again.
In the above step 104, after the terminal receives the second input, the terminal may display the target object in the target area in the second screen in response to the second input.
The target area may be a preset area in the second screen, such as a quarter display area at a fixed position in the second screen. Alternatively, the target area may be a display area determined by the terminal according to the input parameters of the second input, that is, the target area is associated with those parameters. For example: the second input is a sliding operation, and the terminal is preset with correspondences between different sliding tracks and display areas at different positions, e.g. the sliding track "L" corresponds to the quarter display area in the upper left corner of the second screen, and the sliding track "R" corresponds to the quarter display area in the upper right corner. When the sliding track of the second input is "L", the target area is the quarter display area in the upper left corner of the second screen; similarly, when the sliding track of the second input is "R", the target area is the quarter display area in the upper right corner of the second screen.
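The described correspondence between preset sliding tracks and display areas can be sketched as a simple lookup table. The track names and region keys below are assumptions for illustration:

```python
# Hypothetical mapping from a recognized slide-track shape to a preset
# display area of the second screen (the patent's "L" / "R" example).
TRACK_TO_REGION = {
    "L": "upper_left_quarter",
    "R": "upper_right_quarter",
}


def target_region(track, default="preset_fixed_quarter"):
    # Fall back to a fixed preset area when the track is not recognized.
    return TRACK_TO_REGION.get(track, default)
```

A real implementation would first classify the raw touch trajectory into one of the track shapes before this lookup.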
Optionally, step 104 includes:
acquiring input parameters of a second input;
and displaying the target object in a display area of the second screen, wherein the display parameter corresponds to the input parameter, and the display parameter comprises at least one of a display position and an area size.
Here, the terminal displays the target object in a target area of the second screen whose display parameters correspond to the input parameters of the second input, so that the target object can be displayed at the display position and size desired by the user.
For example: when the second input is a pressing operation, the terminal can acquire the pressing position and pressing force of the second input on the second screen and, according to a preset linear relation between pressing force and area size, display the target object in a display area whose size corresponds to the pressing force and whose center point is the pressing position. Alternatively, when the second input is a sliding operation whose sliding track encloses a closed region, the terminal obtains the track information of the sliding operation on the second screen and displays the target object in the display area corresponding to the enclosed region, and so on.
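The press-based case above can be sketched as follows. The linear coefficient, the clamp bounds, and the aspect ratio are illustrative constants, not values from the patent:

```python
def area_from_press(press_force, k=40000.0, min_area=10000.0, max_area=250000.0):
    """Preset linear relation between pressing force and display area (px^2),
    clamped to a sensible range. k and the bounds are illustrative."""
    return max(min_area, min(max_area, k * press_force))


def region_from_press(press_pos, press_force, aspect=4 / 3):
    """Rectangle of the computed area, centered on the pressing position.
    Returns (left, top, width, height)."""
    area = area_from_press(press_force)
    w = (area * aspect) ** 0.5
    h = area / w
    cx, cy = press_pos
    return (cx - w / 2, cy - h / 2, w, h)
```

A harder press thus yields a proportionally larger display area, up to the clamp limit.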
It should be noted that, when the second input is a touch input, the second input may be a touch input of one finger of the user, or may be a touch input of a plurality of fingers, and the second input is not limited herein.
Optionally, the second input includes a first touch sub-input of a first finger of the user and a second touch sub-input of a second finger;
the above displaying the target object in the target area where the display parameter in the second screen corresponds to the input parameter based on the input parameter includes:
acquiring a first touch position of the first touch sub input and a second touch position of the second touch sub input;
displaying a target object in an area including a first touch position and a second touch position on a second screen;
the input parameters comprise position parameters of a first touch position and position parameters of a second touch position.
Here, the terminal may display the target object in the area of the second screen that contains the touch positions of the two fingers. The position and size of the target area, i.e. the display position and area size of the target object, can thus be determined according to the user's needs, so the terminal can display the target object on the second screen more flexibly. In addition, compared with a touch sub-input of a single finger, the two-finger input is more deliberate, which reduces the probability of accidental operation.
The target object is displayed in a display area of the second screen that contains the first touch position and the second touch position, and the size and position of that area are determined by the position parameters of the two touch positions. For example: the display area may be a circular area whose center is the first touch position and whose circumference passes through the second touch position; alternatively, it may be a circular area whose diameter is the segment between the first touch position and the second touch position, and so on.
Optionally, the displaying the target object in the display area including the first touch position and the second touch position on the second screen includes:
displaying a target object in a target rectangular area in the second screen;
the target rectangular area takes the first touch position and the second touch position as diagonal vertices.
Here, the terminal may display the target object in a rectangular area with the first touch position and the second touch position as diagonal vertices, thereby further improving the display effect of the terminal.
For example: as shown in fig. 4, the second input includes a touch sub-input in which the finger 41 and the finger 42 of the user touch the second screen at the same time, and the terminal may display the display page of the application 1 in the rectangular display area 43 having the touch position of the finger 41 and the touch position of the finger 42 as diagonal vertices in the second screen.
Of course, in some embodiments, the target object may also be displayed in other rectangular areas, such as: the first touch position and the second touch position are respectively distributed on a diagonal of the rectangular area displaying the target object, and the length of the diagonal may be determined according to the distance between the first touch position and the second touch position, and so on, which are not limited herein.
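Deriving the rectangle whose diagonal vertices are the two touch points is a small geometric step; a minimal sketch:

```python
def rect_from_diagonal(p1, p2):
    """Axis-aligned rectangle whose diagonal vertices are the two touch
    points, returned as (left, top, width, height)."""
    (x1, y1), (x2, y2) = p1, p2
    return (min(x1, x2), min(y1, y2), abs(x1 - x2), abs(y1 - y2))
```

The min/abs form makes the result independent of which finger is treated as the first touch position.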
It should be noted that, in step 104, after the terminal places the target object in the second screen, the display size, display position, and the like of the target object may be fixed.
Optionally, after the target object is displayed in the target area in the second screen, the method further includes:
and updating the display state of the target object in the second screen in response to the first touch sub-input and the second touch sub-input.
Here, the terminal may update the display state of the target object according to the touch sub-inputs of the user's two fingers, so that the display state of the target object can be adjusted according to the user's needs.
Updating the display state of the target object in the second screen in response to the first touch sub-input and the second touch sub-input may mean adjusting the display state of the target object to a state corresponding to the touch force or touch duration of the two sub-inputs, i.e. the first touch sub-input and the second touch sub-input are press inputs.
Optionally, the first touch sub-input and the second touch sub-input are sliding sub-inputs that move toward or away from each other;
in response to the first touch sub-input and the second touch sub-input, updating a display state of a target object in the second screen, including:
and updating the display state of the target object in the second screen according to the real-time distance between the touch positions of the first finger and the second finger on the second screen.
Here, when the first touch sub-input and the second touch sub-input slide toward or away from each other, the terminal may update the display state of the target object according to the real-time distance between the touch positions of the first finger and the second finger, so that the target object can be adjusted in real time, making the user's operation more convenient and time-saving.
Updating the display state of the target object in the second screen according to the real-time distance between the touch positions of the first finger and the second finger may work as follows: when the two sub-inputs slide toward each other, that is, the real-time distance between the touch positions decreases, the display size of the target object is reduced; when they slide away from each other, that is, the real-time distance increases, the display size of the target object is enlarged.
For example: when the terminal receives a second input in which the finger 41 and the finger 42 of the user touch the second screen at the same time as shown in fig. 4, and the terminal displays the display page of the application 1 in the rectangular display area 43, then if the finger 41 and the finger 42 slide away from each other on the second screen, that is, as shown in fig. 5, the real-time distance between their touch positions becomes larger, the terminal enlarges the display page of the application 1; if the finger 41 and the finger 42 slide toward each other, that is, the real-time distance between their touch positions becomes smaller, the terminal shrinks the display page of the application 1.
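One natural way to realize this pinch-based resizing is to scale the page by the ratio of the current finger distance to the initial one. The proportional rule is an assumption for illustration; the patent only requires that the size grow and shrink with the distance:

```python
def rescaled(base_w, base_h, initial_distance, current_distance):
    """Scale the displayed page by the ratio of the current distance
    between the two fingers to the distance at touch-down."""
    s = current_distance / initial_distance
    return (base_w * s, base_h * s)
```

Fingers moving apart (ratio above 1) enlarge the page; fingers moving together shrink it, matching the fig. 5 behavior.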
The updated display state of the target object may be a preset display state corresponding to the real-time distance, or the terminal may update the display state of the target object according to another update rule, which is not limited herein.
Optionally, the updating the display state of the target object in the second screen according to the real-time distance between the touch positions of the first finger and the second finger on the second screen includes:
under the condition that the real-time distance between the touch positions of the first finger and the second finger on the second screen is smaller than a first threshold value, displaying the target object on the second screen in a floating window mode;
under the condition that the real-time distance between the touch positions of the first finger and the second finger on the second screen is larger than a second threshold value, displaying the target object on the second screen in a full screen mode;
and under the condition that the real-time distance between the touch positions of the first finger and the second finger on the second screen is greater than or equal to a first threshold value and less than or equal to a second threshold value, displaying the target object in a target sub-display area on the second screen.
Here, the terminal compares the real-time distance between the touch positions of the first finger and the second finger with the preset thresholds and selects the corresponding display state, namely display in a floating window, full-screen display, or display in a target sub-display area, so that the terminal can present the target object in the second screen in different display states, improving the display effect of the terminal.
Wherein, the target sub-display area is a preset area on the second screen, for example: which may be one-half of the display area on the second screen, or one-quarter of the display area, etc., and is not limited herein.
For example: during the process in which the finger 41 and the finger 42 of the user slide away from each other as shown in fig. 5, if the real-time distance between the touch positions of the finger 41 and the finger 42 is smaller than the first threshold, the display page of the application program 1 is displayed in a floating window whose size corresponds to the real-time distance, as shown in fig. 6; if the real-time distance is greater than the second threshold, the display page of the application program 1 is displayed full screen on the second screen, as shown in fig. 7; if the real-time distance is greater than or equal to the first threshold and less than or equal to the second threshold, the display page of the application 1 is displayed in the display area 81. As shown in fig. 8, the display area of the second screen is then divided into the display area 81 and the display area 82, and the display page of the application 2 can be displayed in the display area 82.
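The three-way threshold comparison can be sketched directly; the concrete threshold values below are illustrative, since the patent only requires a first threshold smaller than a second one:

```python
def display_mode(distance, first_threshold=200.0, second_threshold=800.0):
    """Map the real-time distance between the two fingers to one of the
    three display states of the target object on the second screen."""
    if distance < first_threshold:
        return "floating_window"
    if distance > second_threshold:
        return "full_screen"
    return "target_sub_area"  # first_threshold <= distance <= second_threshold
```

Called on every move event, this yields the fig. 6 / fig. 7 / fig. 8 behavior as the pinch distance crosses the thresholds.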
In the embodiment of the invention, a first input of a user on a target object displayed in a first screen is received; the target object is selected in response to the first input; a second input of the user on a second screen is received; and the target object is displayed in a target area in the second screen in response to the second input, the target area being a part or all of the display area of the second screen. Therefore, when the terminal displays an object shown on one screen on another screen, the operation is convenient and fast, and time is saved.
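The two-step flow above can be sketched as a small controller. This is an illustrative model only; the class name, method names, and the string-keyed area map are assumptions, not the patent's implementation.

```python
class DualScreenController:
    """Models the two-step flow: select on the first screen, place on the second."""

    def __init__(self):
        self.selected = None      # target object chosen by the first input
        self.second_screen = {}   # maps a target area to the displayed object

    def on_first_input(self, target_object):
        # First input: select the target object displayed on the first screen.
        self.selected = target_object

    def on_second_input(self, target_area):
        # Second input: display the selected object in a target area of the
        # second screen (a part of, or the whole, display area).
        if self.selected is None:
            return None
        self.second_screen[target_area] = self.selected
        return self.selected

ctrl = DualScreenController()
ctrl.on_first_input("app_1_page")
ctrl.on_second_input("left_half")
```

The point of the split is that the second input alone determines where and how large the target area is, so the user never has to drag the object across the screen boundary.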
Referring to fig. 9, fig. 9 is a schematic structural diagram of a terminal according to an embodiment of the present invention, where the terminal includes a first screen and a second screen, and as shown in fig. 9, the terminal 900 includes:
a first input module 901, configured to receive a first input of a user on a target object displayed in the first screen;
a selecting module 902, configured to select the target object in response to the first input received by the first input module 901;
a second input module 903, configured to receive a second input of the user on the second screen;
a first display module 904, configured to display the target object in a target area in the second screen in response to a second input received by the second input module 903;
the target area is a part of or the whole display area of the second screen.
Optionally, the first display module 904 includes:
an input parameter obtaining submodule, configured to obtain an input parameter of a second input received by the second input module 903;
and the display sub-module is used for displaying the target object in a target area corresponding to the display parameter and the input parameter in the second screen based on the input parameter acquired by the input parameter acquisition sub-module, wherein the display parameter comprises at least one of a display position and an area size.
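As an illustration of deriving display parameters from input parameters when the input parameters are two touch positions: the dictionary shape and key names below are assumptions for the sketch.

```python
def display_parameters(touch1, touch2):
    """Derive display parameters (display position and area size) from the
    input parameters of the second input: two (x, y) touch positions."""
    (x1, y1), (x2, y2) = touch1, touch2
    position = (min(x1, x2), min(y1, y2))   # top-left corner of the area
    size = (abs(x1 - x2), abs(y1 - y2))     # width and height of the area
    return {"display_position": position, "area_size": size}
```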
Optionally, the second input includes a first touch sub-input of a first finger of the user and a second touch sub-input of a second finger;
the display sub-module comprises:
a touch position acquiring unit, configured to acquire a first touch position of the first touch sub-input and a second touch position of the second touch sub-input received by the second input module 903;
a display unit configured to display the target object in a display area of the second screen that includes the first touch position and the second touch position acquired by the touch position acquisition unit;
wherein the input parameters include a position parameter of the first touch position and a position parameter of the second touch position.
Optionally, the display unit includes:
a display subunit, configured to display the target object within a target rectangular area in the second screen;
the target rectangular area takes the first touch position and the second touch position acquired by the touch position acquisition unit as diagonal vertices.
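The diagonal-vertex rule can be sketched as follows; the tuple-based `(left, top, right, bottom)` representation is an assumption for the sketch.

```python
def target_rectangle(touch1, touch2):
    """Rectangle whose diagonal vertices are the two touch positions.

    Returns (left, top, right, bottom), so the result is the same
    regardless of which finger produced which position.
    """
    (x1, y1), (x2, y2) = touch1, touch2
    return (min(x1, x2), min(y1, y2), max(x1, x2), max(y1, y2))
```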
Optionally, the display unit further includes:
a display state updating subunit, configured to update a display state of a target object in the second screen in response to the first touch sub-input and the second touch sub-input received by the second input module 903.
Optionally, the first touch sub-input and the second touch sub-input are opposite or reverse sliding sub-inputs;
the display state updating subunit is specifically configured to:
and updating the display state of the target object in the second screen according to the real-time distance between the touch positions of the first finger and the second finger on the second screen.
Optionally, the display state updating subunit is specifically configured to:
displaying the target object on a second screen in a floating window mode under the condition that the real-time distance between the touch positions of the first finger and the second finger on the second screen is smaller than a first threshold value;
under the condition that the real-time distance between the touch positions of the first finger and the second finger on a second screen is larger than a second threshold value, displaying the target object on the second screen in a full-screen manner;
and under the condition that the real-time distance between the touch positions of the first finger and the second finger on the second screen is greater than or equal to a first threshold value and less than or equal to a second threshold value, displaying the target object in a target sub-display area on the second screen.
Optionally, the selecting module 902 is specifically configured to perform at least one of the following:
displaying the selected identification on the target object;
adjusting the target object to a preset display color;
adjusting the display position of the target object or the area of a display area;
and displaying a preset masking layer on the target object.
The terminal 900 can implement each process implemented by the terminal in the above method embodiments, and is not described here again to avoid repetition.
The terminal 900 of the embodiment of the present invention receives a first input of a user on a target object displayed in a first screen; selects the target object in response to the first input; receives a second input of the user on a second screen; and displays the target object in a target area in the second screen in response to the second input, the target area being a part or all of the display area of the second screen. Therefore, when the terminal displays an object shown on one screen on another screen, the operation is convenient and fast, and time is saved.
Fig. 10 is a schematic diagram of a hardware structure of a terminal for implementing various embodiments of the present invention, where the terminal 1000 includes, but is not limited to: a radio frequency unit 1001, a network module 1002, an audio output unit 1003, an input unit 1004, a sensor 1005, a display unit 1006, a user input unit 1007, an interface unit 1008, a memory 1009, a processor 1010, and a power supply 1011. The display unit 1006 includes a first screen and a second screen. Those skilled in the art will appreciate that the terminal configuration shown in fig. 10 is not intended to be limiting, and that the terminal may include more or fewer components than shown, or some components may be combined, or a different arrangement of components. In the embodiment of the present invention, the terminal includes, but is not limited to, a mobile phone, a tablet computer, a notebook computer, a palm computer, a vehicle-mounted terminal, a wearable device, a pedometer, and the like.
The user input unit 1007 is configured to:
receiving a first input of a user to a target object displayed in the first screen;
receiving a second input of the user on the second screen;
the processor 1010 is configured to:
selecting the target object in response to the first input;
displaying the target object in a target area in the second screen in response to the second input;
and the target area is a part of or the whole display area of the second screen.
Optionally, the processor 1010 is further configured to:
acquiring input parameters of the second input;
displaying the target object in a target area of which the display parameters correspond to the input parameters in the second screen based on the input parameters;
wherein the display parameter comprises at least one of a display position and a region size.
Optionally, the second input includes a first touch sub-input of a first finger of the user and a second touch sub-input of a second finger;
the processor 1010 is further configured to:
acquiring a first touch position of the first touch sub-input and a second touch position of the second touch sub-input;
displaying the target object in a display area of the second screen including the first touch position and the second touch position;
wherein the input parameters include a position parameter of the first touch position and a position parameter of the second touch position.
Optionally, the processor 1010 is further configured to:
displaying the target object in a target rectangular area in the second screen;
and the target rectangular area takes the first touch position and the second touch position as diagonal vertexes.
Optionally, the processor 1010 is further configured to:
updating a display state of a target object in the second screen in response to the first touch sub-input and the second touch sub-input.
Optionally, the first touch sub-input and the second touch sub-input are opposite or reverse sliding sub-inputs;
the processor 1010 is further configured to:
and updating the display state of the target object in the second screen according to the real-time distance between the touch positions of the first finger and the second finger on the second screen.
Optionally, the processor 1010 is further configured to:
displaying the target object on a second screen in a floating window manner under the condition that the real-time distance between the touch positions of the first finger and the second finger on the second screen is smaller than a first threshold value;
displaying the target object on a second screen in a full screen mode under the condition that the real-time distance between the touch positions of the first finger and the second finger on the second screen is larger than a second threshold value;
and under the condition that the real-time distance between the touch positions of the first finger and the second finger on the second screen is greater than or equal to a first threshold value and less than or equal to a second threshold value, displaying the target object in a target sub-display area on the second screen.
Optionally, the processor 1010 is further configured to perform at least one of the following:
displaying a selected identifier on the target object;
adjusting the target object to a preset display color;
adjusting the display position of the target object or the area of a display area;
and displaying a preset masking layer on the target object.
Terminal 1000 can implement each process implemented by the terminal in the foregoing embodiments, and details are not described here to avoid repetition.
Terminal 1000 according to the embodiment of the present invention receives a first input of a user on a target object displayed in a first screen; selects the target object in response to the first input; receives a second input of the user on a second screen; and displays the target object in a target area in the second screen in response to the second input, the target area being a part or all of the display area of the second screen. Therefore, when the terminal displays an object shown on one screen on another screen, the operation is convenient and fast, and time is saved.
It should be understood that, in the embodiment of the present invention, the radio frequency unit 1001 may be used for receiving and sending signals during a message transmission or a call. Specifically, it receives downlink data from a base station and forwards it to the processor 1010 for processing, and it transmits uplink data to the base station. In general, the radio frequency unit 1001 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like. Furthermore, the radio frequency unit 1001 may also communicate with a network and other devices through a wireless communication system.
The terminal provides the user with wireless broadband internet access through the network module 1002, such as helping the user send and receive e-mails, browse webpages, access streaming media, and the like.
The audio output unit 1003 may convert audio data received by the radio frequency unit 1001 or the network module 1002 or stored in the memory 1009 into an audio signal and output as sound. Also, the audio output unit 1003 can provide audio output (e.g., a call signal reception sound, a message reception sound, etc.) related to a specific function performed by the terminal 1000. The audio output unit 1003 includes a speaker, a buzzer, a receiver, and the like.
The input unit 1004 is used to receive an audio or video signal. The input unit 1004 may include a graphics processing unit (GPU) 10041 and a microphone 10042. The graphics processor 10041 processes image data of still pictures or video obtained by an image capturing device (such as a camera) in a video capturing mode or an image capturing mode. The processed image frames may be displayed on the display unit 1006. The image frames processed by the graphics processor 10041 may be stored in the memory 1009 (or other storage medium) or transmitted via the radio frequency unit 1001 or the network module 1002. The microphone 10042 can receive sound and process it into audio data. In the case of a phone call mode, the processed audio data may be converted into a format transmittable to a mobile communication base station via the radio frequency unit 1001 and then output.
Terminal 1000 can also include at least one sensor 1005 such as a light sensor, motion sensor, and other sensors. Specifically, the light sensor includes an ambient light sensor that can adjust the brightness of the display panel 10061 according to the brightness of ambient light, and a proximity sensor that can turn off the display panel 10061 and/or a backlight when the terminal 1000 moves to the ear. As one of the motion sensors, the accelerometer sensor can detect the magnitude of acceleration in each direction (generally three axes), detect the magnitude and direction of gravity when stationary, and can be used to identify the terminal posture (such as horizontal and vertical screen switching, related games, magnetometer posture calibration), vibration identification related functions (such as pedometer, tapping), and the like; the sensors 1005 may also include a fingerprint sensor, a pressure sensor, an iris sensor, a molecular sensor, a gyroscope, a barometer, a hygrometer, a thermometer, an infrared sensor, etc., which will not be described in detail herein.
The display unit 1006 is used to display information input by the user or information provided to the user. The Display unit 1006 may include a Display panel 10061, and the Display panel 10061 may be configured in the form of a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED), or the like.
The user input unit 1007 may be used to receive input numeric or character information and to generate key signal inputs related to user settings and function control of the terminal. Specifically, the user input unit 1007 includes a touch panel 10071 and other input devices 10072. The touch panel 10071, also referred to as a touch screen, may collect touch operations by a user on or near it (e.g., operations performed on or near the touch panel 10071 using a finger, a stylus, or any other suitable object or attachment). The touch panel 10071 may include two portions: a touch detection device and a touch processor. The touch detection device detects the user's touch position, detects the signal generated by the touch operation, and transmits the signal to the touch processor; the touch processor receives the touch information from the touch detection device, converts it into touch point coordinates, sends the coordinates to the processor 1010, and receives and executes commands sent by the processor 1010. In addition, the touch panel 10071 may be implemented in various types, such as resistive, capacitive, infrared, and surface acoustic wave types. Besides the touch panel 10071, the user input unit 1007 may include other input devices 10072, which may include, but are not limited to, a physical keyboard, function keys (such as volume control keys and switch keys), a track ball, a mouse, and a joystick; these are not described here again.
Further, the touch panel 10071 can be overlaid on the display panel 10061, and when the touch panel 10071 detects a touch operation thereon or nearby, the touch operation is transmitted to the processor 1010 to determine the type of the touch event, and then the processor 1010 provides a corresponding visual output on the display panel 10061 according to the type of the touch event. Although in fig. 10, the touch panel 10071 and the display panel 10061 are two independent components for implementing the input and output functions of the terminal, in some embodiments, the touch panel 10071 and the display panel 10061 may be integrated for implementing the input and output functions of the terminal, which is not limited herein.
Interface unit 1008 is an interface for connecting an external device to terminal 1000. For example, the external device may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like. Interface unit 1008 can be used to receive input from external devices (e.g., data information, power, etc.) and transmit the received input to one or more elements within terminal 1000 or can be used to transmit data between terminal 1000 and external devices.
The memory 1009 may be used to store software programs as well as various data. The memory 1009 may mainly include a program storage area and a data storage area, wherein the program storage area may store an operating system, an application program required by at least one function (such as a sound playing function, an image playing function, and the like), and the like; the storage data area may store data (such as audio data, a phonebook, etc.) created according to the use of the cellular phone, etc. Further, the memory 1009 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other volatile solid state storage device.
The processor 1010 is a control center of the terminal, connects various parts of the entire terminal using various interfaces and lines, and performs various functions of the terminal and processes data by operating or executing software programs and/or modules stored in the memory 1009 and calling data stored in the memory 1009, thereby integrally monitoring the terminal. Processor 1010 may include one or more processing units; preferably, the processor 1010 may integrate an application processor, which mainly handles operating systems, user interfaces, application programs, etc., and a modem processor, which mainly handles wireless communications. It will be appreciated that the modem processor described above may not be integrated into processor 1010.
Terminal 1000 can also include a power supply 1011 (e.g., a battery) for powering the various components, and preferably, power supply 1011 can be logically coupled to processor 1010 through a power management system that provides management of charging, discharging, and power consumption.
In addition, terminal 1000 can include some functional blocks not shown, which are not described herein.
Preferably, an embodiment of the present invention further provides a terminal, including a processor 1010, a memory 1009, and a computer program stored in the memory 1009 and capable of running on the processor 1010, where the computer program is executed by the processor 1010 to implement each process of the foregoing display control method embodiment, and can achieve the same technical effect, and for avoiding repetition, details are not described here again.
An embodiment of the present invention further provides a computer-readable storage medium, where a computer program is stored on the computer-readable storage medium, and when the computer program is executed by a processor, the computer program implements each process of the display control method embodiment, and can achieve the same technical effect, and in order to avoid repetition, details are not repeated here. The computer-readable storage medium may be a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other like elements in the process, method, article, or apparatus that comprises the element.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solutions of the present invention may be embodied in the form of a software product, which is stored in a storage medium (such as ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal (such as a mobile phone, a computer, a server, an air conditioner, or a network device) to execute the method according to the embodiments of the present invention.
While the present invention has been described with reference to the embodiments shown in the drawings, the present invention is not limited to the embodiments, which are illustrative and not restrictive, and it will be apparent to those skilled in the art that various changes and modifications can be made therein without departing from the spirit and scope of the invention as defined in the appended claims.

Claims (7)

1. A display control method applied to a terminal including a first screen and a second screen, the method comprising:
receiving a first input of a user to a target object displayed in the first screen;
selecting the target object in response to the first input;
receiving a second input of the user on the second screen;
displaying the target object in a target area in the second screen in response to the second input;
the target area is a part of or all of a display area of the second screen;
the displaying the target object in a target area in the second screen in response to the second input includes:
acquiring input parameters of the second input;
displaying the target object in a target area of which the display parameters correspond to the input parameters in the second screen based on the input parameters;
wherein the display parameters include at least one of a display position and a region size; the target object comprises a display page of an application program;
the second input comprises a first touch sub-input of a first finger of a user and a second touch sub-input of a second finger;
the displaying the target object in a target area corresponding to the display parameter and the input parameter in the second screen based on the input parameter comprises:
acquiring a first touch position of the first touch sub-input and a second touch position of the second touch sub-input;
displaying the target object in a display area of the second screen including the first touch position and the second touch position;
wherein the input parameters comprise a position parameter of the first touch position and a position parameter of the second touch position;
the displaying the target object in a display area of the second screen that includes the first touch position and the second touch position includes:
displaying the target object in a target rectangular area in the second screen;
the target rectangular area takes the first touch position and the second touch position as diagonal vertexes;
the target area in the second screen, after displaying the target object, further includes:
updating a display state of a target object in the second screen in response to the first touch sub-input and the second touch sub-input.
2. The method of claim 1, wherein the first touch sub-input and the second touch sub-input are opposite or reverse sliding sub-inputs;
the updating the display state of the target object in the second screen in response to the first touch sub-input and the second touch sub-input comprises:
and updating the display state of the target object in the second screen according to the real-time distance between the touch positions of the first finger and the second finger on the second screen.
3. The method of claim 2, wherein the updating the display state of the target object in the second screen according to the real-time distance between the touch positions of the first finger and the second finger on the second screen comprises:
displaying the target object on a second screen in a floating window mode under the condition that the real-time distance between the touch positions of the first finger and the second finger on the second screen is smaller than a first threshold value;
under the condition that the real-time distance between the touch positions of the first finger and the second finger on a second screen is larger than a second threshold value, displaying the target object on the second screen in a full-screen manner;
and under the condition that the real-time distance between the touch positions of the first finger and the second finger on the second screen is greater than or equal to a first threshold value and less than or equal to a second threshold value, displaying the target object in a target sub-display area on the second screen.
4. The method according to any one of claims 1 to 3, wherein the selecting the target object comprises at least one of:
displaying the selected identification on the target object;
adjusting the target object to a preset display color;
adjusting the display position of the target object or the area of a display area;
and displaying a preset masking layer on the target object.
5. A terminal comprising a first screen and a second screen, comprising:
a first input module, configured to receive a first input of a user on a target object displayed in the first screen;
a selecting module, configured to select the target object in response to the first input received by the first input module;
a second input module, configured to receive a second input of the user on the second screen;
a first display module, configured to display the target object in a target area in the second screen in response to a second input received by the second input module;
the target area is a part of or the whole display area of the second screen;
wherein the first display module comprises:
an input parameter obtaining submodule, configured to obtain an input parameter of the second input received by the second input module;
the display sub-module is used for displaying the target object in a target area corresponding to the display parameter and the input parameter in the second screen based on the input parameter acquired by the input parameter acquisition sub-module, wherein the display parameter comprises at least one of a display position and an area size; the target object comprises a display page of an application program;
the second input comprises a first touch sub-input of a first finger of a user and a second touch sub-input of a second finger;
the display sub-module includes:
a touch position acquiring unit, configured to acquire a first touch position of the first touch sub-input and a second touch position of the second touch sub-input received by the second input module;
a display unit configured to display the target object in a display area of the second screen that includes the first touch position and the second touch position acquired by the touch position acquisition unit;
wherein the input parameters comprise a position parameter of the first touch position and a position parameter of the second touch position;
the display unit includes:
a display subunit, configured to display the target object within a target rectangular area in the second screen;
the target rectangular area takes the first touch position and the second touch position acquired by the touch position acquisition unit as diagonal vertexes;
the display unit further includes:
a display state updating subunit, configured to update a display state of the target object in the second screen in response to the first touch sub-input and the second touch sub-input received by the second input module.
6. A terminal, characterized in that it comprises a processor, a memory and a computer program stored on the memory and executable on the processor, the computer program, when executed by the processor, implementing the steps of the display control method according to any one of claims 1 to 4.
7. A computer-readable storage medium, characterized in that a computer program is stored thereon, which computer program, when being executed by a processor, carries out the steps of the display control method according to any one of claims 1 to 4.
CN201811278017.8A 2018-10-30 2018-10-30 Display control method and terminal Active CN109491631B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811278017.8A CN109491631B (en) 2018-10-30 2018-10-30 Display control method and terminal

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201811278017.8A CN109491631B (en) 2018-10-30 2018-10-30 Display control method and terminal

Publications (2)

Publication Number Publication Date
CN109491631A CN109491631A (en) 2019-03-19
CN109491631B true CN109491631B (en) 2022-09-13

Family

ID=65691855

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811278017.8A Active CN109491631B (en) 2018-10-30 2018-10-30 Display control method and terminal

Country Status (1)

Country Link
CN (1) CN109491631B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110213437B (en) * 2019-05-27 2020-10-13 维沃移动通信有限公司 Editing method and mobile terminal
CN110851226A (en) * 2019-11-13 2020-02-28 联想(北京)有限公司 Control method and electronic equipment

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2013135286A1 (en) * 2012-03-14 2013-09-19 Nokia Corporation Touch screen hover input handling
KR20150025635A (en) * 2013-08-29 2015-03-11 삼성전자주식회사 Electronic device and method for controlling screen
CN104423548A (en) * 2013-08-28 2015-03-18 联想(北京)有限公司 Control method and control device
CN108469898A (en) * 2018-03-15 2018-08-31 维沃移动通信有限公司 A kind of image processing method and flexible screen terminal

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101587211B1 (en) * 2009-05-25 2016-01-20 엘지전자 주식회사 Mobile Terminal And Method Of Controlling Same
US20130318437A1 (en) * 2012-05-22 2013-11-28 Samsung Electronics Co., Ltd. Method for providing ui and portable apparatus applying the same
CN103019609B (en) * 2012-12-28 2016-02-03 广东欧珀移动通信有限公司 The method that territory, screen partition shows, device and touch screen terminal
KR102311221B1 (en) * 2014-04-28 2021-10-13 삼성전자주식회사 operating method and electronic device for object
KR20160120103A (en) * 2015-04-07 2016-10-17 엘지전자 주식회사 Mobile terminal and control method thereof

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2013135286A1 (en) * 2012-03-14 2013-09-19 Nokia Corporation Touch screen hover input handling
CN104423548A (en) * 2013-08-28 2015-03-18 Lenovo (Beijing) Co., Ltd. Control method and control device
KR20150025635A (en) * 2013-08-29 2015-03-11 Samsung Electronics Co., Ltd. Electronic device and method for controlling screen
CN108469898A (en) * 2018-03-15 2018-08-31 Vivo Mobile Communication Co., Ltd. Image processing method and flexible-screen terminal

Also Published As

Publication number Publication date
CN109491631A (en) 2019-03-19

Similar Documents

Publication Publication Date Title
CN108536365B (en) Image sharing method and terminal
CN110460907B (en) Video playing control method and terminal
CN109002243B (en) Image parameter adjusting method and terminal equipment
CN108495029B (en) Photographing method and mobile terminal
CN110096326B (en) Screen capturing method, terminal equipment and computer readable storage medium
CN108509123B (en) Application program closing method and mobile terminal
EP3767431A1 (en) Image processing method and flexible-screen terminal
CN110196667B (en) Notification message processing method and terminal
CN109491738B (en) Terminal device control method and terminal device
CN109379484B (en) Information processing method and terminal
CN108712577B (en) Call mode switching method and terminal equipment
CN109032486B (en) Display control method and terminal equipment
CN110032309B (en) Screen splitting method and terminal equipment
CN109151367B (en) Video call method and terminal equipment
CN109739407B (en) Information processing method and terminal equipment
CN109525710B (en) Method and device for accessing application program
CN108920226B (en) Screen recording method and device
CN110196668B (en) Information processing method and terminal equipment
CN110750189B (en) Icon display method and device
CN111562896B (en) Screen projection method and electronic equipment
CN109407948B (en) Interface display method and mobile terminal
CN108898555B (en) Image processing method and terminal equipment
CN108804628B (en) Picture display method and terminal
CN111124571A (en) Interface display method and electronic equipment
CN110851098B (en) Video window display method and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant