CN111201507A - Multi-screen-based information display method and terminal - Google Patents

Multi-screen-based information display method and terminal

Info

Publication number
CN111201507A
Authority
CN
China
Prior art keywords
display screen
external display
input
display area
target icon
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201780095813.5A
Other languages
Chinese (zh)
Other versions
CN111201507B (en)
Inventor
张献中
黄成钟
郑雪瑞
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Transsion Communication Co Ltd
Original Assignee
Shenzhen Transsion Communication Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Transsion Communication Co Ltd filed Critical Shenzhen Transsion Communication Co Ltd
Publication of CN111201507A publication Critical patent/CN111201507A/en
Application granted granted Critical
Publication of CN111201507B publication Critical patent/CN111201507B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The application discloses a multi-screen-based information display method and terminal. The terminal comprises a first display screen and is connected with at least one external display screen, and the method comprises the following steps: detecting a first input (e.g., a long press) for a target icon, the target icon being displayed in the first display screen; displaying information representing the at least one external display screen in at least one display area, respectively; detecting a second input (such as a drag, slide, gesture, or release operation) for the target icon; and displaying the object represented by the target icon in the external display screen corresponding to the display area targeted by the second input. The multi-screen-based information display method and terminal greatly improve the user experience.

Description

Multi-screen-based information display method and terminal Technical Field
The application relates to the technical field of electronic display, in particular to a multi-screen-based information display method and a multi-screen-based information display terminal.
Background
Mobile terminals integrate communication, entertainment, reading, and working. More and more people like to browse news, play games, watch movies, or use social applications such as WeChat and QQ on a mobile terminal at the same time, so mobile terminals connected with at least one external display screen are strongly sought after and widely used.
Currently, on a mobile terminal with multiple screens, a finger cannot move across screens the way a mouse pointer can, so the operation of displaying information on at least one external display screen connected to the mobile terminal is not friendly enough.
Disclosure of Invention
The application provides a multi-screen-based information display method and a multi-screen-based information display terminal, which are convenient for a user to quickly select proper external display screens to display information in a multi-screen mobile terminal application scene.
In a first aspect, the present application provides a multi-screen-based information display method, including:
detecting a first input for a target icon, the target icon being displayed in the first display screen;
displaying information representing the at least one external display screen in at least one display area, respectively;
detecting a second input for the target icon;
and displaying the object represented by the target icon in the external display screen corresponding to the display area targeted by the second input.
In a second aspect, the present application provides a terminal, comprising:
a first detection unit configured to detect a first input (e.g., a long press) for a target icon displayed in the first display screen;
a first display unit for displaying information representing the at least one external display screen in at least one display area, respectively;
a second detection unit, configured to detect a second input (such as a drag, a slide, a move, a gesture, or a release operation) for the target icon;
and the second display unit is used for displaying the object represented by the target icon in the external display screen corresponding to the display area targeted by the second input.
In a third aspect, the present application provides another terminal, including a processor, an input device, an output device, and a memory, where the processor, the input device, the output device, and the memory are connected to each other, where the memory is used for storing application program codes that support the terminal to execute the above method, and the processor is configured to execute the above method of the first aspect.
In a fourth aspect, the present application provides a computer-readable storage medium having stored thereon instructions which, when executed by a processor, cause the processor to perform the method of the first aspect described above.
In the application, a terminal detects a first input for a target icon, the target icon being displayed in a first display screen; further, information representing the at least one external display screen is displayed in at least one display area, respectively; then, a second input for the target icon is detected; finally, the object represented by the target icon is displayed in the external display screen corresponding to the display area targeted by the second input. Understandably, the terminal provides display areas for displaying information representing at least one external display screen, which makes it convenient for the user to select, according to the information in the corresponding display area, a proper external display screen on which to display the object represented by the icon. Furthermore, according to the type of the object represented by the target icon, the information of the external display screens matched with that object is displayed distinctively, so that the user can quickly select, based on the information in the display areas, a proper external display screen on which to display the object. Optionally, the information of the external display screens matched with the object represented by the target icon is output in a display frame popped up in the display area, again helping the user quickly select a proper external display screen. The user experience is thereby greatly improved.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings needed in the description of the embodiments are briefly introduced below. It is obvious that the drawings in the following description are only some embodiments of the present application, and that those skilled in the art can obtain other drawings from these drawings without creative effort.
FIG. 1 is a schematic flow chart of a multi-screen-based information display method provided by the present application;
FIG. 2A is an interface display diagram of another multi-screen based information display method provided herein;
FIG. 2B is an interface display diagram of another multi-screen based information display method provided herein;
FIG. 2C is an interface display diagram of another multi-screen based information display method provided herein;
FIG. 2D is an interface display diagram of another multi-screen based information display method provided herein;
FIG. 2E is an interface display diagram of another multi-screen based information display method provided herein;
FIG. 2F is an interface display diagram of another multi-screen based information display method provided herein;
FIG. 2G is an interface display diagram of another multi-screen based information display method provided herein;
FIG. 2H is an interface display diagram of another multi-screen based information display method provided herein;
FIG. 3 is a functional block diagram of a terminal provided herein;
fig. 4 is a schematic block diagram of a terminal provided in the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are some, but not all, embodiments of the present application. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
It will be understood that the terms "comprises" and/or "comprising," when used in this specification and the appended claims, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
It is also to be understood that the terminology used in the description of the present application herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the application. As used in the specification of the present application and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise.
It should be further understood that the term "and/or" as used in this specification and the appended claims refers to and includes any and all possible combinations of one or more of the associated listed items.
As used in this application and the appended claims, the term "if" may be interpreted contextually as "when", "upon" or "in response to a determination" or "in response to a detection". Similarly, the phrase "if it is determined" or "if a [ described condition or event ] is detected" may be interpreted contextually to mean "upon determining" or "in response to determining" or "upon detecting [ described condition or event ]" or "in response to detecting [ described condition or event ]".
In particular implementations, the terminals described in embodiments of the present application include, but are not limited to, portable devices such as mobile phones, laptop computers, or tablet computers having touch-sensitive surfaces (e.g., touch screen displays and/or touch pads). It should also be understood that in some embodiments the device is not a portable communication device but a desktop computer having a touch-sensitive surface (e.g., a touch screen display and/or a touchpad).
In the discussion that follows, a terminal that includes a display and a touch-sensitive surface is described. However, it should be understood that the terminal may include one or more other physical user interface devices such as a physical keyboard, mouse, and/or joystick.
The terminal supports various applications, such as one or more of the following: a drawing application, a presentation application, a word processing application, a website creation application, a disc burning application, a spreadsheet application, a gaming application, a telephone application, a video conferencing application, an email application, an instant messaging application, an exercise support application, a photo management application, a digital camera application, a web browsing application, a digital music player application, and/or a digital video player application.
Various applications that may be executed on the terminal may use at least one common physical user interface device, such as a touch-sensitive surface. One or more functions of the touch-sensitive surface and corresponding information displayed on the terminal can be adjusted and/or changed between applications and/or within respective applications. In this way, a common physical architecture (e.g., touch-sensitive surface) of the terminal can support various applications with user interfaces that are intuitive and transparent to the user.
Referring to fig. 1, fig. 1 is a schematic flowchart of a multi-screen-based information display method provided in the present application, and as shown in fig. 1, the method includes:
s101, a terminal detects a first input aiming at a target icon, and the target icon is displayed in a first display screen.
In this application, the target icon may represent a file, a program, a web page, a command, or the like. The user may execute a command (e.g., open the application represented by an icon) by clicking, double-clicking, dragging, or long-pressing the icon. The terminal comprises a first display screen, and the target icon, i.e., the icon targeted by the first input, may be any icon in the first display screen; here, the first display screen may be at least one touch screen of the terminal.
Optionally, the first input may be a touch operation, such as a single click, a double click, or a long press, and the terminal may detect the first input through the touch screen.
Alternatively, the first input may be a somatosensory operation (including a gesture operation), such as an operation of a hand selecting an object in the air. The terminal may detect the first input through an infrared ranging sensor in the camera.
Optionally, the first input may also be a voice input; e.g., for the particular voice input "open WeChat in external display screen", the target icon targeted by the voice input is the WeChat icon. The terminal may detect the first input through a voice recognition module (i.e., the voice recognition module recognizes each piece of voice information).
Without being limited to the above-described implementations, the first input may also be other forms of user input, and is not limited herein.
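The patent does not give an implementation of first-input detection; the following Python sketch is purely illustrative of distinguishing the input forms listed above (touch, somatosensory, voice). All names (`InputEvent`, `classify_first_input`, the 500 ms long-press threshold) are hypothetical assumptions, not from the source.

```python
# Illustrative sketch only; names and thresholds are assumptions.
from dataclasses import dataclass

LONG_PRESS_MS = 500  # assumed long-press threshold

@dataclass
class InputEvent:
    kind: str           # "touch", "gesture", or "voice"
    duration_ms: int = 0
    transcript: str = ""

def classify_first_input(event: InputEvent) -> str:
    """Map a raw input event to one of the first-input forms described above."""
    if event.kind == "touch":
        # single click vs. long press, distinguished by press duration
        return "long_press" if event.duration_ms >= LONG_PRESS_MS else "tap"
    if event.kind == "gesture":
        # somatosensory operation, e.g. a hand selecting an object in the air
        return "somatosensory"
    if event.kind == "voice":
        # e.g. "open WeChat in external display screen" targets the WeChat icon
        return "voice_command" if "external display" in event.transcript else "voice_other"
    return "unknown"
```

A 600 ms touch would classify as a long press, while a 100 ms touch classifies as a tap; the exact threshold is a design choice the patent leaves open.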
And S102, respectively displaying information used for representing the at least one external display screen in at least one display area.
Specifically, the terminal includes a first display screen and at least one external display screen, and the display area may include at least one of a two-dimensional display area or a three-dimensional display area on the first display screen.
The following further describes how to display information representing the at least one external display screen in at least one display area, respectively, in conjunction with fig. 2A-2H.
Fig. 2A illustrates a scheme of how information representing the at least one external display screen is displayed in the display area of the at least one first display screen, respectively. The information of the external display screen may include parameters such as an appearance picture, a brand name, a size, a color, and a shape of the external display screen.
As shown in fig. 2A, the first display screen 201 displays an icon 202 and includes at least one display area 203, and the display area 203 displays the name "external display screen 3" of the external display screen 3. The example is merely illustrative of the present application and should not be construed as limiting.
In some optional embodiments, when the external display screen corresponding to a display area reaches its display threshold, the information of that external display screen is removed from the display area. The display threshold of an external display screen is related to its display capability, i.e., the number of objects the external display screen can display simultaneously.
For example, in fig. 2A, assume that the display capability of the external display screen 3 is 2 documents displayed simultaneously. That is, the display threshold of the external display screen 3 is 2 documents displayed simultaneously, so when the external display screen 3 displays 2 documents at the same time, it reaches its display threshold. This implementation is applicable not only to the embodiment corresponding to fig. 2A but also to the embodiments corresponding to fig. 2B-2H; fig. 2A is only used for explaining the present application and should not be construed as limiting.
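The display-threshold rule above can be sketched in Python as follows. This is a minimal illustration under assumed names (`ExternalScreen`, `selectable_screens`); the patent does not prescribe any data model.

```python
# Hypothetical sketch of the display-threshold rule; names are illustrative.
from dataclasses import dataclass, field
from typing import List

@dataclass
class ExternalScreen:
    name: str
    display_threshold: int                       # max objects shown simultaneously
    shown_objects: List[str] = field(default_factory=list)

    def at_capacity(self) -> bool:
        # the screen has reached its display threshold
        return len(self.shown_objects) >= self.display_threshold

def selectable_screens(screens: List[ExternalScreen]) -> List[ExternalScreen]:
    """Screens whose information should remain visible in the display areas;
    screens at capacity are removed, per the rule above."""
    return [s for s in screens if not s.at_capacity()]
```

With a threshold of 2 and two documents already shown, the external display screen 3 of the example would be filtered out of the selectable list.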
In some optional embodiments, external display screens that already display the object represented by the target icon and external display screens that do not may be represented in different display areas. Information representing a first external display screen is displayed at a first position, and information representing a second external display screen is displayed at a second position, where the first external display screen displays the object represented by the target icon, the second external display screen does not, and the first position is different from the second position. The object represented by the icon may be a file, a program, a web page, a command, or the like.
As shown in fig. 2B, the first display screen 201 displays an icon 202 and includes a first position 204 and a second position 205; 6 display areas such as the display area 203 are distributed in the first position 204, and 6 display areas such as the display area 9 are distributed in the second position 205. Information representing the first external display screens is displayed at the first position 204, and information representing the second external display screens is displayed at the second position 205. Here, the first external display screens comprise 6 screens: the external display screen 1, the external display screen 2, the external display screen 3, the external display screen 4, the external display screen 5, and the external display screen 6; the second external display screens comprise 6 screens: the external display screen 7, the external display screen 8, the external display screen 9, the external display screen 10, the external display screen 11, and the external display screen 12. The object represented by the target icon is displayed in the first external display screens and is not displayed in the second external display screens. The above features are applicable not only to the embodiment corresponding to fig. 2A but also to an annular region: as shown in fig. 2E, the first display screen 201 displays an icon 207 and includes an inner ring display region 210 and an outer ring display region 211.
In the annular region, the first position may be an inner ring display region 210, the second position may be an outer ring display region 211, information representing a first outer display screen is displayed in the inner ring display region 210, information representing a second outer display screen is displayed in the outer ring display region 211, and an object represented by a target icon is displayed in the first outer display screen, and an object represented by the target icon is not displayed in the second outer display screen. Fig. 2B and 2E are merely for explaining the present application and should not be construed as limiting. In practical applications, the number, size, shape, etc. of the display areas may be determined according to practical requirements, and are not limited herein.
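The split between the first position (screens already showing the object) and the second position (screens not showing it) amounts to a simple partition of the connected screens. A minimal Python sketch, with an assumed `(name, displayed_objects)` representation:

```python
# Illustrative sketch; the (name, displayed_objects) representation is assumed.
from typing import Iterable, List, Set, Tuple

def partition_by_object(
    screens: Iterable[Tuple[str, Set[str]]], target_object: str
) -> Tuple[List[str], List[str]]:
    """Split screens into those that already display the target object
    (shown at the first position / inner ring) and those that do not
    (shown at the second position / outer ring)."""
    first = [name for name, objs in screens if target_object in objs]
    second = [name for name, objs in screens if target_object not in objs]
    return first, second
```

The first list would populate the inner ring display region 210 and the second list the outer ring display region 211 in the fig. 2E layout.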
In some alternative embodiments, the information representing the first external display screen and the information representing the second external display screen may also be displayed differently. Here, the differential display may mean that the display manner of the information is different, such as font, color, and the like.
As shown in fig. 2C, the first display screen 201 displays an icon 202 and includes at least one display area 203; the name "external display screen 3" of the external display screen 3 in the display area 203 is highlighted (i.e., displayed distinctively) relative to the names of the external display screens in the other display areas in the first display screen. This implementation is applicable not only to the present embodiment but also to the embodiments corresponding to fig. 2D-2H, respectively. Fig. 2C is merely illustrative of the present application and should not be construed as limiting.
In some alternative embodiments, the at least one display area may also be distributed in one annular display area.
As shown in fig. 2D, the first display screen 201 displays icons 207 and includes a first ring 209, and the first ring 209 has 8 display areas such as a display area 203. The display area 203 displays the name "external display 3" of the external display 3. Fig. 2D is merely illustrative of the present application and should not be construed as limiting.
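Laying out the 8 display areas of the first ring 209 evenly around the icon is a small geometry exercise. The following Python sketch is one possible placement (the patent does not specify coordinates or angles):

```python
# Hypothetical layout sketch; center, radius, and angle origin are assumptions.
import math
from typing import List, Tuple

def ring_positions(
    center: Tuple[float, float], radius: float, n: int
) -> List[Tuple[float, float]]:
    """Place n display-area centers evenly on a ring around the icon at `center`."""
    cx, cy = center
    return [
        (cx + radius * math.cos(2 * math.pi * k / n),
         cy + radius * math.sin(2 * math.pi * k / n))
        for k in range(n)
    ]
```

For example, `ring_positions((0, 0), 10, 8)` yields 8 points 45 degrees apart, one per display area of the first ring.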
In some alternative embodiments, the plurality of display areas may also be distributed among a nested plurality of annular display areas.
As shown in fig. 2E, the first display 201 displays icons 207, and includes an inner ring display area 210 and an outer ring display area 211. Information representing a first external display screen in which an object represented by a target icon is displayed in the inner ring display area 210, and information representing a second external display screen in which the object represented by the target icon is not displayed is displayed in the outer ring display area 211.
In some optional embodiments, the terminal may record the duration of the first input in real time and, according to that duration, distinctively display in real time the information of the external display screen in the display area corresponding to the current duration, where the information of an external display screen may be its name, size, color, and brand type.
As shown in fig. 2F, the first display screen 201 displays an icon 207 and includes a first ring 209, and the first ring 209 has eight display areas, such as the display areas 203, 219, and 220. Assume the first input is a long-press input of the user's finger 218, so the duration of the first input is the time for which the user's finger 218 presses the target icon 207. For example, when that duration reaches 1 s, the name "external display screen 1" in the display area 219 is highlighted (i.e., displayed distinctively) relative to the names of the external display screens in the other display areas in the first ring 209. When the duration reaches 2 s, the name "external display screen 2" in the display area 220 is highlighted relative to the names in the other display areas. When the duration reaches 3 s, the name "external display screen 3" in the display area 203 is highlighted relative to the names in the other display areas. This implementation is applicable not only to this embodiment but also to the embodiments corresponding to fig. 2A-2C and 2G-2H, respectively; fig. 2F is only used for explaining the present application and should not be construed as limiting. In practical applications, the number, size, shape, etc. of the display areas may be determined according to practical requirements, and are not limited herein.
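The duration-driven highlight above (one display area per second of press time) can be sketched as a mapping from press duration to a display-area index. This is an assumed reading of the embodiment, with the one-second step and wrap-around behavior as illustrative choices:

```python
# Illustrative sketch; step size and wrap-around are assumptions.
from typing import Optional

def highlighted_area(
    press_duration_s: float, n_areas: int, step_s: float = 1.0
) -> Optional[int]:
    """Return the index of the display area to highlight: the highlight
    advances one area per `step_s` seconds of long press and wraps
    around the ring of `n_areas` display areas."""
    if press_duration_s < step_s:
        return None  # nothing highlighted yet
    return int(press_duration_s // step_s - 1) % n_areas
```

With 8 areas, a 1 s press highlights area 0 (display area 219 in fig. 2F), a 2 s press highlights area 1 (display area 220), a 3 s press highlights area 2 (display area 203), and a 9 s press wraps back to area 0.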
In some optional embodiments, the display area may also be a three-dimensional space display area, and the terminal may respectively display information representing the at least one external display screen in at least one of the three-dimensional space display areas. In particular, the at least one three-dimensional spatial display area may be generated by the terminal by 3D projection over the terminal.
As shown in fig. 2G, the terminal 200 includes a first display screen 201, the first display screen 201 displays an icon 207, a first projection area 221 is generated by the terminal 200 through 3D projection above the terminal 200, and the first projection area 221 includes a plurality of display areas such as a three-dimensional space display area 222. The first input may be a somatosensory operation for each three-dimensional space display region, and assuming that the first input is a long-press input of the user's finger 218, when the user's finger 218 presses the target icon 207, the terminal 200 projects the first projection region 221 above the position of the terminal 200. Specifically, the three-dimensional space display area 222 displays an appearance picture and size information of the external display screen 9. Fig. 2G is merely illustrative of the present application and should not be construed as limiting. In practical applications, the number, size, shape, etc. of the display areas may be determined according to practical requirements, and are not limited herein.
In some optional embodiments, the terminal may, according to the type of the object represented by the target icon, distinctively display the information of the external display screens matched with that object, where the information of an external display screen may be its size, color, and brand type.
As shown in FIG. 2H, the object represented by the target icon 207 selected by the user's finger 218 is video playing software. At this time, the information of the external display screens matched with the video playing software is highlighted (i.e., displayed distinctively). Through the information of the external display screen 3 matched with the video playing software in the highlighted three-dimensional space display area 222, the terminal 200 can recommend a suitable external display screen on which the user may open the video playing software: because the external display screen 3 is a wide screen, it can meet the user's need to watch a movie. This implementation is applicable not only to the embodiment corresponding to fig. 2H but also to the embodiments corresponding to fig. 2A-2F, respectively. Fig. 2H is merely illustrative of the present application and should not be construed as limiting. In practical applications, the number, size, shape, etc. of the display areas may be determined according to practical requirements, and are not limited herein.
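One way to sketch the type-based matching above in Python: for video playback, prefer the widest screen (as with the wide external display screen 3 in fig. 2H). The dictionary keys and the widest-screen heuristic are assumptions for illustration, not the patent's prescribed rule.

```python
# Hypothetical matching heuristic; keys and rule are illustrative only.
from typing import Dict, List, Optional

def recommend_screen(object_type: str, screens: List[Dict]) -> Optional[str]:
    """Pick the external display screen to highlight for the given object type.
    screens: list of dicts like {"name": ..., "width_in": ...}."""
    if not screens:
        return None
    if object_type == "video":
        # for video playing software, recommend the widest screen
        return max(screens, key=lambda s: s["width_in"])["name"]
    # no preference for other object types in this sketch
    return screens[0]["name"]
```

The returned name would then be highlighted in its display area, steering the user toward a suitable screen.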
S103, detecting a second input aiming at the target icon.
The second input is further described below in conjunction with the embodiments corresponding to fig. 2A-2H, respectively. Specifically, the second input may be at least one of: a touch operation (such as a click operation or a sliding operation), a somatosensory operation (including a gesture operation, where the gesture operation may be a selection gesture, a sliding gesture, or the like), a voice input operation, an image input operation, and the like. The second input may be used to select the external display screen, corresponding to a display area, through which the object represented by the target icon is to be displayed.
In the embodiments respectively corresponding to fig. 2A-2C, the second input may be a drag operation for the target icon. For example, the user drags the target icon 202 to the display area 203, and the dragging operation is the second input.
In the embodiments corresponding to fig. 2D-2E, respectively, the second input may be a sliding operation for the target icon. For example, sliding the target icon 207 in the direction of the display area 208.
In the embodiment corresponding to fig. 2F, the second input may be a release operation for the target icon. For example, the user performs a release operation after long-pressing the target icon 207.
In the embodiments respectively corresponding to fig. 2G-2H, the second input may be a somatosensory operation (including a gesture operation) for the target icon.
The second input is not limited to the above modes, but may be other types of input, and is not limited herein.
And S104, displaying the object represented by the target icon in the external display screen corresponding to the display area targeted by the second input.
In this application, the target icon may represent a file, program, web page or command, etc., which facilitates the user to quickly execute the command and open the program file. As shown in fig. 2A, an object (e.g., a movie) represented by the target icon 202 is displayed in the external display screen 3 corresponding to the display area 203 targeted by the second input (e.g., dragging), that is, the movie can be played in the external display screen 3 corresponding to the display area 203 targeted by the second input.
Optionally, when an object represented by a first icon is already displayed in the external display screen corresponding to the display area targeted by the second input, a display frame is popped up from the display area for the user's selection, and a decision prompt, asking whether to continue displaying the object represented by the second icon on the display screen corresponding to that display area, is output in the display frame. If no object represented by a first icon is displayed there, the object represented by the second icon is displayed directly through that external display screen.
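The conflict-handling rule above reduces to a small decision: confirm with the user if the target screen already shows something, otherwise display directly. A hedged Python sketch, with all names (`placement_action`, the action tuples) invented for illustration:

```python
# Illustrative sketch of the pop-up decision rule; names are assumptions.
from typing import Set, Tuple

def placement_action(screen_shows: Set[str], incoming_object: str) -> Tuple[str, str]:
    """Decide how to handle dropping `incoming_object` onto a screen that
    currently shows the objects in `screen_shows`."""
    if screen_shows:
        # another object is already displayed: pop up a display frame and ask
        return ("confirm_dialog", f"Continue displaying {incoming_object} here?")
    # screen is free: display the object directly
    return ("display", incoming_object)
```

In a real UI the `confirm_dialog` branch would render the pop-up display frame and wait for the user's choice; that plumbing is omitted here.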
In the method, a first input (such as a long press) for a target icon is detected first, the target icon being displayed in a first display screen; further, information representing at least one external display screen is displayed in the at least one display area, respectively. That is, the terminal provides display areas for displaying information representing at least one external display screen, which makes it convenient for the user to select, according to the information in the display areas, a proper external display screen on which to display the object represented by the icon. Furthermore, according to the type of the object represented by the target icon, the information of the external display screens matched with that object is displayed distinctively, so that the user can quickly select a proper external display screen. Optionally, that information is output in a display frame popped up in the display area, again helping the user quickly select a proper external display screen. This greatly enhances the user experience.
Referring to fig. 3, fig. 3 is a functional block diagram of a terminal provided in the present application. The functional blocks of the terminal can be implemented by hardware, software or a combination of hardware and software. Those skilled in the art will appreciate that the functional blocks described in fig. 3 may be combined or separated into sub-blocks to implement the application scheme. Thus, the above description in this application may support any possible combination or separation or further definition of the functional blocks described below.
As shown in fig. 3, the terminal 300 may include: a first detection unit 301, a first display unit 302, a second detection unit 303, and a second display unit 304. Wherein:
a first detection unit 301, configured to detect a first input (such as a long-press, a drag, a slide, or a somatosensory operation) for a target icon, where the target icon is displayed in the first display screen;
a first display unit 302 for displaying information representing the at least one external display screen in at least one display area, respectively;
a second detecting unit 303, configured to detect a second input (e.g., a drag, slide, move, gesture, or release operation) for the target icon;
a second display unit 304, configured to display an object represented by the target icon in an external display screen corresponding to the display area targeted by the second input.
The first detection unit 301 is configured to detect a first input for a target icon, the target icon being displayed in the first display screen. Specifically:
the terminal comprises a first display screen, and the target icon, i.e., the icon targeted by the first input, may be any icon on the first display screen.
Alternatively, when the first input is a touch operation, for example, a single click, a double click, or a long press, the first detection unit 301 may detect the first input through the touch screen.
Alternatively, the first input may be a somatosensory operation (including a gesture operation), such as an operation of a hand selecting an object in the air. The terminal may detect the first input through an infrared ranging sensor in the camera.
Optionally, the first input may also be a voice input, and the terminal may detect the first input through a voice recognition module.
The first display unit 302 is configured to display information representing the at least one external display screen in at least one display area, respectively. Specifically:
the terminal includes a first display screen and at least one external display screen, and the display area may include at least one of a two-dimensional display area or a three-dimensional display area on the first display screen.
In some optional embodiments, when an external display screen corresponding to a display area reaches its display threshold, the information of that external display screen is removed from the display area. The display threshold of an external display screen is related to its display capability, i.e., how many objects the external display screen can display simultaneously. The examples are merely illustrative of the present application and should not be construed as limiting.
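The threshold check described in this paragraph can be sketched as follows; the dictionary keys (`name`, `shown`, `threshold`) are illustrative assumptions, not terminology from the patent.

```python
def visible_screen_info(screens):
    """Return names of external screens still eligible for selection.

    screens: list of dicts with keys
      'name'      - screen label,
      'shown'     - objects currently displayed on the screen,
      'threshold' - how many objects it can display simultaneously.
    A screen that has reached its threshold is removed from the listing.
    """
    return [s["name"] for s in screens if len(s["shown"]) < s["threshold"]]
```

A full screen therefore simply stops appearing among the display areas offered to the user.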
In some optional embodiments, external display screens that already display the object represented by the target icon and those that do not may be represented in different display areas. Information representing a first external display screen is displayed at a first position, and information representing a second external display screen is displayed at a second position, where the first external display screen displays the object represented by the target icon, the second external display screen does not, and the first position is different from the second position. The object represented by the icon may be a file, a program, a web page, a command, etc.
In some alternative embodiments, the information representing the first external display screen and the information representing the second external display screen may also be displayed distinctively. Here, distinctive display means that the manner of displaying the information differs, such as in font or color.
In some alternative embodiments, the at least one display area may also be distributed in one annular display area.
In some alternative embodiments, the plurality of display areas may also be distributed among a nested plurality of annular display areas.
In some optional embodiments, the terminal may record the duration of the first input in real time and, according to that duration, distinctively display in real time the information of the external display screen in the display area corresponding to the duration. The information of an external display screen may include its name, size, color, and brand type.
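One hypothetical way to map the recorded duration to the currently distinguished display area is to cycle the highlight through the areas as the press is held; the fixed 500 ms step below is purely an illustrative assumption.

```python
def highlighted_area(duration_ms, areas, step_ms=500):
    """Return the display area distinctively shown at this hold duration.

    Every `step_ms` milliseconds of holding the first input advances the
    highlight to the next display area, cycling through all of them.
    """
    if not areas:
        return None
    return areas[int(duration_ms // step_ms) % len(areas)]
```

Under this scheme, releasing the press (the second input) selects whichever area is highlighted at the total recorded duration, matching the behavior described for claim 5.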
In some optional embodiments, the display area may also be a three-dimensional space display area, and the first display unit 302 may respectively display information representing the at least one external display screen in at least one three-dimensional space display area. In particular, the at least one three-dimensional space display area may be generated by the terminal through 3D projection above the terminal.
In some alternative embodiments, the first display unit 302 may display information of an external display screen matched with an object represented by the target icon according to the type of the object represented by the target icon, wherein the information of the external display screen may be the size, color and brand type of the external display screen.
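A minimal sketch of matching screens to the object's type follows; the `supported_types` field is an assumed illustration of how "matched" might be decided (e.g., a video object matched to large screens), not a mechanism specified by the patent.

```python
def matching_screens(object_type, screens):
    """Return names of external screens matched to the object's type.

    screens: list of dicts with keys
      'name'            - screen label,
      'supported_types' - set of object types the screen suits.
    """
    return [s["name"] for s in screens if object_type in s["supported_types"]]
```

The first display unit would then distinctively display only the matched screens' information, letting the user pick among suitable screens quickly.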
The second detection unit 303 is configured to detect a second input (such as a drag, slide, move, gesture, or release operation) for the target icon. Specifically:
the second input is further described below in conjunction with the embodiments corresponding to fig. 2A-2H, respectively. The second input may be at least one of: a touch operation (such as a click or slide operation), a somatosensory operation (including gesture operations such as a selection gesture or a sliding gesture), a voice input operation, an image input operation, and the like. The second input is used to select the external display screen, corresponding to the targeted display area, on which the object represented by the target icon is to be displayed.
In the embodiments respectively corresponding to fig. 2A-2C, the second input may be a drag operation for the target icon.
In the embodiments corresponding to fig. 2D-2E, respectively, the second input may be a sliding operation for the target icon.
In the embodiment corresponding to fig. 2F, the second input may be a release operation for the target icon.
In the embodiments respectively corresponding to fig. 2G-2H, the second input may be a somatosensory operation (including a gesture operation) for the target icon.
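The second-input variants enumerated above (drag, slide, release, gesture, voice) can all be normalized into a single target display area; the event structure below is a hypothetical sketch, not an interface defined by the patent.

```python
def resolve_target_area(event, areas):
    """Map a second-input event to the display area it targets.

    event: dict with a 'kind' key plus kind-specific fields (assumed names).
    areas: list of dicts each carrying a 'screen_name' key.
    """
    kind = event["kind"]
    if kind in ("drag", "slide", "release"):
        return event["area"]            # area under the touch point
    if kind == "gesture":
        return event["pointed_area"]    # area the in-air gesture points at
    if kind == "voice":
        # e.g. "display it on the Monitor" -> match by screen name
        for a in areas:
            if a["screen_name"] in event["utterance"]:
                return a
    return None                         # unrecognized input kind
```

Whatever the modality, the resolved area then determines the external display screen on which the target icon's object is displayed.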
The second display unit 304 is configured to display the object represented by the target icon in the external display screen corresponding to the display area targeted by the second input. Specifically:
when the object represented by a first icon is already displayed on the external display screen corresponding to the display area targeted by the second input, a dialog box is popped up in that display area for the user to choose, and a prompt asking whether to continue displaying the object represented by the second icon on that external display screen is output in the dialog box. If the object represented by the first icon is not displayed there, the object represented by the second icon is displayed directly on that external display screen.
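The already-displayed decision described above can be sketched as follows, with the pop-up dialog box stood in for by a `confirm` callback; all names are hypothetical illustrations.

```python
def display_on_screen(screen, new_obj, confirm):
    """Display `new_obj` on `screen`, asking the user first if needed.

    screen:  dict with a 'shown' list of objects already displayed.
    confirm: callback standing in for the pop-up dialog box; it receives
             the prompt text and returns True to proceed, False to cancel.
    Returns True if the object was displayed, False if the user declined.
    """
    if screen["shown"]:
        prompt = (f"Screen already shows {screen['shown'][-1]}; "
                  f"also display {new_obj}?")
        if not confirm(prompt):
            return False
    screen["shown"].append(new_obj)
    return True
```

When the screen is empty the object is displayed directly; otherwise the user's answer in the dialog box decides whether the second object is added.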
It can be understood that, with regard to the specific implementation manner of the functional blocks included in the terminal 300 of fig. 3, reference may be made to the foregoing embodiments, which are not described herein again.
Fig. 4 is a schematic block diagram of a terminal provided in the present application. In this embodiment of the application, the terminal may be any of various terminal devices such as a mobile phone, a tablet computer, a personal digital assistant (PDA), a mobile Internet device (MID), and a smart wearable device (e.g., a smart watch or a smart bracelet), which is not limited in this embodiment of the application. As shown in fig. 4, the terminal 400 may include: a baseband chip 410, memory 415 (one or more computer-readable storage media), a radio frequency (RF) module 416, and a peripheral system 417. These components may communicate over one or more communication buses 414.
The peripheral system 417 is mainly used to implement interaction between the terminal 400 and the user/external environment, and mainly includes the input and output devices of the terminal 400. In a specific implementation, the peripheral system 417 may include: a touch screen controller 418, a camera controller 419, an audio controller 420, and a sensor management module 421. Each controller may be coupled to a respective peripheral device (e.g., touch screen 423, camera 424, audio circuitry 425, and sensors 426). In some embodiments, the touch screen 423 may be a touch screen configured with a self-capacitive floating touch panel, or a touch screen configured with an infrared floating touch panel. In some embodiments, the camera 424 may be a 3D camera. It should be noted that the peripheral system 417 may also include other I/O peripherals.
The baseband chip 410 may integrally include: one or more processors 411, a clock module 412, and a power management module 413. The clock module 412 integrated in the baseband chip 410 is mainly used for generating clocks required for data transmission and timing control for the processor 411. The power management module 413 integrated in the baseband chip 410 is mainly used for providing stable and high-precision voltage for the processor 411, the rf module 416 and peripheral systems.
The radio frequency (RF) module 416, used for receiving and transmitting radio frequency signals, mainly integrates the receiver and transmitter of the terminal 400. The RF module 416 communicates with communication networks and other communication devices via radio frequency signals. In a specific implementation, the RF module 416 may include, but is not limited to: an antenna system, an RF transceiver, one or more amplifiers, a tuner, one or more oscillators, a digital signal processor, a CODEC chip, a SIM card, a storage medium, and the like. In some embodiments, the RF module 416 may be implemented on a separate chip.
Memory 415 is coupled to processor 411 for storing various software programs and/or sets of instructions. In particular implementations, memory 415 may include high-speed random access memory and may also include non-volatile memory, such as one or more magnetic disk storage devices, flash memory devices, or other non-volatile solid-state storage devices. The memory 415 may store an operating system (hereinafter referred to simply as a system), such as an embedded operating system like ANDROID, IOS, WINDOWS, or LINUX. The memory 415 may also store network communication programs that may be used to communicate with one or more additional devices, one or more terminal devices, and one or more network devices. The memory 415 may further store a user interface program, which may vividly display the content of the application program through a graphical operation interface, and receive a control operation of the application program from a user through input controls such as menus, dialog boxes, and buttons.
The memory 415 may also store one or more application programs. As shown in fig. 4, these applications may include: social applications (e.g., Facebook), image management applications (e.g., photo album), map-like applications (e.g., Google map), browsers (e.g., Safari, Google Chrome), and so forth.
It should be understood that terminal 400 is only one example provided for the embodiments of the present application and that terminal 400 may have more or fewer components than shown, may combine two or more components, or may have a different configuration implementation of components.
It can be understood that, with regard to the specific implementation manner of the functional blocks included in the terminal 400 of fig. 4, reference may be made to the foregoing embodiments, which are not described herein again.
In another embodiment of the present application, a computer-readable storage medium is provided, storing a computer program that, when executed by a processor, implements the method described in the foregoing embodiments.
the computer readable storage medium may be an internal storage unit of the terminal according to any of the foregoing embodiments, for example, a hard disk or a memory of the terminal. The computer readable storage medium may also be an external storage device of the terminal, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) Card, a Flash memory Card (Flash Card), and the like provided on the terminal. Further, the computer-readable storage medium may also include both an internal storage unit and an external storage device of the terminal. The computer-readable storage medium is used for storing the computer program and other programs and data required by the terminal. The computer readable storage medium may also be used to temporarily store data that has been output or is to be output.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, computer software, or a combination of both; for clarity of illustration, the foregoing description is given generally in terms of function.
It can be clearly understood by those skilled in the art that, for convenience and brevity of description, the specific working processes of the terminal and the unit described above may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the several embodiments provided in the present application, it should be understood that the disclosed terminal and method may be implemented in other manners. The components and steps of the various examples have been described above in terms of their functions; whether such functionality is implemented as hardware or software depends on the particular application and the design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
The above-described terminal embodiments are merely illustrative. For example, the division into units is only one kind of logical function division, and other division manners are possible in actual implementation; for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted or not executed. Further, the mutual coupling, direct coupling, or communication connection shown or discussed may be an indirect coupling or communication connection through some interfaces, terminals, or units, and may be electrical, mechanical, or in other forms.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiments of the present application.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
If the integrated unit is implemented in the form of a software functional unit and sold or used as a stand-alone product, it may be stored in a computer-readable storage medium. Based on such understanding, the technical solution of the present application, in essence, or the part contributing to the prior art, or all or part of the technical solution, may be embodied in the form of a software product. The software product is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the methods according to the embodiments of the present application. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk.
While the invention has been described with reference to specific embodiments, the scope of the invention is not limited thereto, and those skilled in the art can easily conceive various equivalent modifications or substitutions within the technical scope of the invention. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (10)

  1. A multi-screen-based information display method is characterized in that a terminal comprises a first display screen and is connected with at least one external display screen, and the method comprises the following steps:
    detecting a first input (e.g., a long press) for a target icon, the target icon being displayed in the first display screen;
    displaying information representing the at least one external display screen in at least one display area, respectively;
    detecting a second input (such as a drag, slide, move, gesture, or release operation) to the target icon;
    and displaying the object represented by the target icon in the external display screen corresponding to the display area targeted by the second input.
  2. The method of claim 1, further comprising:
    and when the external display screen corresponding to the display area reaches the display threshold of the external display screen, removing the information of the external display screen from the display area.
  3. The method of claim 1, further comprising:
    displaying information representing a first external display screen at a first position and displaying information representing a second external display screen at a second position; the first external display screen displays an object represented by an icon, and the second external display screen does not display the object represented by the icon; and, the first position is different from the second position.
  4. The method of claim 1 or 3, further comprising:
    displaying information representing the first external display screen and information representing the second external display screen distinctively; wherein the object represented by the target icon is displayed in the first external display screen, and the object represented by the target icon is not displayed in the second external display screen.
  5. The method of claim 1, further comprising:
    recording the duration of the first input in real time, and distinctively displaying, in real time according to the duration, information of the external display screen in the display area corresponding to the duration; when a second input for the target icon is detected, determining, according to the total duration of the first input, information of the external display screen in the display area corresponding to the total duration; wherein the information of the external display screen in the display area corresponding to the total duration is the information of the external display screen corresponding to the display area targeted by the second input.
  6. The method of claim 1, further comprising:
    at least one display area is distributed in one annular display area.
  7. The method of claim 6, further comprising
    The second input comprises a sliding operation, and the display area targeted by the second input is the display area pointed by the sliding operation.
  8. The method of claim 1,
    the display area includes at least one of a three-dimensional spatial display area or a two-dimensional display area on the first display screen.
  9. The method of claim 1, further comprising
    and according to the type of the object represented by the target icon, distinctively displaying information of an external display screen matching the object represented by the target icon.
  10. A terminal, comprising:
    a memory, and a processor coupled to the memory, wherein the processor is configured to perform the method of any of claims 1-9.
CN201780095813.5A 2017-10-10 2017-10-10 Information display method and terminal Active CN111201507B (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2017/105490 WO2019071419A1 (en) 2017-10-10 2017-10-10 Multiscreen-based information display method, and terminal

Publications (2)

Publication Number Publication Date
CN111201507A true CN111201507A (en) 2020-05-26
CN111201507B CN111201507B (en) 2023-11-03


Country Status (2)

Country Link
CN (1) CN111201507B (en)
WO (1) WO2019071419A1 (en)




Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant