CN111201507B - Information display method and terminal - Google Patents

Information display method and terminal

Info

Publication number
CN111201507B
CN111201507B (application CN201780095813.5A)
Authority
CN
China
Prior art keywords
display screen
external display
display area
input
terminal
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201780095813.5A
Other languages
Chinese (zh)
Other versions
CN111201507A (en)
Inventor
张献中
黄成钟
郑雪瑞
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Transsion Communication Co Ltd
Original Assignee
Shenzhen Transsion Communication Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Transsion Communication Co Ltd filed Critical Shenzhen Transsion Communication Co Ltd
Publication of CN111201507A publication Critical patent/CN111201507A/en
Application granted granted Critical
Publication of CN111201507B publication Critical patent/CN111201507B/en


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures

Abstract

The application discloses an information display method and a terminal. The terminal comprises a first display screen and is connected with at least one external display screen. The method comprises the following steps: detecting a first input (such as a long press) on a target icon displayed in the first display screen; displaying information representing the at least one external display screen in at least one display area, respectively; detecting a second input (e.g., a drag, swipe, gesture, or release operation) on the target icon; and displaying the object represented by the target icon on the external display screen corresponding to the display area targeted by the second input. The information display method and terminal provided by the application thereby greatly improve the user experience.

Description

Information display method and terminal
Technical Field
The present application relates to the field of electronic display technologies, and in particular, to a method and a terminal for displaying information.
Background
Mobile terminals integrate communication, entertainment, reading, and work. More and more people use them to browse news, play games, watch movies, or run social applications such as WeChat and QQ, so mobile terminals connected to at least one external display screen have become highly sought after and widely used.
Currently, on a mobile terminal with multiple screens, a finger cannot move across screens the way a mouse pointer can, which makes displaying information on an external display screen connected to the mobile terminal unwieldy.
Disclosure of Invention
The application provides an information display method and a terminal, which make it convenient for a user to quickly select a suitable external display screen on which to display information in a multi-screen mobile terminal scenario.
In a first aspect, the present application provides an information display method, including:
detecting a first input on a target icon, the target icon being displayed in a first display screen;
displaying information of at least one external display screen in at least one display area;
and when a second input on the target icon is detected, displaying the object represented by the target icon on the external display screen corresponding to the display area targeted by the second input.
In a second aspect, the present application provides a terminal comprising:
a first detection unit, configured to detect a first input on a target icon, the target icon being displayed in the first display screen;
a first display unit, configured to display information of the at least one external display screen in at least one display area;
a second detection unit, configured to detect a second input on the target icon;
and a second display unit, configured to, when the second input on the target icon is detected, display the object represented by the target icon on the external display screen.
In a third aspect, the present application provides another terminal comprising a processor, an input device, an output device, and a memory, which are interconnected, wherein the memory is configured to store application program code supporting the terminal in performing the above method, and the processor is configured to perform the method of the first aspect.
In a fourth aspect, the present application provides a computer readable storage medium storing instructions which, when executed by a processor, cause the processor to perform the method of the first aspect described above.
In the present application, a terminal detects a first input on a target icon, the target icon being displayed in a first display screen; further, information representing at least one external display screen is displayed in at least one display area, respectively; then, a second input on the target icon is detected; and finally, the object represented by the target icon is displayed on the external display screen corresponding to the display area targeted by the second input. It will be appreciated that the terminal provides display areas showing information that represents the external display screens, which helps the user select, based on that information, an appropriate external display screen on which to display the object represented by the icon. Further, according to the type of the object represented by the target icon, information of the external display screens matching that object can be displayed distinctively, so that the user can quickly select a suitable external display screen. Optionally, information of an external display screen matching the object represented by the target icon can be output in a display frame popped up in the display area, again helping the user select quickly. The user experience is thereby greatly improved.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings required for describing the embodiments are briefly introduced below. It is apparent that the drawings described below show only some embodiments of the present application, and a person skilled in the art may derive other drawings from them without inventive effort.
FIG. 1 is a schematic flow chart of an information display method provided by the application;
FIG. 2A is an interface display diagram of another information display method provided by the present application;
FIG. 2B is an interface display diagram of yet another information display method provided by the present application;
FIG. 2C is an interface display diagram of yet another information display method provided by the present application;
FIG. 2D is an interface display diagram of yet another information display method provided by the present application;
FIG. 2E is an interface display diagram of yet another information display method provided by the present application;
FIG. 2F is an interface display diagram of yet another information display method provided by the present application;
FIG. 2G is an interface display diagram of yet another information display method provided by the present application;
FIG. 2H is an interface display diagram of yet another information display method provided by the present application;
FIG. 3 is a functional block diagram of a terminal provided by the present application;
fig. 4 is a schematic block diagram of a terminal provided by the present application.
Detailed Description
The following description of the embodiments of the present application will be made clearly and fully with reference to the accompanying drawings, in which it is evident that the embodiments described are some, but not all embodiments of the application. All other embodiments, which can be made by those skilled in the art based on the embodiments of the application without making any inventive effort, are intended to be within the scope of the application.
It should be understood that the terms "comprises" and "comprising," when used in this specification and the appended claims, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
It is also to be understood that the terminology used in the description of the application herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the application. As used in this specification and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise.
It should be further understood that the term "and/or" as used in the present specification and the appended claims refers to any and all possible combinations of one or more of the associated listed items, and includes such combinations.
As used in this specification and the appended claims, the term "if" may be construed as "when" or "once" or "in response to a determination" or "in response to detection", depending on the context. Similarly, the phrase "if it is determined" or "if a [described condition or event] is detected" may be interpreted, depending on the context, as meaning "upon determining" or "in response to determining" or "upon detecting the [described condition or event]" or "in response to detecting the [described condition or event]".
In particular implementations, the terminals described in the embodiments of the application include, but are not limited to, portable devices such as mobile phones, laptop computers, or tablet computers having a touch-sensitive surface (e.g., a touch screen display and/or a touch pad). It should also be appreciated that in some embodiments the device is not a portable communication device but a desktop computer having a touch-sensitive surface (e.g., a touch screen display and/or a touch pad).
In the following discussion, a terminal including a display and a touch sensitive surface is described. However, it should be understood that the terminal may include one or more other physical user interface devices such as a physical keyboard, mouse, and/or joystick.
The terminal supports various applications, such as one or more of the following: drawing applications, presentation applications, word processing applications, website creation applications, disk burning applications, spreadsheet applications, gaming applications, telephony applications, video conferencing applications, email applications, instant messaging applications, workout support applications, photo management applications, digital camera applications, digital video camera applications, web browsing applications, digital music player applications, and/or digital video player applications.
Various applications that may be executed on the terminal may use at least one common physical user interface device such as a touch sensitive surface. One or more functions of the touch-sensitive surface and corresponding information displayed on the terminal may be adjusted and/or changed between applications and/or within the corresponding applications. In this way, the common physical architecture (e.g., touch-sensitive surface) of the terminal may support various applications with user interfaces that are intuitive and transparent to the user.
Referring to fig. 1, fig. 1 is a schematic flow chart of an information display method provided by the present application, as shown in fig. 1, the method includes:
and S101, when the terminal detects a first input aiming at a target icon, the target icon is displayed in the first display screen.
In the present application, the target icon may represent a file, program, web page, command, or the like. The user may execute a command (e.g., open an application represented by an icon) by clicking, double clicking, dragging, or long pressing the icon. The terminal comprises a first display screen, and the target icon, namely the icon aimed by the first input, can be any icon in the first display screen, wherein the first display screen can be at least one touch screen of the terminal.
Alternatively, the first input may be a touch operation, such as a single click, a double click, or a long press, and the terminal may detect the first input through the touch screen.
Alternatively, the first input may be a somatosensory operation (including a gesture operation), such as a hand gesture selecting a target in the air. The terminal may detect the first input through an infrared ranging sensor in the camera.
Alternatively, the first input may also be a voice input; for example, for the specific voice input "open WeChat in external display screen", the target icon is the WeChat icon. The terminal may detect the first input through a voice recognition module (i.e., a module that recognizes each voice message).
Without limitation to the several implementations described above, the first input may also be other forms of user input, without limitation.
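None of the input forms above are tied to a concrete API in the disclosure. Purely as a non-normative illustration (the event model, class name, and string values below are invented for this sketch), the first-input check of S101 could look like:

```python
from dataclasses import dataclass

# Hypothetical event model; the patent does not prescribe any concrete type.
@dataclass
class InputEvent:
    kind: str    # "touch", "gesture", or "voice"
    detail: str  # e.g. "long_press", or a recognized voice phrase

def is_first_input(event: InputEvent) -> bool:
    """Return True if the event qualifies as a 'first input' per S101."""
    if event.kind == "touch":
        # single click, double click, or long press on the touch screen
        return event.detail in ("click", "double_click", "long_press")
    if event.kind == "gesture":
        # somatosensory selection detected e.g. via an infrared sensor
        return event.detail == "select_in_air"
    if event.kind == "voice":
        # e.g. "open WeChat in external display screen"
        return "external display" in event.detail
    return False
```

The three branches mirror the touch, somatosensory, and voice alternatives described above; other input forms would simply add branches.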
S102: information of the at least one external display screen is displayed in at least one display area; specifically, information representing the at least one external display screen is displayed in the at least one display area, respectively.
Specifically, the terminal includes a first display screen and at least one external display screen, and the display area may include at least one of a two-dimensional display area or a three-dimensional display area on the first display screen.
A scheme of how information representing the at least one external display screen is displayed in the at least one display area, respectively, is further described below in connection with fig. 2A-2H.
Fig. 2A illustrates how information representing at least one external display screen is displayed in at least one display area of the first display screen. The information of an external display screen may include parameters such as its appearance picture, brand name, size, color, and shape.
As shown in fig. 2A, the first display 201 displays an icon 202 and includes at least one display area 203, and the display area 203 displays the name "external display 3" of the external display 3. The examples are merely illustrative of the application and should not be construed as limiting.
In some alternative embodiments, when the external display screen corresponding to a display area reaches its display threshold, the information of that external display screen is removed from the display area. The display threshold of an external display screen is related to its display capability, i.e., how many objects, and of what kind, the external display screen can display simultaneously.
For example, in fig. 2A, it is assumed that the display capability of the external display screen 3 is such that 2 documents can be displayed simultaneously. That is, the display threshold of the external display 3 is 2 documents displayed simultaneously, and when the external display 3 displays 2 documents simultaneously, the external display 3 reaches the display threshold. This implementation is not only applicable to the embodiment of fig. 2A, but also to the embodiment of fig. 2B-2H, and fig. 2A is merely for explaining the present application and should not be construed as limiting.
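The display-threshold rule above can be illustrated with a small sketch. The data structure mapping each screen name to its currently displayed object count and its simultaneous-display limit is an assumption made only for this illustration:

```python
def visible_screen_info(screens: dict) -> list:
    """Keep only screens whose info should still appear in a display area.

    `screens` maps a screen name to a (currently_shown, max_simultaneous)
    pair; a screen at or over its limit has reached its display threshold
    and is removed from the display areas, per the embodiment above.
    """
    return [name for name, (shown, limit) in screens.items() if shown < limit]
```

With the fig. 2A example, a screen already showing 2 of its maximum 2 documents would be filtered out, while an idle screen remains selectable.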
In some alternative embodiments, external display screens that already display the object represented by the target icon and those that do not can be shown in different display areas. Information representing a first external display screen is displayed at a first location and information representing a second external display screen at a second location, where the object represented by the target icon is displayed on the first external display screen but not on the second external display screen, and the first location differs from the second location. The object represented by the icon may be a file, program, web page, command, or the like.
As shown in fig. 2B, the first display 201 displays an icon 202 and includes a first position 204 and a second position 205. Six display areas, such as the display area 203, are distributed at the first position 204, and six display areas, such as the display area 9, are distributed at the second position 205. Information representing the first external display screens (external display screen 1 through external display screen 6) is displayed at the first position 204, and information representing the second external display screens (external display screen 7 through external display screen 12) is displayed at the second position 205; the object represented by the target icon is displayed on the first external display screens but not on the second external display screens. The above features apply not only to the embodiment corresponding to fig. 2A but also to an annular area. As shown in fig. 2E, the first display 201 displays an icon 207 and includes an inner ring display area 210 and an outer ring display area 211. In the annular region, the first position may be the inner ring display area 210 and the second position the outer ring display area 211; information representing a first external display screen is displayed in the inner ring display area 210, information representing a second external display screen in the outer ring display area 211, and the object represented by the target icon is displayed on the first external display screen but not on the second external display screen. Fig. 2B and 2E are merely for explaining the present application and should not be construed as limiting. In practical applications, the number, size, shape, etc. of the display areas may be determined according to practical requirements, which are not limited herein.
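The split between first-position and second-position screens in fig. 2B and 2E amounts to partitioning the connected screens by whether they already display the target object. A hedged sketch (the screen records and field names are invented for illustration):

```python
def partition_screens(screens: list, target_object: str):
    """Partition screens into those already showing `target_object`
    (first position, e.g. the inner ring in fig. 2E) and those not
    showing it (second position, e.g. the outer ring)."""
    first = [s["name"] for s in screens if target_object in s["objects"]]
    second = [s["name"] for s in screens if target_object not in s["objects"]]
    return first, second
```

The two lists would then be rendered at the two locations (or the inner and outer rings), respectively.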
In some alternative embodiments, information representing the first external display screen and information representing the second external display screen may also be displayed distinctively. Here, distinctive display means that the information is shown in different display styles, such as different fonts or colors.
As shown in fig. 2C, the first display 201 displays the icon 202 and includes at least one display area 203; the name "external display 3" in the display area 203 is highlighted (i.e., displayed distinctively) relative to the names of the external display screens in the other display areas of the first display screen. This implementation applies not only to the present embodiment but also to the embodiments corresponding to fig. 2D-2H. Fig. 2C is merely illustrative of the application and should not be construed as limiting.
In some alternative embodiments, at least one display area may also be distributed in one annular display area.
As shown in fig. 2D, the first display screen 201 displays icons 207 and includes a first ring 209, and the first ring 209 is distributed with 8 display areas such as the display area 203. The display area 203 displays the name "external display 3" of the external display 3. Fig. 2D is merely illustrative of the application and should not be construed as limiting.
In some alternative embodiments, the plurality of display areas may also be distributed among a plurality of nested annular display areas.
As shown in fig. 2E, the first display 201 displays an icon 207 and includes an inner ring display area 210 and an outer ring display area 211. The inner ring display area 210 displays information representing a first external display screen in which an object represented by a target icon is displayed, and the outer ring display area 211 displays information representing a second external display screen in which an object represented by the target icon is not displayed.
In some alternative embodiments, the terminal may record the duration of the first input in real time and, according to that duration, distinctively display in real time the information of the external display screen in the display area corresponding to the elapsed time, where the information of the external display screen may include its name, size, color, and brand.
As shown in fig. 2F, the first display 201 displays an icon 207 and includes a first ring 209, on which eight display areas such as the areas 203, 219, and 220 are distributed. Assume the first input is a long press by the user's finger 218, so the duration of the first input is the time for which the finger 218 presses the target icon 207. For example, when the press has lasted 1s, the name "external display screen 1" in the display area 219 is highlighted (i.e., displayed distinctively) relative to the names of the external display screens in the other display areas of the first ring 209. When the press has lasted 2s, the name "external display screen 2" in the display area 220 is highlighted relative to the others; and when it has lasted 3s, the name "external display 3" in the display area 203 is highlighted relative to the others. This implementation applies not only to the present embodiment but also to the embodiments corresponding to fig. 2A-2C and 2G-2H. Fig. 2F is merely for explaining the present application and should not be construed as limiting. In practical applications, the number, size, shape, etc. of the display areas may be determined according to practical requirements, which are not limited herein.
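The duration-driven highlight in fig. 2F effectively cycles the highlighted display area as the press continues. Assuming, purely for illustration, a one-second step per area (the disclosure only says the highlight changes with the press duration):

```python
def highlighted_area(duration_s: float, area_count: int) -> int:
    """Index of the display area to highlight after `duration_s` seconds
    of a long press, advancing one area per second and wrapping around.
    Returns -1 while nothing is highlighted yet."""
    if duration_s < 1 or area_count <= 0:
        return -1
    return (int(duration_s) - 1) % area_count
```

With eight areas on the ring, a 1s press highlights area 0, a 2s press area 1, and the cycle wraps after 8s.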
In some alternative embodiments, the display area may also be a three-dimensional display area, and the terminal may display information representing the at least one external display screen in at least one three-dimensional spatial display area, respectively. In particular, the at least one three-dimensional spatial display area may be generated above the terminal through 3D projection.
As shown in fig. 2G, the terminal 200 includes a first display screen 201 displaying an icon 207, and a first projection area 221 is generated above the terminal 200 through 3D projection; the first projection area 221 includes a plurality of display areas such as the three-dimensional space display area 222. The first input may be a somatosensory operation on a three-dimensional space display area. Assuming the first input is a long press by the user's finger 218, when the finger 218 presses the target icon 207, the terminal 200 projects the first projection area 221 above its position. Specifically, the three-dimensional space display area 222 displays an appearance picture and size information of the external display screen 9. Fig. 2G is merely illustrative of the application and should not be construed as limiting. In practical applications, the number, size, shape, etc. of the display areas may be determined according to practical requirements, which are not limited herein.
In some alternative embodiments, the terminal may distinctively display, according to the type of the object represented by the target icon, information of the external display screens that match that object, where the information may include the size, color, and brand of the external display screen.
As shown in fig. 2H, the object represented by the target icon 207 selected by the user's finger 218 is video playing software. The terminal 200 then highlights (i.e., distinctively displays) the information of the external display screens that match video playing software: by highlighting the information of the external display screen 3 in the three-dimensional space display area 222, the terminal can recommend a suitable external display screen on which to open the video playing software, because the external display screen 3 is a wide screen and thus better suits watching a movie. This implementation applies not only to the embodiment corresponding to fig. 2H but also to the embodiments corresponding to fig. 2A-2F. Fig. 2H is merely illustrative of the application and should not be construed as limiting. In practical applications, the number, size, shape, etc. of the display areas may be determined according to practical requirements, which are not limited herein.
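The type-based matching in fig. 2H can be sketched as a lookup from object type to a preferred screen trait. The mapping below (video to a wide screen) and the record fields are illustrative assumptions, not part of the claims:

```python
def matching_screens(object_type: str, screens: list) -> list:
    """Names of external screens whose traits match the object type,
    e.g. recommend wide screens for video playback. The preference
    table is an invented example of such a matching rule."""
    preferred = {"video player": "wide", "document editor": "portrait"}
    want = preferred.get(object_type)
    return [
        s["name"] for s in screens
        if want is not None and want in s.get("traits", ())
    ]
```

The returned names would then be highlighted in their display areas to guide the user's choice.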
S103: a second input on the target icon is detected.
The second input is further described in connection with the embodiments of fig. 2A-2H. In particular, the second input may be at least one of: a touch operation (e.g., a tap or swipe), a somatosensory operation (including gesture operations such as a selection or swipe gesture), a voice input operation, an image input operation, and the like. The second input may be used to select the display area through which the object represented by the target icon is displayed on the corresponding external display screen.
In the embodiments of fig. 2A-2C, respectively, the second input may be a drag operation for the target icon. For example, the user drags the target icon 202 to the display area 203, and the drag operation is the second input.
In the embodiments of fig. 2D-2E, respectively, the second input may be a sliding operation for the target icon. For example, the target icon 207 is slid in the direction of the display area 208.
In the embodiment corresponding to fig. 2F, the second input may be a release operation on the target icon, for example, the user releasing the target icon 207 after a long press.
In the embodiments respectively corresponding to fig. 2G-2H, the second input may be a somatosensory operation (including a gesture operation) for the target icon.
The second input may also be other forms of input, not limited to the above-described several ways.
S104: the object represented by the target icon is displayed on the external display screen, specifically the external display screen corresponding to the display area targeted by the second input.
In the present application, the target icon may represent a file, program, web page, command, etc.; icons let the user quickly execute commands and open programs or files. As shown in fig. 2A, the object (e.g., a movie) represented by the target icon 202 is displayed on the external display screen 3 corresponding to the display area 203 targeted by the second input (e.g., a drag); that is, the movie may be played on the external display screen 3.
Optionally, when the external display screen corresponding to the display area targeted by the second input is already displaying an object represented by a first icon, a display frame is popped up in the display area for the user's selection, outputting a decision statement asking whether to continue displaying the object represented by the second icon on that external display screen. If yes, the object represented by the first icon is displayed through an external display screen not displaying any icon-represented object; if not, the object represented by the second icon is displayed through an external display screen not displaying any icon-represented object.
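S104 together with the optional conflict dialog amounts to: resolve the targeted display area to its screen, display the object if the screen is free, and otherwise ask the user. A non-normative sketch with invented names and a simple return-value marker standing in for the popped-up display frame:

```python
def handle_drop(area_to_screen: dict, area: str,
                screen_objects: dict, obj: str):
    """Resolve the display area targeted by the second input to its
    external screen. If that screen already shows an object, signal
    that a confirmation dialog is needed; otherwise display `obj`."""
    screen = area_to_screen[area]
    if screen_objects.get(screen):
        return ("confirm", screen)  # pop up the decision display frame
    screen_objects.setdefault(screen, []).append(obj)
    return ("displayed", screen)
```

A real implementation would route the "confirm" case to the user dialog described above before relocating either object.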
In summary, the terminal first detects a first input (such as a long press) on a target icon displayed in the first display screen; it then displays information representing at least one external display screen in at least one display area, respectively. That is, the terminal provides display areas showing information that represents the external display screens, which helps the user select, based on that information, an appropriate external display screen on which to display the object represented by the icon. Further, according to the type of the object represented by the target icon, information of the external display screens matching that object can be displayed distinctively, so that the user can quickly select a suitable external display screen. Optionally, information of an external display screen matching the object represented by the target icon can be output in a display frame popped up in the display area, again helping the user select quickly. This greatly improves the user experience.
Referring to fig. 3, fig. 3 is a functional block diagram of a terminal provided by the present application. The functional blocks of the terminal may be implemented by hardware, software, or a combination of hardware and software. Those skilled in the art will appreciate that the functional blocks described in fig. 3 may be combined, or separated into sub-blocks, to implement the scheme of the present application. Accordingly, the description above supports any possible combination, separation, or further definition of the functional modules described below.
As shown in fig. 3, the terminal 300 may include: a first detection unit 301, a first display unit 302, a second detection unit 303, and a second display unit 304. Wherein:
a first detection unit 301, configured to detect a first input (such as a long press, drag, slide, or somatosensory operation) for a target icon, where the target icon is displayed in the first display screen;
a first display unit 302, configured to display information representing the at least one external display screen in at least one display area, respectively;
a second detection unit 303, configured to detect a second input (such as a drag, slide, somatosensory, or release operation) for the target icon;
a second display unit 304, configured to display, in the external display screen corresponding to the display area targeted by the second input, the object represented by the target icon.
The first detection unit 301 is configured to detect a first input for a target icon, where the target icon is displayed on the first display screen. Specifically:
the terminal includes a first display screen, and the target icon, i.e., the icon targeted by the first input, may be any icon in the first display screen.
Optionally, when the first input is a touch operation such as a single click, a double click, or a long press, the first detection unit 301 may detect the first input through the touch screen.
Optionally, the first input may be a somatosensory operation (including a gesture operation), such as a mid-air hand gesture selecting the target. The terminal may detect the first input through an infrared ranging sensor or the camera.
Optionally, the first input may be a voice input, and the terminal may detect the first input through a voice recognition module.
The first display unit 302 is configured to display information representing the at least one external display screen in at least one display area, respectively. Specifically:
the terminal includes a first display screen and at least one external display screen, and the display area may include at least one of a two-dimensional display area on the first display screen or a three-dimensional space display area.
In some alternative embodiments, when the external display screen corresponding to a display area reaches its display threshold, the information of that external display screen is removed from the display area. The display threshold of an external display screen is related to its display capability, i.e., how many objects, and of what kind, the external display screen can display simultaneously. These examples are merely illustrative of the application and should not be construed as limiting.
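The display-threshold check described above can be sketched as follows; the capacity values and the `threshold_fn` callback are assumptions made purely for illustration:

```python
# Sketch: a screen's entry is removed from the display area once the
# number of objects it shows reaches its display threshold (its display
# capability). The threshold of 2 used below is an assumed example.

def visible_screen_entries(screens, threshold_fn):
    """Return display-area info only for screens below their threshold.

    screens: name -> list of objects currently displayed on that screen.
    threshold_fn: name -> maximum number of simultaneously shown objects.
    """
    entries = {}
    for name, objects in screens.items():
        if len(objects) < threshold_fn(name):
            entries[name] = f"{name} ({len(objects)} shown)"
    return entries

screens = {"TV": ["video"], "Projector": ["doc", "slides"]}
# Assume every screen can display at most two objects simultaneously.
entries = visible_screen_entries(screens, lambda name: 2)
```

Here the Projector has reached its assumed capacity, so its information no longer appears among the display-area entries.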
In some alternative embodiments, information for external display screens that display the object represented by the target icon and for those that do not can be shown in different display areas. Information representing a first external display screen is displayed at a first position and information representing a second external display screen is displayed at a second position, where the object represented by the target icon is displayed in the first external display screen, is not displayed in the second external display screen, and the first position differs from the second position. The object represented by an icon may be a file, a program, a web page, a command, or the like.
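The first-position/second-position rule can be sketched as below; the screen-coordinate values are illustrative assumptions, not values from the disclosure:

```python
# Sketch: screens already showing the target icon's object are listed at
# a first position, the remaining screens at a second, different
# position. The coordinates are assumed for illustration only.

FIRST_POS, SECOND_POS = (0, 0), (0, 120)

def place_screen_info(screens, target_obj):
    """screens: name -> set of objects displayed on that screen."""
    placed = {}
    for name, objects in screens.items():
        placed[name] = FIRST_POS if target_obj in objects else SECOND_POS
    return placed

placed = place_screen_info({"TV": {"photo"}, "Monitor": set()}, "photo")
```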
In some alternative embodiments, information representing the first external display screen and information representing the second external display screen may also be displayed distinguishably. Here, distinguishable display means that the information is rendered with different display attributes, such as different fonts or colors.
In some alternative embodiments, the at least one display area may also be distributed within an annular display area.
In some alternative embodiments, a plurality of display areas may also be distributed among a plurality of nested annular display areas.
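One way to realize the annular and nested-annular layouts is to place the display areas at equal angles on one or more rings; the geometry below is an assumption made for illustration only:

```python
# Sketch: distribute n display areas evenly around a ring, and nest
# several rings at increasing radii. Centre and radii are illustrative.
import math

def ring_positions(n, radius, cx=0.0, cy=0.0):
    """Place n display areas at equal angles on a ring of given radius."""
    positions = []
    for k in range(n):
        theta = 2 * math.pi * k / n
        positions.append((cx + radius * math.cos(theta),
                          cy + radius * math.sin(theta)))
    return positions

def nested_ring_positions(counts, radii):
    """One position list per nested annular display area."""
    return [ring_positions(n, r) for n, r in zip(counts, radii)]

pos = ring_positions(4, 100.0)
rings = nested_ring_positions([4, 8], [100.0, 200.0])
```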
In some alternative embodiments, the terminal may record the duration of the first input in real time and, according to that duration, highlight in real time the information of the external display screen in the display area corresponding to the elapsed time. The information of an external display screen may include its name, size, color, and brand.
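The duration-based highlighting can be sketched as below; the one-second-per-area mapping and the cycling behavior are assumed examples, not specified by the disclosure:

```python
# Sketch: as the first input is held, the display area corresponding to
# the elapsed time is highlighted in turn, cycling through all areas.
# The one-second step is an assumption for illustration.

def area_for_duration(duration_s, num_areas, seconds_per_area=1.0):
    """Map the hold duration of the first input to a display-area index."""
    step = int(duration_s // seconds_per_area)
    return step % num_areas      # cycle through the areas

highlighted = area_for_duration(2.5, num_areas=3)   # third area after 2.5 s
```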
In some alternative embodiments, the display area may also be a three-dimensional space display area, and the first display unit 302 may display information representing the at least one external display screen in at least one three-dimensional space display area, respectively. In particular, the at least one three-dimensional space display area may be generated by the terminal above itself via 3D projection.
In some alternative embodiments, the first display unit 302 may, according to the type of the object represented by the target icon, display distinguishably the information of external display screens matched with that object; the information of an external display screen may include its size, color, and brand.
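Matching screens to the object's type can be sketched as below; the per-screen capability sets are invented for illustration and do not come from the disclosure:

```python
# Sketch: screens whose capabilities cover the object's type are
# "matched" and would be shown distinguishably (here: returned first).
# The capability sets are illustrative assumptions.

SCREEN_CAPABILITIES = {
    "TV":        {"video", "image"},
    "E-reader":  {"document"},
    "Projector": {"image", "document", "video"},
}

def matching_screens(object_type, capabilities=SCREEN_CAPABILITIES):
    """Split screens into those matching the object's type and the rest."""
    matched = sorted(s for s, caps in capabilities.items()
                     if object_type in caps)
    others = sorted(s for s in capabilities if s not in matched)
    return matched, others

matched, others = matching_screens("document")
```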
The second detection unit 303 is configured to detect a second input (such as a drag, slide, move, gesture, or release operation) for the target icon. Specifically:
the second input is further described in connection with the embodiments of fig. 2A-2H. In particular, the second input may be at least one of: a touch operation (e.g., a tap or slide operation), a somatosensory operation (including a gesture operation, which may be a selection gesture, a slide gesture, etc.), a voice input operation, an image input operation, and the like. The second input may be used to select the display area through which the object represented by the target icon is displayed on the corresponding external display screen.
In the embodiments corresponding to fig. 2A-2C, the second input may be a drag operation for the target icon.
In the embodiments corresponding to fig. 2D-2E, the second input may be a sliding operation for the target icon.
In the embodiment corresponding to fig. 2F, the second input may be a release operation for the target icon.
In the embodiments corresponding to fig. 2G-2H, the second input may be a somatosensory operation (including a gesture operation) for the target icon.
The second display unit 304 is configured to display, on the external display screen corresponding to the display area targeted by the second input, the object represented by the target icon. Specifically:
when the object represented by a first icon is already displayed in the external display screen corresponding to the display area targeted by the second input, a dialog box for user selection pops up in the display area, asking whether the object represented by the second icon should still be displayed on that external display screen. If the user confirms, the object represented by the first icon is moved to an external display screen that is not displaying any icon-represented object; if not, the object represented by the second icon is instead displayed on an external display screen that is not displaying any icon-represented object.
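The conflict-resolution rule just described can be sketched as below; the function and screen names are hypothetical, and the sketch assumes at least one free external screen exists:

```python
# Sketch: if the chosen screen already shows a first icon's object, the
# user's answer decides which object stays there; the other object moves
# to a screen not displaying any icon-represented object. Names are
# illustrative assumptions.

def resolve_conflict(screens, chosen, new_obj, keep_new):
    """screens: name -> displayed object or None; chosen: selected screen.

    keep_new=True  -> user answered "yes": new object stays on the chosen
                      screen, the old object moves to a free screen.
    keep_new=False -> user answered "no": new object goes to a free screen.
    Assumes at least one free screen exists when there is a conflict.
    """
    old_obj = screens[chosen]
    if old_obj is None:                      # no conflict at all
        screens[chosen] = new_obj
        return screens
    free = next(n for n, o in screens.items() if o is None)
    if keep_new:
        screens[chosen] = new_obj
        screens[free] = old_obj              # first icon's object relocated
    else:
        screens[free] = new_obj              # second icon's object relocated
    return screens

state = resolve_conflict({"A": "photo", "B": None}, "A", "video", keep_new=True)
```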
It can be appreciated that, regarding the specific implementation of the functional blocks included in the terminal 300 of fig. 3, reference may be made to the foregoing embodiments, and a detailed description is omitted here.
Fig. 4 is a schematic block diagram of a terminal provided by the present application. In the embodiment of the present application, the terminal may be any of various terminal devices such as a mobile phone, a tablet computer, a personal digital assistant (Personal Digital Assistant, PDA), a mobile Internet device (Mobile Internet Device, MID), or an intelligent wearable device (such as a smart watch or a smart band); the embodiment of the present application is not limited in this respect. As shown in fig. 4, the terminal 400 may include: a baseband chip 410, a memory 415 (one or more computer-readable storage media), a radio frequency (RF) module 416, and a peripheral system 417. These components may communicate over one or more communication buses 414.
The peripheral system 417 is mainly used to implement interaction between the terminal 400 and the user/external environment, and mainly includes the input and output devices of the terminal 400. In a particular implementation, the peripheral system 417 may include: a touch screen controller 418, a camera controller 419, an audio controller 420, and a sensor management module 421. Each controller may be coupled to a respective peripheral device (e.g., touch screen 423, camera 424, audio circuitry 425, and sensor 426). In some embodiments, the touch screen 423 may be configured with a self-capacitive floating touch panel or with an infrared floating touch panel. In some embodiments, the camera 424 may be a 3D camera. It should be noted that the peripheral system 417 may also include other I/O peripherals.
The baseband chip 410 may be integrated to include: one or more processors 411, a clock module 412, and a power management module 413. The clock module 412 integrated in the baseband chip 410 is mainly used for generating clocks required for data transmission and timing control for the processor 411. The power management module 413 integrated in the baseband chip 410 is mainly used for providing stable and high-precision voltage to the processor 411, the radio frequency module 416 and the peripheral system.
The radio frequency (RF) module 416 is used to receive and transmit radio frequency signals, and mainly integrates the receiver and transmitter of the terminal 400. The RF module 416 communicates with communication networks and other communication devices via radio frequency signals. In a particular implementation, the RF module 416 may include, but is not limited to: an antenna system, an RF transceiver, one or more amplifiers, a tuner, one or more oscillators, a digital signal processor, a CODEC chip, a SIM card, a storage medium, and so forth. In some embodiments, the RF module 416 may be implemented on a separate chip.
The memory 415 is coupled to the processor 411 and is used for storing various software programs and/or sets of instructions. In a particular implementation, the memory 415 may include high-speed random access memory, and may also include non-volatile memory, such as one or more magnetic disk storage devices, flash memory devices, or other non-volatile solid-state storage devices. The memory 415 may store an operating system (hereinafter referred to as the system), such as ANDROID, IOS, or WINDOWS, or an embedded operating system such as LINUX. The memory 415 may also store network communication programs that can be used to communicate with one or more additional devices, one or more terminal devices, and one or more network devices. The memory 415 may also store a user interface program that displays the content of an application program through a graphical interface and receives user control operations on the application program via input controls such as menus, dialog boxes, and buttons.
The memory 415 may also store one or more application programs. As shown in fig. 4, these applications may include: social applications (e.g., Facebook), image management applications (e.g., Album), map applications (e.g., Google Maps), browsers (e.g., Safari, Google Chrome), and so forth.
It should be understood that terminal 400 is merely one example provided for embodiments of the present application, and that terminal 400 may have more or fewer components than shown, may combine two or more components, or may have different configuration implementations of the components.
It can be appreciated that, regarding the specific implementation of the functional blocks included in the terminal 400 of fig. 4, reference may be made to the foregoing embodiments, and a detailed description is omitted here.
In another embodiment of the present application, there is provided a computer-readable storage medium storing a computer program which, when executed by a processor, implements the information display method described in the foregoing embodiments.
the computer readable storage medium may be an internal storage unit of the terminal according to any of the foregoing embodiments, for example, a hard disk or a memory of the terminal. The computer readable storage medium may also be an external storage device of the terminal, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) Card, a Flash memory Card (Flash Card) or the like, which are provided on the terminal. Further, the computer-readable storage medium may also include both an internal storage unit and an external storage device of the terminal. The computer-readable storage medium is used to store the computer program and other programs and data required by the terminal. The computer-readable storage medium may also be used to temporarily store data that has been output or is to be output.
Those of ordinary skill in the art will appreciate that the various illustrative units and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, computer software, or a combination of the two. To clearly illustrate this interchangeability of hardware and software, the foregoing description has been presented generally in terms of functionality.
It will be clearly understood by those skilled in the art that, for convenience and brevity of description, the specific working procedures of the terminal and the unit described above may refer to the corresponding procedures in the foregoing method embodiments, which are not repeated herein.
In the several embodiments provided by the present application, it should be understood that the disclosed terminal and method may be implemented in other manners; the compositions and steps of each example have been described above. Whether such functionality is implemented as hardware or software depends upon the particular application and the design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
The above-described terminal embodiments are merely illustrative, for example, the division of the units is merely a logical function division, and there may be other manners of division in actual implementation, for example, multiple units or components may be combined or integrated into another system, or some features may be omitted, or not performed. In addition, the coupling or direct coupling or communication connection shown or discussed with each other may be an indirect coupling or communication connection via some interfaces, terminals or units, or may be an electrical, mechanical or other form of connection.
The units described as separate units may or may not be physically separate, and units shown as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the embodiment of the present application.
In addition, each functional unit in the embodiments of the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in software functional units.
The integrated units, if implemented in the form of software functional units and sold or used as stand-alone products, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present application is essentially or a part contributing to the prior art, or all or part of the technical solution may be embodied in the form of a software product stored in a storage medium, comprising several instructions for causing a computer device (which may be a personal computer, a server, or a network device, etc.) to perform all or part of the steps of the method according to the embodiments of the present application. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a random access Memory (Random Access Memory, RAM), a magnetic disk, or an optical disk, or other various media capable of storing program codes.
While the application has been described with reference to certain preferred embodiments, it will be understood by those skilled in the art that various changes may be made and equivalents may be substituted without departing from the scope of the application. Therefore, the protection scope of the application is subject to the protection scope of the claims.

Claims (8)

1. An information display method applied to a terminal, wherein the terminal comprises a first display screen, and at least one external display screen is further connected to the terminal, and the method comprises the following steps:
when a first input aiming at a target icon is detected, the target icon is displayed in the first display screen;
displaying information of at least one external display screen in at least one display area, displaying information representing a first external display screen distinguishably from information representing a second external display screen, and displaying distinguishably, according to the type of the object represented by the target icon, information of an external display screen matched with the object represented by the target icon; the display area comprises a three-dimensional space display area, wherein the first external display screen displays the object represented by the target icon and the second external display screen does not display the object represented by the target icon;
and when a second input aiming at the target icon is detected, displaying an object characterized by the target icon in the external display screen.
2. The method according to claim 1, wherein the method further comprises:
and removing the display area when the external display screen corresponding to the display area reaches the display threshold value of the external display screen.
3. The method according to claim 1, wherein the method further comprises:
information representing a first external display screen in which the object represented by the icon is displayed and/or information representing a second external display screen in which the object represented by the icon is not displayed are displayed in a first position.
4. A method according to any one of claims 1 to 3, wherein the method further comprises:
recording the duration of the first input, and distinguishing and displaying information of an external display screen in a display area corresponding to the duration according to the duration; and/or the number of the groups of groups,
when a second input for the target icon is detected, determining information of an external display screen in a display area corresponding to the total duration according to the total duration of the first input.
5. The method according to claim 1, wherein the method further comprises:
the at least one display area is distributed in an annular display area.
6. A method according to any one of claims 1 to 3, wherein
The second input comprises a sliding operation, and the display area pointed by the second input is the display area pointed by the sliding operation.
7. A method according to any one of claims 1 to 3, wherein
the display area comprises a three-dimensional space display area and a two-dimensional display area on the first display screen, or the two-dimensional display area on the first display screen.
8. A terminal, the terminal comprising: a memory, and a processor coupled to the memory, wherein the processor is to perform the method of any of claims 1-7.
CN201780095813.5A 2017-10-10 2017-10-10 Information display method and terminal Active CN111201507B (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2017/105490 WO2019071419A1 (en) 2017-10-10 2017-10-10 Multiscreen-based information display method, and terminal

Publications (2)

Publication Number Publication Date
CN111201507A CN111201507A (en) 2020-05-26
CN111201507B true CN111201507B (en) 2023-11-03

Family

ID=66100200

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201780095813.5A Active CN111201507B (en) 2017-10-10 2017-10-10 Information display method and terminal

Country Status (2)

Country Link
CN (1) CN111201507B (en)
WO (1) WO2019071419A1 (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102396291A (en) * 2009-04-14 2012-03-28 高通股份有限公司 System and method for mobile device display power savings
CN105302285A (en) * 2014-08-01 2016-02-03 福州瑞芯微电子股份有限公司 Multi-screen display method, equipment and system
CN106325650A (en) * 2015-06-19 2017-01-11 深圳创锐思科技有限公司 3D dynamic display method based on human-computer interaction and mobile terminal

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR100474724B1 (en) * 2001-08-04 2005-03-08 삼성전자주식회사 Apparatus having touch screen and external display device using method therefor
KR101663474B1 (en) * 2009-10-13 2016-10-10 삼성전자주식회사 A mobile terminal, method for displaying background in a mobile terminal and storage medium
KR102003742B1 (en) * 2012-03-26 2019-07-25 삼성전자주식회사 Method and apparatus for managing screens in a portable terminal
CN104199552B (en) * 2014-09-11 2017-10-27 福州瑞芯微电子股份有限公司 Multi-display method, equipment and system
CN104915096A (en) * 2015-05-29 2015-09-16 努比亚技术有限公司 Application interface displaying method and device
CN105975142A (en) * 2015-10-26 2016-09-28 乐视移动智能信息技术(北京)有限公司 Method and device for icon moving
CN106648329A (en) * 2016-12-30 2017-05-10 维沃移动通信有限公司 Application icon display method and mobile terminal


Also Published As

Publication number Publication date
CN111201507A (en) 2020-05-26
WO2019071419A1 (en) 2019-04-18

Similar Documents

Publication Publication Date Title
US10915225B2 (en) User terminal apparatus and method of controlling the same
US9952681B2 (en) Method and device for switching tasks using fingerprint information
EP3901756B1 (en) Electronic device including touch sensitive display and method for operating the same
US10705682B2 (en) Sectional user interface for controlling a mobile terminal
KR102311221B1 (en) operating method and electronic device for object
US10705702B2 (en) Information processing device, information processing method, and computer program
US11188192B2 (en) Information processing device, information processing method, and computer program for side menus
JP6055961B2 (en) Text selection and input
CN116055610B (en) Method for displaying graphical user interface and mobile terminal
US9910584B2 (en) Method for manipulating folders and apparatus thereof
US20180018067A1 (en) Electronic device having touchscreen and input processing method thereof
KR20140126140A (en) Mobile apparatus providing with changed-shortcut icon responding to status of mobile apparatus and control method thereof
KR101251761B1 (en) Method for Data Transferring Between Applications and Terminal Apparatus Using the Method
CN107479818B (en) Information interaction method and mobile terminal
US20150063785A1 (en) Method of overlappingly displaying visual object on video, storage medium, and electronic device
US9792183B2 (en) Method, apparatus, and recording medium for interworking with external terminal
KR20150039552A (en) Display manipulating method of electronic apparatus and electronic apparatus thereof
US10908868B2 (en) Data processing method and mobile device
KR20170042953A (en) Display apparatus and method of controling thereof
US10319338B2 (en) Electronic device and method of extracting color in electronic device
KR102203131B1 (en) Method for management file and electronic device thereof
CN111201507B (en) Information display method and terminal
CN111226190A (en) List switching method and terminal
CN111213354A (en) Screen brightness adjusting method and terminal
US20140365936A1 (en) Apparatus and method for displaying content in mobile terminal

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant