WO2020001604A1 - Display method and terminal device - Google Patents

Display method and terminal device

Info

Publication number
WO2020001604A1
Authority
WO
WIPO (PCT)
Prior art keywords
screen
target
sliding input
sub
program
Prior art date
Application number
PCT/CN2019/093629
Other languages
English (en)
Chinese (zh)
Inventor
杨其豪 (Yang Qihao)
Original Assignee
维沃移动通信有限公司 (Vivo Mobile Communication Co., Ltd.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 维沃移动通信有限公司 (Vivo Mobile Communication Co., Ltd.)
Publication of WO2020001604A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/14 Digital output to display device; Cooperation and interconnection of the display device with other functional units
    • G06F3/1423 Digital output to display device; Cooperation and interconnection of the display device with other functional units controlling a plurality of local displays, e.g. CRT and flat panel display

Definitions

  • Embodiments of the present disclosure relate to the field of communications technologies, and in particular, to a display method and a terminal device.
  • Terminal devices with dual displays, such as double-sided mobile phones, have emerged.
  • Double-sided screen phones have two displays, and users can display the same or different content on the two displays depending on usage.
  • In the related art, the way to control the display of predetermined content on the two screens of a double-sided mobile phone is generally to flip the phone, which triggers the content on the target screen to be displayed on the other screen. This operation mode is complicated, prone to false triggering, and offers poor operation accuracy.
  • the present disclosure provides a display method and a terminal device, so as to solve the problem that the terminal device with a double-sided screen in the related art is relatively complicated to operate when switching display content.
  • an embodiment of the present disclosure provides a display method applied to a terminal device, where the terminal device includes a first screen, a second screen, and a control area, and where the first screen, the second screen, and the control area are respectively located on different surfaces of the terminal device; the method includes:
  • receiving a sliding input of a user on the control area; and in response to the sliding input, updating display content of a target screen among the first screen and the second screen.
  • an embodiment of the present disclosure further provides a terminal device, where the terminal device includes a first screen, a second screen, and a control area, wherein the first screen, the second screen, and the control area are respectively located on different surfaces of the terminal device; the terminal device further includes:
  • a receiving module configured to receive a sliding input of a user on the control area
  • a response module configured to update the display content of the target screen among the first screen and the second screen in response to the sliding input.
  • an embodiment of the present disclosure further provides a terminal device including a processor, a memory, and a program stored in the memory and executable on the processor; when the program is executed by the processor, the steps of the display method described above are implemented.
  • an embodiment of the present disclosure further provides a computer-readable storage medium storing a program; when the program is executed by a processor, the steps of the display method described above are implemented.
  • With the embodiments of the present disclosure, the dual-screen display function of the terminal device can be started quickly. Separate or synchronized quick operation of the two screens can also be achieved; the operation is simple, trigger precision is high, and false triggering can be avoided.
  • FIG. 1 shows a flowchart of a display method according to an embodiment of the present disclosure
  • FIG. 2 is a schematic structural diagram of a terminal device according to an embodiment of the present disclosure
  • FIG. 3 shows a first schematic diagram of sliding input according to an embodiment of the present disclosure
  • FIG. 4 is a second schematic diagram of sliding input according to an embodiment of the present disclosure.
  • FIG. 5 shows the third schematic diagram of sliding input according to an embodiment of the present disclosure
  • FIG. 6 is a schematic diagram of a fifth sub-sliding input according to an embodiment of the present disclosure.
  • FIG. 7 is a schematic diagram of a sixth sub-sliding input according to an embodiment of the present disclosure.
  • FIG. 8 is a schematic diagram of a touch area according to an embodiment of the present disclosure.
  • FIG. 9 shows a fourth schematic diagram of sliding input according to an embodiment of the present disclosure.
  • FIG. 10 shows a structural block diagram of a terminal device according to an embodiment of the present disclosure
  • FIG. 11 is a schematic diagram of a hardware structure of a terminal device according to an embodiment of the present disclosure.
  • an embodiment of the present disclosure provides an example of a display method.
  • the display method is applied to a terminal device, and the terminal device includes a first screen, a second screen, and a control area.
  • the second screen and the control area are respectively located on different surfaces of the terminal device.
  • an embodiment of the present disclosure provides an example of a terminal device.
  • the terminal device includes at least a first screen 101 located on a first surface of the terminal device, a second screen 102 located on a second surface of the terminal device, and a control area 103 on a third surface between the first surface and the second surface.
  • the first surface and the second surface are two surfaces opposite to each other on the terminal device.
  • Although FIG. 2 shows the control area 103 set on the left side of the first screen 101, the control area 103 can also be set on the upper side, lower side, or right side of the first screen 101, or on any surface extending to the second screen 102.
  • the terminal device includes a display screen extending from a first surface to a second surface opposite to the first surface, wherein the portion of the display screen on the first surface is the first screen, the portion on the second surface is the second screen, and the portion between the first surface and the second surface is the control area.
  • the control area may not have a display function.
  • the display method specifically includes steps 11 and 12.
  • Step 11: Receive a sliding input from the user on the control area.
  • the sliding input may be a sliding input triggered by two fingers on the control area at the same time, sliding away from each other along the vertical direction (the length direction of the terminal device).
  • An example of a sliding input is provided in FIG. 3: the sliding input may be a two-finger sliding input triggered by two fingers at the same time at the middle position of the control area 103, sliding away from each other in the vertical direction of the control area (as shown by the arrows in FIG. 3).
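Such two-finger gestures could be classified from raw touch tracks roughly as follows. This is an illustrative sketch in Python; the function name and the track format are assumptions, not part of the patent:

```python
def classify_two_finger(track_a, track_b, axis=1):
    """Classify a two-finger gesture on the control area.

    Each track is a list of (x, y) points sampled while the finger is
    down; axis=1 selects the control area's long (vertical) axis.
    Returns 'diverge' (fingers slide apart), 'converge' (fingers slide
    together), or None for anything else.
    """
    da = track_a[-1][axis] - track_a[0][axis]
    db = track_b[-1][axis] - track_b[0][axis]
    if da == 0 or db == 0 or (da > 0) == (db > 0):
        return None  # no movement, or both fingers moved the same way
    gap_start = abs(track_a[0][axis] - track_b[0][axis])
    gap_end = abs(track_a[-1][axis] - track_b[-1][axis])
    return "diverge" if gap_end > gap_start else "converge"
```

Under this sketch, the gesture of FIG. 3 (two fingers starting near the middle and sliding apart vertically) would classify as a diverging slide.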
  • Step 12: In response to the sliding input, update the display content of the target screen among the first screen and the second screen.
  • In response to the sliding input, the first screen and the second screen are determined as the target screen, the display function of the target screen is activated (that is, a bright-screen operation is performed), and predetermined content is displayed on the first screen and the second screen respectively. When the display function of the first screen is already activated, in response to the sliding input, the second screen is determined as the target screen and its display function is activated; or, when the display function of the second screen is already activated, in response to the sliding input, the first screen is determined as the target screen and its display function is activated.
  • The predetermined content includes, but is not limited to: an unlock interface displayed after the terminal device performs the bright-screen operation; a history interface displayed after the terminal device performs the bright-screen operation (for example, the history interface may be the interface displayed before the terminal device last turned off the screen); or another preset display interface.
  • FIG. 3 shows an example in which the display functions of the first screen 101 and the second screen 102 are activated: the first screen 101 displays the desktop, and the second screen 102 displays the playback interface of the audio player.
  • In some embodiments, a target interface of a target application may be displayed on the target screen according to a received trigger operation for opening the target application, so as to lock the target application to the target screen.
  • the target screen is one of the first screen and the second screen, and may be, for example, the first screen 101 in FIG. 2.
  • the sliding input includes a first sub-sliding input in a first direction and a second sub-sliding input in a second direction, wherein the first direction is opposite to the second direction.
  • the first sub-sliding input and the second sub-sliding input may be two sliding inputs triggered simultaneously by two fingers on the control area, sliding away from each other in the lateral direction (the width direction of the terminal device).
  • An example of a sliding input is shown in FIG. 4.
  • the sliding input may be a two-finger sliding input triggered by two fingers at the same time at the middle position of the control area 103, sliding away from each other in the lateral direction of the control area (as indicated by the arrows in FIG. 4).
  • the above step 12 specifically includes: in response to the sliding input, determining the first screen and the second screen as the target screen, and exchanging display contents of the first screen and the second screen.
  • For example, before the sliding input is received, a first object is displayed on the first screen (as shown in FIG. 3, the first object displayed on the first screen 101 is the desktop) and a second object is displayed on the second screen (as shown in FIG. 3, the second object displayed on the second screen 102 is the playback interface of the audio player). In response to the first sub-sliding input in the first direction and the second sub-sliding input in the second direction, the second object is displayed on the first screen (as shown in FIG. 4, the playback interface of the audio player is displayed on the first screen 101) and the first object is displayed on the second screen (as shown in FIG. 4, the desktop is displayed on the second screen 102).
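The exchange described above amounts to a simple swap of the two screens' contents. A minimal sketch; the class and method names are illustrative, not from the patent:

```python
class DualScreenDevice:
    """Minimal model of the two screens and the exchange gesture."""

    def __init__(self, first, second):
        self.first = first    # content on the first screen
        self.second = second  # content on the second screen

    def on_diverging_lateral_slide(self):
        # Two fingers sliding apart across the control area's width:
        # the two screens trade their display contents.
        self.first, self.second = self.second, self.first


device = DualScreenDevice(first="desktop", second="audio player")
device.on_diverging_lateral_slide()
print(device.first, "|", device.second)  # audio player | desktop
```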
  • When the terminal device includes three display screens, the following display-content exchange methods can be adopted:
  • Method 1: Exchange the content displayed on two of the display screens (the exchange method is as described above and is not repeated here), while the content displayed on the third display screen remains unchanged.
  • Method 2: Exchange the contents displayed on the three display screens, so that the content displayed on each screen after the exchange differs from the content displayed before the exchange.
  • For example, the first screen displays a first object, the second screen displays a second object, and the third screen displays a third object. In response to the first sub-sliding input in the first direction and the second sub-sliding input in the second direction, the second object is displayed on the first screen, the third object is displayed on the second screen, and the first object is displayed on the third screen.
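Method 2's three-screen exchange behaves like a cyclic rotation of the display contents. A sketch; the function name is illustrative:

```python
def rotate_contents(screens):
    """Method-2-style exchange: each screen takes the content of the
    next screen, and the last screen takes the first's, so every
    screen shows something different from before."""
    return screens[1:] + screens[:1]


before = ["first object", "second object", "third object"]
after = rotate_contents(before)
# the first screen now shows the second object, and so on
```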
  • That is, when the first screen displays the first object and the second screen displays the second object, the first sub-sliding input in the first direction and the second sub-sliding input in the second direction are received on the control area to trigger the exchange.
  • the sliding input includes a third sub-sliding input in a third direction and a fourth sub-sliding input in a fourth direction, wherein the third direction is opposite to the fourth direction.
  • the third sub-sliding input and the fourth sub-sliding input may be two sliding inputs that are triggered simultaneously on the control area with two fingers and slide toward each other in a lateral direction of the control area.
  • An example of a sliding input is shown in FIG. 5.
  • the third sub-sliding input in the third direction and the fourth sub-sliding input in the fourth direction may be a two-finger sliding input triggered simultaneously by two fingers at the two ends of the control area 103 in the lateral direction, sliding along the lateral direction toward the middle position of the control area (as shown by the arrows in FIG. 5).
  • the above step 12 specifically includes: in response to the sliding input, obtaining a first object currently displayed on the first screen and a second object currently displayed on the second screen; determining one of the first object and the second object as a target object; when the target object is the first object, determining the second screen as the target screen and updating the display content of the target screen to the target object; and when the target object is the second object, determining the first screen as the target screen and updating the display content of the target screen to the target object.
  • In other words, one of the first screen and the second screen is determined as the target screen. When the target screen is the second screen, the first object is displayed on the second screen while it remains displayed on the first screen; when the target screen is the first screen, the second object is displayed on the first screen while it remains displayed on the second screen.
  • In FIG. 5, an example is shown in which the display content of the target screen is updated to the target object: the first object displayed on the first screen 101 is the desktop, the second object displayed on the second screen 102 is the playback interface of the audio player, and in response to the sliding input, the first screen 101 is determined as the target screen and the playback interface of the audio player is displayed on the first screen 101 while it continues to be displayed on the second screen 102.
  • That is, the target object is displayed on the target screen.
  • In this way, when the target object is the first object, the second screen is determined as the target screen and its display content is updated to the target object; when the target object is the second object, the first screen is determined as the target screen and its display content is updated to the target object. This realizes simultaneous display on the two screens of the terminal device, thereby sharing the display content across both screens.
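This converging gesture effectively mirrors one screen's content onto both screens. A sketch; the function name and string values are illustrative:

```python
def mirror_to_both(first, second, target_object):
    """Converging slide: whichever object is chosen as the target
    object ends up shown on both screens, so the dual screens display
    the same content simultaneously."""
    if target_object == first:
        return first, first    # second screen is the target screen
    return second, second      # first screen is the target screen


# FIG. 5 example: desktop on the first screen, audio player on the
# second; the audio player is chosen as the target object.
print(mirror_to_both("desktop", "audio player", "audio player"))
```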
  • the sliding input includes a fifth sub-sliding input in a fifth direction or a sixth sub-sliding input in a sixth direction, wherein the fifth direction is opposite to the sixth direction.
  • the fifth sub-sliding input may be an operation of sliding from the first end to the second end along the vertical direction of the control area; the sixth sub-sliding input may be an operation of sliding from the second end to the first end along the vertical direction of the control area. The first end and the second end are opposite ends of the control area in the vertical direction.
  • An example of a fifth sub-sliding input is shown in FIG. 6.
  • the fifth sub-sliding input may be a single-finger sliding input triggered at the first end of the control area 103 and sliding down along the vertical direction to the second end (as shown by the arrow in FIG. 6).
  • An example of a sixth sub-sliding input is shown in FIG. 7. The sixth sub-sliding input may be a single-finger sliding input triggered at the second end of the control area 103 and sliding up along the vertical direction to the first end (as shown by the arrow in FIG. 7).
  • Here, the first end refers to the upper end of the control area in FIG. 6 and FIG. 7, and the second end refers to the lower end of the control area in FIG. 6 and FIG. 7.
  • the above step 12 specifically includes: when the sliding input is the fifth sub-sliding input, determining the first screen and the second screen as the target screen, and updating the display contents of the target screen to the first program and the second program, which are respectively arranged after the currently running program in the corresponding background-running-program lists.
  • When the sliding input is the sixth sub-sliding input, the first screen and the second screen are determined as the target screen, and the display contents are updated to the third program and the fourth program, which are respectively arranged before the currently running program in the corresponding background-running-program lists.
  • the background running program corresponding to the first screen and the background running program corresponding to the second screen are independently managed.
  • the background running programs corresponding to the first screen include: program A, program B, and program C; the background running programs corresponding to the second screen include: program D, program E, and program F. It should be noted that the number of background running programs respectively corresponding to the first screen and the second screen is not limited to three.
  • For example, if the content currently displayed on the first screen is program B (as shown in FIG. 3, the content displayed on the first screen 101 is the desktop), in response to the fifth sub-sliding input, the first screen is controlled to display program C, which follows program B in the display order (the first program; for example, the conversation interface of the instant messaging application displayed on the first screen 101 in FIG. 6). Correspondingly, if the content currently displayed on the second screen is program E (as shown in FIG. 3, the content displayed on the second screen 102 is the playback interface of the audio player), the second screen is controlled to display program F, which follows program E (the second program; the playback interface of the video player shown on the second screen 102 in FIG. 6).
  • Correspondingly, in response to the sixth sub-sliding input shown in FIG. 7, if the content displayed on the first screen is program B, the first screen is controlled to display program A, which precedes program B in the display order (the third program; as shown on the first screen 101 in FIG. 7, the shooting interface of the camera). Likewise, if the content displayed on the second screen is program E (as shown in FIG. 3, the content displayed on the second screen 102 is the playback interface of the audio player), in response to the sixth sub-sliding input shown in FIG. 7, the second screen is controlled to display program D, which precedes program E in the display order (the fourth program; as shown on the second screen 102 in FIG. 7, the display interface of a weather application).
  • In this way, by receiving the fifth sliding input on the control area, the first screen and the second screen are controlled to be updated to the first and second programs arranged after the currently running programs in their respective background-running-program lists, and by receiving the sixth sliding input, to the third and fourth programs arranged before them. This realizes synchronized, fast switching of the background-running programs on the first screen and the second screen, helps optimize operation time, and offers simple operation, high trigger accuracy, and avoidance of false triggering.
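The fifth/sixth sub-slides step both screens through their own background-running-program lists in lockstep. A sketch; the class name is illustrative, and wrap-around at the list ends is an assumption not stated in the patent:

```python
class ScreenCycler:
    """Steps one screen through its independent background-program list."""

    def __init__(self, programs, current):
        self.programs = programs
        self.index = programs.index(current)

    def step(self, delta):
        # delta = +1 for the fifth sub-slide (next program),
        # delta = -1 for the sixth sub-slide (previous program).
        self.index = (self.index + delta) % len(self.programs)
        return self.programs[self.index]


# Independently managed lists for each screen, as in the example above.
first = ScreenCycler(["A", "B", "C"], current="B")
second = ScreenCycler(["D", "E", "F"], current="E")

# A single downward slide (fifth sub-slide) advances both screens together.
print(first.step(+1), second.step(+1))  # C F
```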
  • the control area includes a first sub-control area and a second sub-control area.
  • the sliding input includes a seventh sub-sliding input in a seventh direction on the first sub-control area, an eighth sub-sliding input in an eighth direction on the first sub-control area, a ninth sub-sliding input in the seventh direction on the second sub-control area, or a tenth sub-sliding input in the eighth direction on the second sub-control area, wherein the seventh direction is opposite to the eighth direction.
  • In order to facilitate independent control of the content displayed on the first screen and the second screen, the control area may be divided into a first sub-control area corresponding to the first screen and a second sub-control area corresponding to the second screen. As shown in FIG. 8, the control area 103 is divided into a first sub-control area 1031 corresponding to the first screen and a second sub-control area 1032 corresponding to the second screen.
  • the seventh sub-sliding input may be a sliding input that slides from the first end to the second end of the first sub-control area 1031 along its vertical direction (as indicated by an arrow in FIG. 8); the eighth sub-sliding input may be a sliding input that slides from the second end to the first end of the first sub-control area 1031 along its vertical direction.
  • the above step 12 specifically includes: when the sliding input is the seventh sub-sliding input, determining the first screen as the target screen, and updating the display content of the target screen to the fifth program, which is located after the currently running program in the background-running programs corresponding to the target screen; when the sliding input is the eighth sub-sliding input, determining the first screen as the target screen, and updating the display content of the target screen to the sixth program, which is located before the currently running program in the background-running programs corresponding to the target screen.
  • the background running programs corresponding to the first screen include: Program A, Program B, and Program C. It should be noted that the number of applications running in the background is not limited to three.
  • For example, if the content currently displayed on the first screen is program B (as shown in FIG. 3, the content displayed on the first screen 101 is the desktop), in response to the seventh sub-sliding input, the first screen is controlled to display program C (the fifth program; for example, the conversation interface of the instant messaging application displayed on the first screen 101 in FIG. 8). Correspondingly, if the content displayed on the first screen is program B, in response to the eighth sub-sliding input, the first screen is controlled to display program A, which precedes program B in the display order (the sixth program).
  • the ninth sub-sliding input may be a sliding input that slides from the first end to the second end of the second sub-control area 1032 along its vertical direction; the tenth sub-sliding input may be a sliding input that slides from the second end to the first end of the second sub-control area 1032 along its vertical direction.
  • the above step 12 further specifically includes: when the sliding input is the ninth sub-sliding input, determining the second screen as the target screen, and updating the display content of the target screen to the seventh program, which is located after the currently running program in the background-running programs corresponding to the target screen; when the sliding input is the tenth sub-sliding input, determining the second screen as the target screen, and updating the display content of the target screen to the eighth program, which is located before the currently running program.
  • the background-running programs corresponding to the second screen include a program D, a program E, and a program F. It should be noted that the number of background-running programs is not limited to three, and the background-running programs corresponding to the first screen and those corresponding to the second screen are managed separately.
  • For example, if the content currently displayed on the second screen is program E (as shown in FIG. 3, the content displayed on the second screen 102 is the playback interface of the audio player), in response to the ninth sub-sliding input, the second screen is controlled to display program F, which follows program E in the display order (the seventh program; as shown on the second screen 102 in FIG. 9, the display interface of the weather application).
  • Correspondingly, in response to the tenth sub-sliding input, the second screen is controlled to display program D, which precedes program E in the display order (the eighth program).
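Splitting the control area into two sub-areas lets each slide address one screen only. A dispatch sketch building on the same per-screen lists; the names are illustrative, and wrap-around is an assumption:

```python
def make_cycler(programs, current):
    """Closure-based cycler over one screen's background-program list."""
    state = {"i": programs.index(current)}

    def step(delta):
        state["i"] = (state["i"] + delta) % len(programs)
        return programs[state["i"]]

    return step


cyclers = {
    "first": make_cycler(["A", "B", "C"], current="B"),
    "second": make_cycler(["D", "E", "F"], current="E"),
}


def dispatch(sub_area, direction):
    # 'down' = seventh/ninth sub-slide -> next program;
    # 'up'   = eighth/tenth sub-slide  -> previous program.
    return cyclers[sub_area](+1 if direction == "down" else -1)


print(dispatch("second", "down"))  # F  (second screen only)
print(dispatch("first", "up"))     # A  (first screen only)
```

Because each sub-area routes to its own cycler, sliding in one sub-area never disturbs the other screen's content.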
  • In some embodiments, the sliding input may be two sliding inputs triggered simultaneously by two fingers on the control area, sliding toward each other along the vertical direction of the control area. In FIG. 9, an example of such a sliding input is given: the sliding input may be a two-finger sliding input triggered simultaneously by two fingers at two positions along the vertical direction of the control area 103, sliding toward each other along the vertical direction to the middle position of the control area (as shown by the arrow in FIG. 9).
  • the above-mentioned step 12 specifically includes: in response to the sliding input, determining one of the first screen and the second screen as the target screen, and controlling the target screen to be in a screen-off state.
  • In addition, in response to a predetermined operation on the power button, the display screen other than the target screen can be controlled to be in an inactive state; or, in response to a predetermined operation on the display screen other than the target screen, the target screen can be controlled to turn off.
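The screen-off behavior can be modeled as flipping one screen's power state while leaving the other untouched. A sketch; the function name and state encoding are illustrative:

```python
def on_converging_vertical_slide(screen_states, target):
    """Two fingers sliding toward each other along the control area's
    length: the target screen goes dark, the other stays as it was."""
    return {name: ("off" if name == target else state)
            for name, state in screen_states.items()}


states = {"first": "on", "second": "on"}
print(on_converging_vertical_slide(states, target="second"))
# {'first': 'on', 'second': 'off'}
```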
  • the sliding input is an eleventh sub-sliding input that slides in a ninth direction to a preset position.
  • the eleventh sub-sliding input may be a sliding input in which a finger slides from the first end of the control area to its middle position along the vertical direction and then remains triggered at the middle position for a predetermined duration.
  • the above step 12 specifically includes: determining one of the first screen and the second screen as the target screen; displaying at least one object on the target screen; receiving a user's selection input on a target object; and, in response to the selection input, updating the display content of the target screen to the display content corresponding to the target object, wherein the at least one object includes the target object.
  • the objects include, but are not limited to, programs running in the background, application icons in the target folder, image information, text information, and the like.
  • The selection input includes, but is not limited to, the following gesture inputs: a click input triggered on the target object, a sliding input (such as a slide-down input or a slide-up input) triggered on the target object, and the like.
  • When the target object is a program M running in the background, a target interface of the program M is displayed on the target screen in response to the gesture input, where the target interface is the interface displayed the last time program M was shown.
  • When the target object is a program N in the target folder, the program N is started in response to the gesture input, and the target interface after program N starts is displayed on the target screen.
  • When the target object is image information or text information, the image information or text information is displayed on the target screen.
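The three selection outcomes described above (resume a background program, cold-start a folder program, or show media directly) could be dispatched as follows. A sketch; the object encoding and result strings are illustrative, not from the patent:

```python
def open_target_object(kind, payload):
    """Update the target screen based on what kind of object was selected."""
    if kind == "background_program":
        # Resume the interface the program last displayed.
        return f"last interface of {payload}"
    if kind == "folder_program":
        # Start the program and show its post-launch interface.
        return f"launch interface of {payload}"
    # Image or text information is displayed as-is.
    return payload


print(open_target_object("background_program", "program M"))
print(open_target_object("image", "photo"))
```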
  • an embodiment of the present disclosure further provides a terminal device, where the terminal device 1000 includes a first screen, a second screen, and a control area, wherein the first screen, the second screen, and the control area are respectively located on different surfaces of the terminal device.
  • the terminal device 1000 further includes a receiving module 1010 and a response module 1020.
  • the receiving module 1010 is configured to receive a sliding input of a user on the control area.
  • the response module 1020 is configured to update the display content of the target screen in the first screen and the second screen in response to the sliding input.
  • the sliding input includes a first sub-sliding input in a first direction and a second sub-sliding input in a second direction, wherein the first direction is opposite to the second direction.
  • the response module 1020 includes an exchange unit.
  • The exchange unit is configured to determine the first screen and the second screen as the target screen, and to exchange the display contents of the first screen and the second screen.
  • the sliding input includes a third sub-sliding input in a third direction and a fourth sub-sliding input in a fourth direction, and the third direction is opposite to the fourth direction.
  • the response module 1020 includes an acquisition unit, a first determination unit, a first update unit, and a second update unit.
  • the obtaining unit is configured to obtain a first object currently displayed on the first screen and a second object currently displayed on the second screen.
  • the first determining unit is configured to determine one of the first object and the second object as a target object.
  • a first updating unit configured to determine that the second screen is the target screen when the target object is the first object, and update the display content of the target screen to the target object.
  • a second updating unit configured to determine that the first screen is the target screen when the target object is the second object, and update the display content of the target screen to the target object.
  • the sliding input includes a fifth sub-sliding input in a fifth direction or a sixth sub-sliding input in a sixth direction, and the fifth direction is opposite to the sixth direction.
  • the response module 1020 includes a third update unit and a fourth update unit.
  • the third updating unit is configured to determine both the first screen and the second screen as the target screen when the sliding input is the fifth sub-sliding input, and to update the display contents of the target screens respectively to the first program and the second program that are arranged after the currently running program in their respective background running program lists.
  • the fourth updating unit is configured to determine both the first screen and the second screen as the target screen when the sliding input is the sixth sub-sliding input, and to update the display contents of the target screens respectively to the third program and the fourth program that are arranged before the currently running program in their respective background running program lists.
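The third and fourth updating units step through each screen's background running program list relative to the currently running program. A hedged sketch follows, with wrap-around at the list ends as an added assumption (the text does not specify edge behavior):

```python
def step_program(program_list, current, offset):
    """Move within a background running program list: offset=+1 models
    the fifth sub-sliding input (program arranged after the current
    one), offset=-1 models the sixth (program arranged before it).
    Wrap-around at the ends is an illustrative assumption."""
    i = program_list.index(current)
    return program_list[(i + offset) % len(program_list)]
```

Applying `step_program` once per screen, each with its own list, yields the first/second (or third/fourth) programs described above.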
  • the control area includes a first sub-control area and a second sub-control area.
  • the sliding input includes a seventh sub-sliding input in a seventh direction on the first sub-control area, an eighth sub-sliding input in an eighth direction on the first sub-control area, a ninth sub-sliding input in the seventh direction on the second sub-control area, or a tenth sub-sliding input in the eighth direction on the second sub-control area, wherein the seventh direction is opposite to the eighth direction.
  • the response module 1020 includes a fifth update unit, a sixth update unit, a seventh update unit, and an eighth update unit.
  • the fifth updating unit is configured to determine the first screen as the target screen when the sliding input is the seventh sub-sliding input, and to update the display content of the target screen to the fifth program, which is arranged after the currently running program in the background running program list corresponding to the target screen.
  • the sixth updating unit is configured to determine the first screen as the target screen when the sliding input is the eighth sub-sliding input, and to update the display content of the target screen to the sixth program, which is arranged before the currently running program in the background running program list corresponding to the target screen.
  • the seventh updating unit is configured to determine the second screen as the target screen when the sliding input is the ninth sub-sliding input, and to update the display content of the target screen to the seventh program, which is arranged after the currently running program in the background running program list corresponding to the target screen.
  • the eighth updating unit is configured to determine the second screen as the target screen when the sliding input is the tenth sub-sliding input, and to update the display content of the target screen to the eighth program, which is arranged before the currently running program in the background running program list corresponding to the target screen.
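The fifth through eighth updating units amount to a two-way dispatch: the sub-control area selects the target screen, and the sliding direction selects forward or backward movement in that screen's background running program list. The string labels below are illustrative stand-ins for the seventh/eighth directions and the two sub-control areas.

```python
def dispatch(sub_area, direction):
    """Sketch of the fifth..eighth updating units: map a
    (sub-control area, direction) pair to (target screen, step).
    step=+1 means the program after the current one (seventh
    direction); step=-1 means the program before it (eighth)."""
    screen = "first" if sub_area == "first" else "second"
    step = 1 if direction == "seventh" else -1
    return screen, step
```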
  • the sliding input is an eleventh sub-sliding input that slides in a ninth direction to a preset position.
  • the response module 1020 includes a second determination unit, a display unit, a receiving unit, and a response unit.
  • a second determining unit configured to determine one of the first screen and the second screen as a target screen.
  • a display unit configured to display at least one object on the target screen.
  • the receiving unit is configured to receive a selection input of the user on a target object, where the at least one object includes the target object.
  • the response unit is configured to update, in response to the selection input, the display content of the target screen to display content corresponding to the target object.
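The flow for the eleventh sub-sliding input (slide to a preset position, show a list of objects on the target screen, then apply the user's selection) can be sketched as below; all names and the fallback behavior for an invalid selection are assumptions.

```python
def pick_and_update(current_content, objects, selected_index):
    """Sketch of the preset-position flow: at least one object is
    displayed on the target screen; a selection input picks the target
    object, and the screen's display content is updated to it. An
    out-of-range selection leaves the content unchanged (assumption)."""
    if 0 <= selected_index < len(objects):
        return objects[selected_index]
    return current_content
```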
  • the terminal device provided in the embodiment of the present disclosure can implement the processes implemented by the terminal device in the method embodiments in FIG. 1 to FIG. 9. To avoid repetition, details are not described herein again.
  • the terminal device 1000 in the above solution receives the sliding input of the user on the control area of the terminal device and updates the display content of the target screen among the first screen and the second screen of the terminal device, so that the dual-screen display function of the terminal device can be started quickly and the dual screens can be operated quickly, either separately or synchronously; the operation is simple, the triggering precision is high, and false triggering is avoided.
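All of the embodiments above hinge on classifying a sliding input on the control area into one of a pair of opposite directions. Below is a minimal sketch of such a classifier, assuming a simple start/end-coordinate comparison; the disclosure only requires that paired directions be opposite, not any particular axis convention.

```python
def classify_swipe(x0, y0, x1, y1):
    """Classify a sliding input by its dominant displacement axis,
    yielding one member of an opposed direction pair (e.g. left/right
    could play the roles of the first/second directions)."""
    dx, dy = x1 - x0, y1 - y0
    if abs(dx) >= abs(dy):
        return "right" if dx >= 0 else "left"
    return "down" if dy >= 0 else "up"
```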
  • FIG. 11 is a schematic diagram of a hardware structure of a terminal device that implements various embodiments of the present disclosure.
  • the terminal device 1100 includes, but is not limited to, a radio frequency unit 1101, a network module 1102, an audio output unit 1103, an input unit 1104, a sensor 1105, a display unit 1106, a user input unit 1107, an interface unit 1108, a memory 1109, a processor 1110, and Power supply 1111 and other components.
  • the terminal device may include more or fewer components than shown in the figure, or combine some components, or have a different component layout.
  • the terminal device includes, but is not limited to, a mobile phone, a tablet computer, a notebook computer, a palmtop computer, a vehicle-mounted terminal, a wearable device, a pedometer, and the like.
  • the terminal device 1100 includes a first screen, a second screen, and a control area, wherein the first screen, the second screen, and the control area are respectively located on different surfaces of the terminal device.
  • the user input unit 1107 is configured to receive a sliding input of a user on the control area; the processor 1110 is configured to update, in response to the sliding input, the display content of a target screen among the first screen and the second screen.
  • the terminal device 1100 in the above solution receives the sliding input of the user on the control area of the terminal device and updates the display content of the target screen among the first screen and the second screen of the terminal device, so that the dual-screen display function of the terminal device can be started quickly and the dual screens can be operated quickly, either separately or synchronously; the operation is simple, the triggering precision is high, and false triggering is avoided.
  • the radio frequency unit 1101 may be used for receiving and sending signals during the process of receiving and sending information or during a call; specifically, downlink data received from a base station is delivered to the processor 1110 for processing, and uplink data is sent to the base station.
  • the radio frequency unit 1101 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like.
  • the radio frequency unit 1101 can also communicate with a network and other devices through a wireless communication system.
  • the terminal device provides users with wireless broadband Internet access through the network module 1102, such as helping users to send and receive email, browse web pages, and access streaming media.
  • the audio output unit 1103 may convert audio data received by the radio frequency unit 1101 or the network module 1102 or stored in the memory 1109 into audio signals and output them as sound. Moreover, the audio output unit 1103 may also provide audio output (for example, call signal reception sound, message reception sound, etc.) related to a specific function performed by the terminal device 1100.
  • the audio output unit 1103 includes a speaker, a buzzer, a receiver, and the like.
  • the input unit 1104 is used to receive audio or video signals.
  • the input unit 1104 may include a graphics processing unit (GPU) 11041 and a microphone 11042.
  • the graphics processor 11041 processes image data of still pictures or videos obtained by an image capture device (such as a camera) in a video capture mode or an image capture mode.
  • the processed image frames may be displayed on the display unit 1106.
  • the image frames processed by the graphics processor 11041 may be stored in the memory 1109 (or other storage medium) or transmitted via the radio frequency unit 1101 or the network module 1102.
  • the microphone 11042 can receive sound, and can process such sound into audio data.
  • in a telephone call mode, the processed audio data can be converted into a format that can be transmitted to a mobile communication base station via the radio frequency unit 1101, and then output.
  • the terminal device 1100 further includes at least one sensor 1105, such as a light sensor, a motion sensor, and other sensors.
  • the light sensor includes an ambient light sensor and a proximity sensor.
  • the ambient light sensor can adjust the brightness of the display panel 11061 according to the brightness of the ambient light.
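As an illustration only (the mapping is not specified in the disclosure), a linear ambient-light-to-backlight mapping might look like this; the lux ceiling and brightness range are assumed constants.

```python
def adjust_brightness(ambient_lux, max_lux=1000, levels=255):
    """Sketch of the ambient light sensor's role: clamp the measured
    ambient brightness and map it linearly onto a backlight level."""
    lux = min(max(ambient_lux, 0), max_lux)
    return lux * levels // max_lux
```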
  • the proximity sensor can turn off the display panel 11061 and/or the backlight when the terminal device 1100 is moved to the ear.
  • the accelerometer sensor can detect the magnitude of acceleration in various directions (usually three axes).
  • the sensor 1105 may also include a fingerprint sensor, a pressure sensor, an iris sensor, a molecular sensor, a gyroscope, a barometer, a hygrometer, a thermometer, an infrared sensor, and the like; details are not described herein again.
  • the display unit 1106 is used to display information input by the user or information provided to the user.
  • the display unit 1106 may include a display panel 11061.
  • the display panel 11061 may be configured in the form of a liquid crystal display (LCD), an organic light-emitting diode (OLED), or the like.
  • the user input unit 1107 may be used to receive inputted numeric or character information, and generate key signal inputs related to user settings and function control of the terminal device.
  • the user input unit 1107 includes a touch panel 11071 and other input devices 11072.
  • the touch panel 11071, also known as a touch screen, can collect the user's touch operations on or near it (for example, operations performed by the user on or near the touch panel 11071 with a finger, a stylus, or any suitable object or accessory).
  • the touch panel 11071 may include two parts, a touch detection device and a touch controller.
  • the touch detection device detects the user's touch position, detects the signal brought by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into contact coordinates, and sends them to the processor 1110; it also receives commands sent by the processor 1110 and executes them.
  • the touch panel 11071 may be implemented in various types, such as a resistive type, a capacitive type, an infrared type, and a surface acoustic wave type.
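The touch controller's conversion of raw readings into contact coordinates can be sketched as a simple rescaling; the 12-bit raw range and the screen resolution below are illustrative assumptions, not values from the disclosure.

```python
def to_contact_coords(raw_x, raw_y, max_raw=4095, width=1080, height=2340):
    """Sketch of the touch controller: scale raw touch-detection
    readings (0..max_raw) onto pixel coordinates that are then sent
    to the processor."""
    return (raw_x * (width - 1) // max_raw,
            raw_y * (height - 1) // max_raw)
```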
  • the user input unit 1107 may also include other input devices 11072.
  • other input devices 11072 may include, but are not limited to, a physical keyboard, function keys (such as volume control keys, switch keys, etc.), a trackball, a mouse, and a joystick, and details are not described herein again.
  • the touch panel 11071 may be overlaid on the display panel 11061. After the touch panel 11071 detects a touch operation on or near it, the touch operation is transmitted to the processor 1110 to determine the type of the touch event, and the processor 1110 then provides a corresponding visual output on the display panel 11061 according to the type of the touch event.
  • although the touch panel 11071 and the display panel 11061 are implemented as two independent components to implement the input and output functions of the terminal device, in some embodiments the touch panel 11071 and the display panel 11061 may be integrated to implement the input and output functions of the terminal device; this is not specifically limited here.
  • the interface unit 1108 is an interface through which an external device is connected to the terminal device 1100.
  • the external device may include a wired or wireless headset port, an external power (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device with an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like.
  • the interface unit 1108 may be used to receive input (for example, data information, power, etc.) from an external device and transmit the received input to one or more elements within the terminal device 1100, or may be used to transfer data between the terminal device 1100 and an external device.
  • the memory 1109 may be used to store software programs and various data.
  • the memory 1109 may mainly include a storage program area and a storage data area, where the storage program area may store an operating system, an application required by at least one function (such as a sound playback function and an image playback function), and the like; and the storage data area may store data (such as audio data and a phone book) created according to the use of the mobile phone.
  • the memory 1109 may include a high-speed random access memory, and may further include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or other non-volatile solid-state storage devices.
  • the processor 1110 is the control center of the terminal device and connects various parts of the entire terminal device by using various interfaces and lines.
  • by running or executing the software programs and/or modules stored in the memory 1109 and calling the data stored in the memory 1109, the processor 1110 performs various functions of the terminal device and processes data, so as to monitor the terminal device as a whole.
  • the processor 1110 may include one or more processing units; optionally, the processor 1110 may integrate an application processor and a modem processor, where the application processor mainly processes the operating system, user interfaces, application programs, and the like, and the modem processor mainly handles wireless communication.
  • it can be understood that the foregoing modem processor may alternatively not be integrated into the processor 1110.
  • the terminal device 1100 may further include a power source 1111 (such as a battery) for supplying power to various components.
  • the power source 1111 may be logically connected to the processor 1110 through a power management system, thereby implementing functions such as charging management, discharging management, and power consumption management through the power management system.
  • in addition, the terminal device 1100 includes some functional modules that are not shown; details are not described herein again.
  • an embodiment of the present disclosure further provides a terminal device, including a processor 1110, a memory 1109, and a program stored in the memory 1109 and executable on the processor 1110, where the program, when executed by the processor 1110, implements the processes of the foregoing display method embodiments.
  • An embodiment of the present disclosure also provides a computer-readable storage medium.
  • a program is stored on the computer-readable storage medium; when the program is executed by a processor, the processes of the foregoing display method embodiments are implemented.
  • the computer-readable storage medium is, for example, a read-only memory (ROM), a random access memory (RAM), a magnetic disk or an optical disk.
  • the methods in the above embodiments can be implemented by means of software plus a necessary general-purpose hardware platform, and of course also by hardware, but in many cases the former is the better implementation.
  • based on this understanding, the part of the technical solution of the present disclosure that is essential or that contributes to the existing technology can be embodied in the form of a software product, which is stored in a storage medium (such as a ROM/RAM, a magnetic disk, or an optical disc) and includes several instructions for causing a terminal (which may be a mobile phone, a computer, a server, an air conditioner, a network device, or the like) to execute the methods described in the embodiments of the present disclosure.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present disclosure relates to a display method and a terminal device. The terminal device comprises a first screen, a second screen, and a control area, the first screen, the second screen, and the control area being respectively located on different surfaces of the terminal device. The method comprises the steps of: receiving a sliding input of a user on the control area; and, in response to the sliding input, updating display content of a target screen among the first screen and the second screen.
PCT/CN2019/093629 2018-06-28 2019-06-28 Display method and terminal device WO2020001604A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201810686435.4A CN108897486B (zh) 2018-06-28 2018-06-28 Display method and terminal device
CN201810686435.4 2018-06-28

Publications (1)

Publication Number Publication Date
WO2020001604A1 true WO2020001604A1 (fr) 2020-01-02

Family

ID=64346700

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2019/093629 WO2020001604A1 (fr) 2018-06-28 2019-06-28 Display method and terminal device

Country Status (2)

Country Link
CN (1) CN108897486B (fr)
WO (1) WO2020001604A1 (fr)

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108897486B (zh) * 2018-06-28 2021-04-13 维沃移动通信有限公司 一种显示方法及终端设备
CN109819106B (zh) * 2018-12-27 2022-01-14 努比亚技术有限公司 双面屏多任务执行方法、移动终端及计算机可读存储介质
KR102572268B1 (ko) * 2019-01-02 2023-08-30 삼성디스플레이 주식회사 표시 장치 및 그 구동 방법
WO2020171318A1 (fr) 2019-02-19 2020-08-27 엘지전자 주식회사 Terminal mobile et dispositif électronique contenant le terminal mobile
CN112400306A (zh) * 2019-02-19 2021-02-23 Lg电子株式会社 移动终端和具有移动终端的电子装置
CN110069305A (zh) * 2019-03-28 2019-07-30 南京维沃软件技术有限公司 一种屏幕界面切换方法及终端设备
US12015732B2 (en) * 2019-05-31 2024-06-18 Apple Inc. Device, method, and graphical user interface for updating a background for home and wake screen user interfaces

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120229399A1 (en) * 2011-03-11 2012-09-13 Hiroki Kobayashi Electronic device
CN204650426U (zh) * 2015-05-29 2015-09-16 东莞颠覆产品设计有限公司 移动终端设备
CN106168891A (zh) * 2016-06-30 2016-11-30 努比亚技术有限公司 屏幕内容切换装置、双屏终端及屏幕内容切换方法
CN108897486A (zh) * 2018-06-28 2018-11-27 维沃移动通信有限公司 一种显示方法及终端设备

Family Cites Families (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8930846B2 (en) * 2010-10-01 2015-01-06 Z124 Repositioning applications in a stack
EP3734407A1 (fr) * 2011-02-10 2020-11-04 Samsung Electronics Co., Ltd. Dispositif portable comprenant un affichage à écran tactile et son procédé de commande
CN103399794B (zh) * 2013-07-30 2017-10-13 努比亚技术有限公司 移动终端切换应用的方法及装置
US10254863B2 (en) * 2014-12-19 2019-04-09 Lg Electronics Inc. Mobile terminal
US20180032536A1 (en) * 2016-08-01 2018-02-01 Barbara Carey Stachowski Method of and system for advertising real estate within a defined geo-targeted audience
CN107329650A (zh) * 2017-06-21 2017-11-07 珠海市魅族科技有限公司 交互方法、交互装置、终端设备及计算机可读存储介质
CN107678664A (zh) * 2017-08-28 2018-02-09 中兴通讯股份有限公司 一种终端界面切换、手势处理的方法、装置及终端
CN107870716B (zh) * 2017-11-07 2020-12-29 Oppo广东移动通信有限公司 调用后台应用程序的方法和装置
CN108008927A (zh) * 2017-11-30 2018-05-08 努比亚技术有限公司 多屏会议实现方法、移动终端及计算机可读存储介质
CN108012001A (zh) * 2017-11-30 2018-05-08 努比亚技术有限公司 一种基于双屏移动终端的操作方法、移动终端和存储介质

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120229399A1 (en) * 2011-03-11 2012-09-13 Hiroki Kobayashi Electronic device
CN204650426U (zh) * 2015-05-29 2015-09-16 东莞颠覆产品设计有限公司 移动终端设备
CN106168891A (zh) * 2016-06-30 2016-11-30 努比亚技术有限公司 屏幕内容切换装置、双屏终端及屏幕内容切换方法
CN108897486A (zh) * 2018-06-28 2018-11-27 维沃移动通信有限公司 一种显示方法及终端设备

Also Published As

Publication number Publication date
CN108897486A (zh) 2018-11-27
CN108897486B (zh) 2021-04-13

Similar Documents

Publication Publication Date Title
US11210049B2 (en) Display control method and terminal
CN108762954B (zh) 一种对象分享方法及移动终端
US12105935B2 (en) Method for displaying background application and mobile terminal
WO2019154181A1 (fr) Procédé de commande d'affichage et terminal mobile
WO2020001604A1 (fr) Procédé d'affichage et dispositif terminal
JP7403648B2 (ja) 同期方法及び電子機器
EP4071606A1 (fr) Procédé de partage d'application, premier dispositif électronique et support de stockage lisible par ordinateur
WO2021012931A1 (fr) Procédé et terminal de gestion d'icônes
WO2020143663A1 (fr) Procédé d'affichage et terminal mobile
WO2020238449A1 (fr) Procédé de traitement de messages de notification et terminal
WO2019228297A1 (fr) Procédé de commande d'affichage et terminal
WO2020238497A1 (fr) Procédé de déplacement d'icône et dispositif terminal
WO2019206036A1 (fr) Procédé et terminal de gestion de messages
WO2019174459A1 (fr) Procédé de traitement d'application et terminal mobile
WO2019196864A1 (fr) Procédé de commande de bouton virtuel et terminal mobile
CN108762705B (zh) 一种信息显示方法、移动终端和计算机可读存储介质
WO2021068885A1 (fr) Procédé de commande et dispositif électronique
WO2020233323A1 (fr) Procédé de commande d'affichage, dispositif terminal, et support de stockage lisible par ordinateur
CN109032486B (zh) 一种显示控制方法及终端设备
WO2020042892A1 (fr) Procédé de changement de mode d'appel et dispositif terminal
WO2019228296A1 (fr) Procédé de traitement d'affichage et dispositif terminal
WO2019114522A1 (fr) Procédé de commande d'écran, appareil de commande d'écran et terminal mobile
US12073057B2 (en) Electronic device and storage medium for obtaining, via screenshot comparison, display information for windows in background running state
WO2020215950A1 (fr) Procédé d'affichage d'interface et dispositif terminal
CN109558046B (zh) 一种信息显示方法及终端设备

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19825211

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19825211

Country of ref document: EP

Kind code of ref document: A1