US20240012605A1 - Data Processing Method and Mobile Device - Google Patents
- Publication number
- US20240012605A1 (U.S. application Ser. No. 18/149,838)
- Authority
- US
- United States
- Prior art keywords
- display
- gesture
- touch
- touch panel
- application
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- G06F 3/147—Digital output to display device using display panels
- G06F 1/1626—Constructional details or arrangements for portable computers with a single-body enclosure integrating a flat display, e.g. Personal Digital Assistants [PDAs]
- G06F 1/1643—Details related to the display arrangement, the display being associated to a digitizer, e.g. laptops that can be used as penpads
- G06F 1/1647—Details related to the display arrangement, including at least an additional display
- G06F 1/1692—Integrated I/O peripherals, the I/O peripheral being a secondary touch screen used as a control interface, e.g. virtual buttons or sliders
- G06F 3/0482—Interaction with lists of selectable items, e.g. menus
- G06F 3/0483—Interaction with page-structured environments, e.g. book metaphor
- G06F 3/0485—Scrolling or panning
- G06F 3/0488—Interaction techniques using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F 3/04883—Inputting data by handwriting, e.g. gesture or text
- G06F 3/14—Digital output to display device; cooperation and interconnection of the display device with other functional units
- G06F 3/1431—Controlling a plurality of local displays using a single graphics controller
- G09G 2354/00—Aspects of interface with display user
Definitions
- the disclosure relates to the field of data processing, and in particular, to a data processing method and a mobile device.
- Page flip is usually implemented by the user by touching a screen of the mobile device with a finger, to turn to a next page or return to a previous page.
- To flip or jump pages, the user needs to perform many complex operations, as shown in FIG. 1.
- embodiments provide a data processing method and a mobile device, to reduce operation steps of a user, improve transaction processing efficiency of a mobile device, and further enhance user experience.
- an embodiment provides a method for performing data processing on a mobile device.
- the mobile device includes a touch display screen disposed on a front face of the mobile device, and a low power display screen and a touch panel that are disposed on a rear face of the mobile device, the touch display screen is a capacitive touchscreen, and the low power display screen is an electronic ink screen.
- the method includes: detecting, by the touch display screen, a touch operation of a user; determining, by the mobile device in response to the touch operation, whether the touch operation works on an application program of a reading type; when the mobile device determines that the touch operation works on the application program of the reading type, instructing, by the mobile device, the low power display screen to display a first graphical user interface (first GUI) of the application program, and instructing to turn off a power supply of the touch display screen and activate the touch panel, where the first GUI is an interface, in the application program, for displaying content of an ebook; and when the touch panel detects a gesture of the user, determining, by the mobile device based on a speed and a direction of the gesture, an instruction corresponding to the gesture, and executing the instruction, where a result of executing the instruction is: displaying a second graphical user interface (second GUI) of the application program on the low power display screen based on the speed and the direction of the gesture, where the second GUI is used to display a catalogue of the ebook.
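The display-switching flow above can be sketched as follows. This is a minimal illustration under stated assumptions: the class and attribute names (`MobileDevice`, `front_screen_on`, and so on) are invented for this sketch and do not appear in the patent.

```python
from dataclasses import dataclass

@dataclass
class App:
    name: str
    category: str  # e.g. "reading", "video", "game"

class MobileDevice:
    def __init__(self):
        self.front_screen_on = True      # capacitive touch display on the front face
        self.rear_screen_active = False  # low power e-ink display on the rear face
        self.rear_touch_active = False   # touch panel on the rear face

    def on_app_opened(self, app: App) -> str:
        """If the touched application is of the reading type, show its first
        GUI on the rear e-ink screen, power off the front touch display, and
        activate the rear touch panel; otherwise keep the front screen."""
        if app.category == "reading":
            self.front_screen_on = False
            self.rear_screen_active = True
            self.rear_touch_active = True
            return f"rear screen shows first GUI of {app.name}"
        return f"front screen shows {app.name}"

device = MobileDevice()
print(device.on_app_opened(App("Books", "reading")))
# rear screen shows first GUI of Books
```

The point of the branch is the power trade-off the patent describes: reading content moves to the low power screen, so the power-hungry front display can be switched off entirely.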
- the second touch display screen and the touch panel are added on the rear face of the mobile device, and the touch event of the user on the touch panel is detected to execute the corresponding instruction, so that different graphical user interfaces are displayed for the ebook displayed on the second touch display screen. This simplifies operation steps when the user reads the ebook, and also improves transaction processing efficiency of the mobile device.
- the displaying a second graphical user interface of the application program on the low power display screen may be specifically: displaying a part of the catalogue of the ebook on the low power display screen based on the speed and the direction of the gesture.
- an embodiment provides a method for performing data processing on a mobile device.
- the mobile device includes a first touch display screen disposed on a front face of the mobile device, and a second touch display screen and a touch panel that are disposed on a rear face of the mobile device.
- the method includes: when the first touch display screen detects a touch operation of a user, determining, by the mobile device, whether the touch operation works on an application program; when determining that the touch operation works on the application program, determining, by the mobile device, a type of the application program; when the mobile device determines that the type of the application program is reading, instructing, by the mobile device, the second touch display screen to display a first graphical user interface of the application program; and when the touch panel detects a touch event of the user, determining, by the mobile device based on the touch event, an instruction corresponding to the touch event, and executing the instruction, where a result of executing the instruction is: displaying a second graphical user interface of the application program on the second touch display screen based on the touch event.
- the determining, by the mobile device based on the touch event, an instruction corresponding to the touch event, and executing the instruction may be specifically: detecting, by the touch panel at different time points t1, t2, and t3, that positions of three touch points A, B, and C in the touch event are (x1, y1), (x2, y2), and (x3, y3); determining, by the mobile device, a speed and/or a direction of the touch event based on the X-axis coordinate values x1, x2, and x3 of the three touch points A, B, and C and the time points t1, t2, and t3; and performing, by the mobile device based on the speed and/or the direction, a fast page flip operation on content of an ebook displayed on the first graphical user interface, and displaying the second graphical user interface of the fast page flip operation on the second touch display screen.
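The speed/direction estimate from sampled touch points can be written out concretely. This is a sketch only: the function name and the use of the first and last samples are choices made here, not mandated by the patent, which only requires deriving speed and direction from the X coordinates and timestamps.

```python
def gesture_speed_and_direction(points):
    """points: chronological list of (t, x, y) touch samples, e.g. the
    three points A, B, C at times t1, t2, t3.
    Returns (speed_px_per_s, direction), direction along the X axis."""
    t1, x1, _ = points[0]
    t3, x3, _ = points[-1]
    dt = t3 - t1
    dx = x3 - x1
    speed = abs(dx) / dt if dt > 0 else 0.0
    direction = "right" if dx > 0 else "left"
    return speed, direction

# A rightward swipe: x grows from 10 to 90 px over 0.2 s.
speed, direction = gesture_speed_and_direction(
    [(0.0, 10, 50), (0.1, 50, 50), (0.2, 90, 50)])
print(speed, direction)  # 400.0 right
```

The middle sample is not needed for a straight-line estimate; a real implementation might use it to smooth the velocity or to reject non-linear tracks.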
- the second graphical user interface may be used to display a catalogue of the ebook.
- the touch event is a gesture of sliding on the touch panel by a single finger
- the displaying a second graphical user interface of the application program on the second touch display screen based on the touch event may be specifically: displaying a fast page flip graphical user interface of the application program on the second touch display screen based on a speed and a direction of the sliding gesture.
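One plausible way to map the sliding gesture's speed and direction onto a fast page flip is a step size proportional to speed. The threshold (200 px/s per page) and the direction convention (leftward swipe flips forward) are assumptions for this sketch; the patent does not fix either.

```python
def pages_to_flip(speed_px_per_s: float, direction: str) -> int:
    """Faster swipes flip more pages at once; a leftward swipe flips
    forward, a rightward swipe flips backward (assumed convention)."""
    step = max(1, int(speed_px_per_s // 200))  # at least one page per swipe
    return step if direction == "left" else -step

print(pages_to_flip(400.0, "left"))  # 2
```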
- a processor 380 may alternatively instruct a first display panel 341 to display some graphical user interfaces of the application program, and other graphical user interfaces are displayed on a second display panel 342 .
- a user can focus more on content of interest, and a screen splitting function can be implemented.
- the graphical user interfaces displayed on the first display panel 341 may specifically be iconic controls of the application program, and the graphical user interfaces displayed on the second display panel 342 may mainly be the substantial content of the application program.
- the mobile device further turns off a power supply of the first touch display screen while the mobile device instructs the second touch display screen to display the first graphical user interface of the application program.
- an embodiment provides a mobile device for performing data processing.
- the mobile device includes a first display panel and a first touch panel that are disposed on a front face of the mobile device, and a second display panel and a second touch panel that are disposed on a rear face of the mobile device; and the mobile device further includes a processor, a memory, and a power supply management system.
- the first display panel is configured to display an icon, a graphical user interface, and a component of an application program; when the first touch panel detects a touch operation of a user, the processor determines whether the touch operation works on an application program displayed on the first display panel; when the processor determines that the touch operation works on the application program, the processor determines a type of the application program; when the processor determines that the type of the application program is reading, the processor instructs the second display panel to display a first graphical user interface of the application program, and instructs the power supply management system to turn off a power supply of the first display panel and activate the second display panel, where the first graphical user interface is an interface, in the application program, for displaying content of an ebook; and when the second touch panel detects a touch event of the user, the processor determines, based on a speed and a direction of the touch event, an instruction corresponding to the touch event, and executes the instruction, where a result of executing the instruction is: displaying a second graphical user interface of the application program on the second display panel.
- the first display panel is a liquid crystal display
- the second display panel is an electronic ink screen.
- the first display panel may be mainly configured for daily operations of the user
- the second display panel is a display panel of a low power material, and therefore may be configured to read an ebook.
- the mobile device may be a mobile phone or a tablet computer.
- the touch operation of the user on the touch panel is detected to determine the processing (for example, page flip or returning to the catalogue) to be performed on the content of the ebook displayed on the second display panel.
- an embodiment provides a method for performing data processing on a mobile device.
- the mobile device includes a first touch display screen disposed on a front face of the mobile device, and a second touch display screen and a touch panel that are disposed on a rear face of the mobile device, the first touch display screen may be a capacitive touchscreen, and the second touch display screen may be an electronic ink screen.
- the method may specifically include: detecting a touch operation of a user on the first touch display screen; displaying, in response to the touch operation, a control for an application program on the first touch display screen; and in response to a touch operation performed by the user on the control, displaying a first graphical user interface of the application program on the first touch display screen, and displaying a second graphical user interface of the application program on the second touch display screen, where the first graphical user interface includes a plurality of controls, the second graphical user interface displays content of the application program, and the plurality of controls are used to control the content of the application program.
- the application program is an application program of a reading type
- the content of the application program may be an ebook.
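The split arrangement above, with control widgets on the front screen and reading content on the rear screen, can be sketched minimally. The control names used here are illustrative, not taken from the patent.

```python
def split_gui(ebook_page_text: str):
    """Return the two GUIs of a reading application: a control GUI for
    the front touch display screen and a content GUI for the rear
    e-ink display screen."""
    front_gui = {"controls": ["prev_page", "next_page", "catalogue",
                              "font_size"]}  # plurality of controls
    rear_gui = {"content": ebook_page_text}  # substantial content
    return front_gui, rear_gui

front, rear = split_gui("Chapter 1 — It was a dark and stormy night...")
print(front["controls"])
```

Keeping the controls on the front screen means every interactive element stays on the responsive capacitive panel, while the slow-refreshing e-ink panel only has to redraw when the page content actually changes.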
- an embodiment provides a method for performing data processing on a mobile device.
- the mobile device includes a first touch display screen disposed on a front face of the mobile device, and a second touch display screen and a touch panel that are disposed on a rear face of the mobile device, the first touch display screen is a capacitive touchscreen, and the second touch display screen is an electronic ink screen.
- the method includes: detecting a touch operation of a user on the first touch display screen; displaying, in response to the touch operation, a control for an application program on the first touch display screen; and in response to a touch operation performed by the user on the control, displaying a first graphical user interface of the application program on the first touch display screen, and displaying a second graphical user interface of the application program on the second touch display screen, where the first graphical user interface includes a plurality of controls, the second graphical user interface displays content of the application program, and the plurality of controls are used to control the content of the application program.
- the application program is an application program of a reading type
- the content of the application program is an ebook.
- FIG. 1 is a schematic diagram of a user interface of a page flip or a page jump in the prior art
- FIG. 2 A is a schematic structural diagram of a front face of a mobile phone in some embodiments
- FIG. 2 B is a schematic structural diagram of a rear face of a mobile phone in some embodiments.
- FIG. 3 is a schematic diagram of a hardware structure of a mobile phone in some embodiments.
- FIG. 4 is a schematic diagram of a graphical user interface displayed by a first display panel 341 on a front face of a mobile phone in some embodiments;
- FIG. 5 is a schematic diagram of a graphical user interface displayed by a second display panel 342 on a rear face of a mobile phone in some embodiments;
- FIG. 6 is a schematic diagram of a track of a touch event on a third touch panel 333 in some embodiments.
- FIG. 7 A to FIG. 7 D are schematic diagrams of some graphical user interfaces displayed by a second display panel 342 on a rear face of a mobile device in some embodiments;
- FIG. 8 is a schematic flowchart of a method in some embodiments.
- FIG. 9 is a schematic structural diagram of a mobile device in some embodiments.
- FIG. 10 A is a schematic diagram of a graphical user interface displayed by a first display panel on a front face of a mobile device in some embodiments;
- FIG. 10 B is a schematic diagram of a graphical user interface displayed by a second display panel on a rear face of a mobile device in some embodiments;
- FIG. 11 A and FIG. 11 B are schematic diagrams of a position of a third touch panel in some other embodiments.
- FIG. 12 is a schematic diagram of another graphical user interface displayed by a first display panel on a front face of a mobile device in some embodiments.
- a mobile device in the following embodiments may be any device having a wireless communication function, for example, may be a wearable electronic device (for example, a smartwatch) having the wireless communication function, may be a mobile phone 300 shown in FIG. 3 , or may be a tablet computer. No special limitation is imposed on a specific form of the mobile device in the following embodiments.
- a mobile phone is used as an example to describe how a mobile device implements a specific technical solution in this embodiment.
- the mobile device in this embodiment may be a mobile phone 300 .
- FIG. 2 A and FIG. 2 B are schematic appearance diagrams of the mobile phone 300 .
- FIG. 2 A is a schematic diagram of a front face of the mobile phone 300 .
- FIG. 2 B is a schematic diagram of a rear face of the mobile phone 300 .
- the following uses the mobile phone 300 as an example to specifically describe this embodiment.
- the mobile phone 300 in the figure is merely an example of the mobile device, and the mobile phone 300 may have more or fewer components than those shown in the figure, a combination of two or more components, or components disposed differently.
- the components shown in the figure may be implemented by hardware, software, or a combination of hardware and software, where the hardware, the software, or the combination of hardware and software includes one or more signal processing and/or application-specific integrated circuits.
- the mobile phone 300 includes components such as an RF (radio frequency) circuit 310 , a memory 320 , an input unit 330 , a display unit 340 , a sensor 350 , an audio frequency circuit 360 , a Wi-Fi module 370 , a processor 380 , and a power supply 390 .
- A person skilled in the art may understand that the structure of the mobile phone shown in FIG. 3 does not constitute a limitation on the mobile phone, and the mobile phone may include more or fewer components than those shown in the figure, a combination of some components, or components disposed differently.
- the RF circuit 310 may be configured to receive and transmit information, or receive and send signals in a call process.
- the RF circuit 310 may receive downlink information of a base station and then provide the received downlink information for the processor 380 for processing, and send uplink data to the base station.
- the RF circuit includes but is not limited to devices such as an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, and a duplexer.
- the RF circuit 310 may further communicate with a network and another mobile device through wireless communication.
- the wireless communication may use any communications standard or protocol, including but not limited to a Global System for Mobile Communications, a general packet radio service, code division multiple access, wideband code division multiple access, Long Term Evolution, an email, a short message service, and the like.
- the memory 320 may be configured to store a software program and data.
- the processor 380 runs the software program and the data that are stored in the memory 320 , to perform various functions of the mobile phone 300 and process data.
- the memory 320 may mainly include a program storage area and a data storage area.
- the program storage area may store an operating system, an application program required by at least one function (for example, a sound playback function and an image playback function), and the like.
- the data storage area may store data (for example, audio data and a phone book) created based on use of the mobile phone 300 , and the like.
- the memory 320 may include a high-speed random access memory, and may further include a non-volatile memory, for example, at least one disk storage device, a flash memory device, or another non-volatile solid-state storage device.
- the memory 320 stores an operating system that enables the mobile phone 300 to run, for example, an iOS® operating system developed by Apple Inc., an Android® open source operating system developed by Google Inc., and a Windows® operating system developed by Microsoft Corporation.
- the input unit 330 may be configured to receive entered numeral or character information, and generate signal input related to user setting and function control of the mobile phone 300 .
- the input unit 330 may include a first touch panel 331 disposed on a front face of the mobile phone 300 .
- the first touch panel 331 is also referred to as a first touchscreen, and may collect a touch operation of a user on or near the first touch panel 331 (such as an operation performed by the user on the first touch panel 331 or near the first touch panel 331 by using a finger or any proper object or accessory such as a stylus), and drive a corresponding connection apparatus according to a preset program.
- the first touch panel 331 may include two parts: a touch detection apparatus and a touch controller (not shown in the figure).
- the touch detection apparatus detects a touch orientation of the user, detects a signal brought by the touch operation, and sends the signal to the touch controller.
- the touch controller receives touch information from the touch detection apparatus, converts the touch information into touch point coordinates, and sends the touch point coordinates to the processor 380 , and can receive and execute an instruction sent by the processor 380 .
- the first touch panel 331 may be implemented in various types, such as a resistive type, a capacitive type, an infrared type, and a surface acoustic wave type.
- the input unit 330 may further include a second touch panel 332 (also referred to as a second touchscreen) disposed on a rear face of the mobile phone 300 .
- the first touch panel 331 and the second touch panel 332 are disposed on two opposite faces of the mobile phone 300 .
- the second touch panel 332 may use a same structure as the first touch panel 331 . Details are not described herein again.
- the display unit 340 may be configured to display information entered by the user or information provided for the user, and graphical user interfaces (referred to as GUIs in the following) of various menus of the mobile phone 300 .
- the display unit 340 may include a first display panel 341 (also referred to as a first display screen) disposed on the front face of the mobile phone 300 , and a second display panel 342 (also referred to as a second display screen) disposed on the rear face of the mobile phone 300 .
- the first display panel 341 and the second display panel 342 are disposed on the two opposite faces of the mobile phone 300 .
- the first display panel 341 may be configured by using a liquid crystal display, an organic light-emitting diode, or a like form.
- the second display panel 342 may be a screen made by using an electronic paper display technology or another low power display material, for example, an electronic ink screen. Therefore, the second display panel 342 may be used to read an ebook, a magazine, and the like. Certainly, the second display panel 342 may alternatively use a same display material as the first display panel 341 .
- the mobile phone 300 includes the front face A and the rear face B.
- three optical touch keys 201, 202, and 203 are disposed at the bottom of the front face A, on which the first touch panel 331 and the first display panel 341 are also disposed.
- the first touch panel 331 covers the first display panel 341 .
- After detecting a touch operation on or near the first touch panel 331, the first touch panel 331 transfers the touch operation to the processor 380, to determine a touch event.
- the processor 380 provides corresponding visual output on the first display panel 341 based on a type of the touch event.
- the first touch panel 331 and the first display panel 341 are used as two separate components to implement the input and output functions of the mobile phone 300 in FIG. 3 .
- the first touch panel 331 and the first display panel 341 may be integrated to implement the input and output functions of the mobile phone 300 .
- the integrated first touch panel 331 and first display panel 341 may be referred to as a first touch display screen.
- the second touch panel 332 and the second display panel 342 are disposed, and the second touch panel 332 covers the second display panel 342 . Functions of the second touch panel 332 and the second display panel 342 are similar to those of the first touch panel 331 and the first display panel 341 .
- a third touch panel 333 may be further included on the rear face B of the mobile phone 300 .
- the third touch panel 333 may not overlap with the second touch panel 332 or the second display panel 342 (as shown in FIG. 5 ).
- the third touch panel 333 may alternatively be configured on a side face of the mobile phone 300 , as shown in FIG. 11 A and FIG. 11 B .
- the third touch panel 333 may be in a strip shape and is applicable to the narrow side face. In this way, the display panel on the rear face of the mobile phone may be made larger. It may be understood that the third touch panel 333 on the side face may alternatively be integrated with a volume key.
- the third touch panel 333 transfers the touch operation to the processor 380 , to determine a type of a touch event.
- the processor 380 provides corresponding visual output on the first display panel 341 and/or the second display panel 342 based on the touch event.
- the second touch panel 332 and the second display panel 342 may be integrated to implement the input and output functions of the mobile phone 300 .
- the integrated second touch panel 332 and second display panel 342 may be referred to as a second touch display screen.
- pressure sensors may be further disposed on the first touch panel 331 , the second touch panel 332 , and the third touch panel 333 . In this way, when the user performs a touch operation on the foregoing touch panels, the touch panels can further detect a pressure of the touch operation, so that the mobile phone 300 can more accurately detect the touch operation.
- the mobile phone 300 may further include at least one sensor 350 , for example, a light sensor, a motion sensor, and another sensor.
- the light sensor may include an ambient light sensor and a proximity sensor.
- an ambient light sensor 351 is capable of adjusting a luminance of the first display panel 341 and/or the second display panel 342 based on brightness of ambient light.
- a proximity sensor 352 is disposed on the front face of the mobile phone 300 . When the mobile phone 300 moves to an ear, based on detection by the proximity sensor 352 , the mobile phone 300 turns off a power supply of the first display panel 341 , and may also turn off a power supply of the second display panel 342 at the same time. In this way, the mobile phone 300 may further save power.
- an accelerometer sensor is capable of detecting a magnitude of an acceleration in each direction (usually three axes), and may detect, in a static state, a magnitude and a direction of gravity.
- the accelerometer sensor may be applied to a mobile phone posture recognition application (for example, screen switching between a landscape mode and a portrait mode, a related game, or magnetometer posture calibration), a vibration recognition-related function (for example, a pedometer and tapping), and the like.
- the audio frequency circuit 360 , a speaker 361 , and a microphone 362 may provide an audio interface between the user and the mobile phone 300 .
- the audio frequency circuit 360 may transmit, to the speaker 361 , a signal converted from received audio data, and the speaker 361 converts the signal into a sound signal for output.
- the microphone 362 converts a collected sound signal into an electrical signal, and the audio frequency circuit 360 receives the electrical signal, converts the electrical signal into audio data, and then outputs the audio data to the RF circuit 310 , to send the audio data to, for example, another mobile phone, or outputs the audio data to the memory 320 for further processing.
- Wi-Fi is a short-range wireless transmission technology.
- the mobile phone 300 may allow the user to receive and send an email, browse a web page, access streaming media, and the like.
- the Wi-Fi module 370 provides wireless broadband Internet access for the user.
- the processor 380 is a control center of the mobile phone 300 , and is connected to all components of the entire mobile phone by using various interfaces and lines.
- the processor 380 runs or executes the software program stored in the memory 320 and invokes the data stored in the memory 320 , to perform the functions of the mobile phone 300 and process data, so as to perform overall monitoring on the mobile phone.
- the processor 380 may include one or more processing units.
- An application processor and a modem processor may be further integrated in the processor 380 .
- the application processor primarily processes an operating system, a user interface, an application program, and the like, and the modem processor primarily processes wireless communication. It may be understood that the modem processor may not be integrated into the processor 380 .
- a Bluetooth module 381 is configured to exchange information with another device by using Bluetooth, a short-range communications protocol.
- the mobile phone 300 may establish, by using the Bluetooth module 381 , a Bluetooth connection to a wearable electronic device (for example, a smartwatch) that also has a Bluetooth module, to exchange data.
- the mobile phone 300 further includes the power supply 390 (for example, a battery) that supplies power to each component.
- the power supply may be logically connected to the processor 380 by using a power supply management system, to implement functions such as charging management, discharging management, and power consumption management by using the power supply management system.
- the power supply 390 may include two power supplies: One power supply is mainly configured to supply power to the first display panel 341 and the first touch panel 331 , and the other power supply is mainly configured to supply power to the second display panel 342 , the second touch panel 332 , and the third touch panel 333 .
- the mobile phone 300 may further include a front-facing camera 353 , a rear-facing camera 354 , and the like. Details are not described herein.
- As shown in FIG. 4 , there is a first display panel 341 on the front face of a mobile device 400 (for example, the mobile phone 300 ).
- the first display panel 341 displays icons of various application programs.
- an icon 401 represents an application program named reading
- an icon 403 represents system settings.
- an icon 404 represents Facebook, a popular social application program.
- a display material used by the second display panel 342 may be different from that used by the first display panel 341 .
- the first display panel 341 may use a light-emitting diode material commonly used in a mobile device in the prior art
- the second display panel 342 may be a low power-consumption electronic ink screen or the like.
- a second touch panel 332 may be further disposed on the mobile device.
- a material used by the second touch panel 332 may be the same as or similar to that used by a first touch panel 331 , and the second touch panel 332 may also cover the second display panel 342 .
- the mobile device 400 further includes a third touch panel 333 .
- the third touch panel 333 transfers the touch operation to a processor 380 , to determine a gesture type of the touch operation.
- the processor 380 provides corresponding visual output on the first display panel 341 and/or the second display panel 342 based on the gesture type.
- the first touch panel 331 of the mobile device 400 detects a touch event on or near the first touch panel 331 , and transfers the touch event to the processor 380 , to determine an instruction corresponding to the touch event.
- the processor 380 invokes, according to the instruction, an application program stored in a memory 320 , and instructs the first display panel 341 to display a graphical user interface (GUI) of the application program (reading), so that the user can perform a specific operation.
- the processor 380 may also instruct the second display panel 342 to display the graphical user interface of the application program.
- the GUI of the application program (reading) corresponding to the icon 401 is displayed on the second display panel 342 on the rear face of the mobile device 400 , so that the user performs reading on the low power second display panel 342 .
- the processor 380 may alternatively instruct the first display panel 341 to display some GUIs of the application program, and other GUIs are displayed on the second display panel 342 . In this way, the user can focus more on content of interest, and a screen splitting function can be implemented.
- specifically, the GUIs displayed on the first display panel 341 may be iconic controls of the application program, and the GUIs displayed on the second display panel 342 may mainly be the substantial content of the application program.
- As shown in FIG. 10 A , when the user touches or approaches the icon 401 on the first display panel 341 by using the finger 402 , if an application program corresponding to the icon is "reading", some controls such as "page up" and "page down" that are not related to substantial content of an ebook (for example, controls such as an icon 1001 , an icon 1002 , and an icon 1003 representing a catalogue of the ebook) are displayed on the first display panel 341 .
- As shown in FIG. 10 B , specific content 1004 of the ebook in the "reading" application program is displayed on the second display panel 342 .
- the processor 380 may send, to a power supply management system in a power supply 390 , an instruction for interrupting a power supply of the first display panel 341 .
- the power supply management system turns off the power supply of the first display panel 341 , so that the first display panel 341 is in a screen-off state.
- the processor 380 may send, to the power supply management system, an instruction for making the first display panel 341 sleep. After receiving the instruction, the power supply management system makes the first display panel 341 sleep, so that the first display panel 341 is in a screen-off state. This can further save the power supply of the mobile device 400 .
- the mobile device may alternatively display a GUI of the application program on the second display panel 342 only when determining that the application program is an application program for reading an ebook or the like.
- the second display panel 342 may allow displaying of only an application program of a specific type, for example, an application program of a reading type.
- the mobile device may determine a type of an application program in a plurality of manners, for example, determines a type of an application program based on a category attribute (for example, a game, reading, or communication) of the application program.
- the mobile device displays, only when determining that an application program corresponding to an icon tapped by a user is an application program of the reading type, a GUI of the application program on the second display panel 342 . If determining that the application program corresponding to the icon tapped by the user is not an application program of the reading type, the mobile device displays the GUI of the application program on the first display panel 341 , to facilitate a further operation of the user.
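The display-routing decision described above can be sketched as follows. This is an illustrative sketch only: the category names and panel labels are assumptions for the example, not part of the disclosed implementation.

```python
# Illustrative reading-type categories; the disclosure only mentions
# category attributes such as "a game, reading, or communication".
READING_CATEGORIES = {"reading", "ebook", "magazine"}

def choose_display_panel(app_category: str) -> str:
    """Route an application's GUI: reading-type applications go to the
    low-power second (rear) display panel; all other applications stay
    on the first (front) display panel, as the embodiment describes."""
    if app_category.lower() in READING_CATEGORIES:
        return "second_display_panel"
    return "first_display_panel"
```

A game application would thus remain on the front panel, while a reading application moves to the rear electronic-ink panel.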
- When there is content (for example, the GUI of the "reading" application program) displayed on the second display panel 342 , the third touch panel 333 is activated, to receive a touch operation of the user. That the third touch panel 333 is activated may specifically mean that the third touch panel 333 is powered on by the power supply 390 and can detect the touch operation of the user on or near the third touch panel 333 . It may be understood that the third touch panel 333 may alternatively be activated only when a particular GUI is displayed on the second display panel 342 . For example, the third touch panel 333 is activated only when the second display panel 342 displays content of a particular chapter of an ebook (for example, chapters displayed in FIG. 7 A and FIG. 7 B ) in the "reading" application program, so that the mobile device saves more power.
- the processor 380 invokes, based on the touch event, an instruction corresponding to the touch event, and executes the instruction.
- the touch event may be a rightward (or leftward) sliding gesture or the like made by the user on the third touch panel 333 .
- the third touch panel 333 sends data related to the touch event to the processor 380 .
- the processor 380 determines and invokes the instruction corresponding to the touch event. If the instruction corresponding to the touch event is an instruction used for fast page flip, a result of executing the instruction is: fast turning the ebook displayed on the second display panel 342 to a particular page and displaying the particular page.
- the processor 380 may further calculate a speed of the touch event based on the related data of the touch event, and then determine, based on the speed, a specific speed for performing page flip on the displayed ebook.
- the processor 380 may alternatively calculate a sliding track length of the touch event on the third touch panel 333 based on the related data of the touch event, and determine, based on the calculated length, a specific speed for performing page flip on the displayed ebook. For example, a longer sliding track length of a sliding gesture leads to a faster page flip speed, in other words, more pages are flipped.
- the third touch panel 333 detects positions of three touch points A, B, and C of the finger at different time points t1, t2, and t3.
- Position coordinates of the three touch points on the third touch panel 333 may be respectively expressed as (x1, y1), (x2, y2), and (x3, y3).
- the processor 380 may determine a speed or an acceleration of the touch event in an X-axis direction based on the different time points and the position coordinates of the three different touch points, so as to determine, based on the calculated speed or acceleration, the speed for performing page flip on the displayed ebook. It may be understood that a person skilled in the art may also determine a specific gesture of a touch event by using another technical solution. This is not limited in this embodiment.
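The X-axis speed calculation from the sampled touch points (A, B, C at t1, t2, and t3) and the speed-to-page-count mapping can be sketched as follows. The function names and the linear speed-to-pages mapping are illustrative assumptions; the disclosure states only that a faster gesture (or a longer sliding track) flips more pages.

```python
def x_axis_speed(points):
    """Estimate signed speed along the X axis from sampled touch points.

    points: list of (t, x) tuples ordered by time, e.g. the three touch
    points A, B, C collected at t1, t2, t3. Positive means rightward.
    """
    (t0, x0), (tn, xn) = points[0], points[-1]
    if tn == t0:
        return 0.0
    return (xn - x0) / (tn - t0)

def pages_to_flip(speed, pages_per_unit_speed=2.0):
    """Map gesture speed to a page count: a faster flick flips more
    pages. The linear coefficient is an assumed illustrative value."""
    return max(1, round(abs(speed) * pages_per_unit_speed))
```

For three samples at 10 ms intervals moving 20 pixel units each, the estimated speed is 2.0 units/ms, which this sketch would map to a four-page flip.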
- the processor 380 may first determine, based on the touch event, a gesture corresponding to the touch event, and then execute a corresponding instruction based on the determined gesture.
- the touch event may be a flick gesture, that is, slightly tapping a touch panel (for example, the first touch panel 331 or the second touch panel 332 ) with a single finger, then quickly sliding, and then quickly leaving the touch panel, for example, scrolling a screen up or down or switching to a left or right picture.
- the touch event may alternatively be a slide gesture, that is, slightly tapping a touch panel with a single finger, keeping the finger in contact with the touch panel, and then moving, for example, sliding to unlock.
- the touch event may alternatively be a swipe gesture, that is, touching a touch panel with a plurality of fingers, keeping the fingers in contact with the touch panel, and then moving, for example, pinching with three fingers to return to a home screen.
- the touch event may alternatively be a tap gesture, that is, slightly tapping a touch panel with a single finger and then leaving the touch panel immediately.
- the touch event may alternatively be a double tap gesture, that is, performing a tap gesture operation twice within an extremely short time.
- the touch event may alternatively be a touch & hold gesture, that is, slightly tapping a touch panel with a finger and keeping the finger resting on the touch panel.
- the touch event may alternatively be a drag gesture, that is, slightly tapping a touch panel with a finger and slowly moving the finger without leaving the touch panel (usually, to a determined target position, for example, dragging a file to a trash box to delete the file).
- the touch event may alternatively be a pinch gesture, that is, pinching on a touch panel with two fingers (usually a thumb and an index finger).
- the touch event may alternatively be an unpinch gesture, that is, stretching on a touch panel with two fingers (usually a thumb and an index finger). It may be understood that, in addition to the foregoing listed gestures, the touch event may be further a gesture in another form.
- a form of the touch event is not limited in this embodiment.
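A minimal sketch of how a processor might distinguish the gestures listed above from sampled touch data is shown below. The thresholds (tap duration, movement tolerance, speed cutoff) are illustrative assumptions, not values from the disclosure, and only a subset of the listed gestures is covered.

```python
def classify_gesture(samples, finger_count=1,
                     tap_ms=150, hold_ms=500, move_px=10):
    """Classify a touch event from time-stamped position samples.

    samples: list of (t_ms, x, y) tuples for one finger, time-ordered.
    All thresholds are assumed illustrative values.
    """
    duration = samples[-1][0] - samples[0][0]
    dx = samples[-1][1] - samples[0][1]
    dy = samples[-1][2] - samples[0][2]
    distance = (dx * dx + dy * dy) ** 0.5

    if distance < move_px:                      # finger barely moved
        if duration >= hold_ms:
            return "touch_and_hold"
        return "tap"
    if finger_count > 1:
        return "swipe"                          # multi-finger movement
    speed = distance / max(duration, 1)
    return "flick" if speed > 0.5 else "slide"  # fast vs. slow movement
```

A double tap would be recognized one level up, by observing two "tap" results within a short interval.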
- a pressure sensor may be further disposed on the third touch panel 333 .
- the third touch panel 333 can detect a pressure imposed by the user on the touch panel, so that the processor 380 performs more complex and accurate processing.
- the mobile device may determine a page flip direction based on a gesture direction, and then determine, based on a pressure value of a touch event, a specific quantity of pages that need to be flipped; and may also determine a change of a page flip speed based on a pressure value change, and display different animation effects on the second display panel 342 .
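The pressure-assisted page flip described above can be sketched as follows, assuming a normalized pressure value in [0.0, 1.0] and a linear pressure-to-pages mapping; both of these are illustrative assumptions.

```python
def page_flip_from_pressure(direction_dx, pressure, pages_per_unit=10):
    """Determine the flip direction from the gesture's X movement and
    the number of pages from the touch pressure, as the passage
    describes. The linear mapping is an assumed illustrative choice."""
    direction = "forward" if direction_dx > 0 else "backward"
    pages = max(1, round(pressure * pages_per_unit))
    return direction, pages
```

A rightward gesture at half pressure would flip five pages forward under these assumptions; pressing harder mid-gesture could likewise raise the flip speed shown in the animation.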
- the mobile device may determine the gesture of the touch event by using different technical solutions.
- the third touch panel 333 may collect a touch position of a finger on the third touch panel 333 once every 10 milliseconds (ms) after a touch by the user finger is detected, and send collected related data to the processor 380 .
- the processor 380 determines a gesture of the touch event based on the collected related data.
- the third touch panel 333 determines the gesture of the touch event based on collected different touch positions of the finger at different time points.
- the third touch panel 333 not only can detect touch positions of the user on the third touch panel 333 at different time points, but also can detect pressures imposed by the user on the third touch panel 333 at different time points. In this way, the mobile device 400 can more accurately determine the gesture of the touch event by using parameters collected by the third touch panel 333 , such as a time point, a touch position, and a pressure. It may be understood that the processor 380 of the mobile device 400 may further determine a gesture speed based on the foregoing related data detected by the third touch panel 333 .
- the processor 380 may invoke and execute different instructions based on different gestures. For example, when determining that the touch event is the flick gesture, the processor 380 invokes an instruction corresponding to the flick gesture.
- a result of executing the instruction is fast flipping pages of the ebook displayed on the second display panel 342 (referred to as a “fast page flip” operation), for example, fast flipping the ebook to page 8 from page 1 .
- a specific page flip speed of the page flip operation may depend on a sliding speed of the flick gesture on the third touch panel 333 .
- a rightward sliding track shown by a dotted arrow 701 is made on the third touch panel 333 by using the finger 501 of the user.
- the third touch panel 333 may send collected data (for example, a time point and a touch position) related to the sliding track to the processor 380 .
- the processor 380 determines, based on the related data, that the touch event (namely, the sliding track) is the flick gesture, and calculates a speed and/or a direction of the gesture based on the related data.
- the related data may be data shown in FIG. 6 .
- As described above with reference to FIG. 6 , the third touch panel 333 detects positions of three touch points A, B, and C of the finger at different time points t1, t2, and t3, and position coordinates of the three touch points on the third touch panel 333 may be respectively expressed as (x1, y1), (x2, y2), and (x3, y3).
- the processor 380 may calculate the speed of the flick gesture based on changes of the three touch points A, B, and C on X and Y axes. To improve efficiency of the processor 380 , the speed and/or the direction (leftward or rightward) of the gesture may be calculated based on only X-axis coordinate values x1, x2, and x3 of the three touch points and the time points t1, t2, and t3.
- the processor 380 then fast turns a currently displayed page of an ebook The Little Prince forward to page 8 (as shown by an icon 702 in FIG. 7 B ), and displays content of page 8 on the second display panel 342 .
- an animation or a like form may be used in a fast page flip process, to inform the user that fast page flip is being performed.
- a progress bar shown by an icon 705 in FIG. 7 B may be displayed on the second display panel 342 , so that the user can visually and directly perceive a rough quantity of pages that have been flipped currently and a page flip speed.
- the progress bar 705 may further display a specific page flip speed.
- When the processor 380 determines that the touch event is the swipe gesture, the processor 380 invokes an instruction corresponding to the swipe gesture.
- a result of executing the instruction is enabling the ebook displayed on the second display panel 342 to return to a catalogue page from a specific content page.
- the user uses two fingers 704 to make a rightward sliding track shown by a dotted arrow 703 on the third touch panel 333 .
- the third touch panel 333 may send collected data (for example, a time point and touch positions of the two fingers) related to the sliding track to the processor 380 .
- the processor 380 determines, based on the related data, that the touch event (namely, the sliding track) is the swipe gesture.
- the processor 380 then switches currently displayed content of an ebook The Little Prince to a catalogue (as shown by an icon 702 in FIG. 7 D ) of the ebook according to the instruction corresponding to the gesture, and displays the catalogue of the ebook on the second display panel 342 .
- an animation or a like form may be used in a process of switching to the catalogue, to inform the user that returning to the catalogue is being performed.
- the touch operation of the user on the third touch panel is detected, to determine to perform processing (for example, page flip or returning to the catalogue) on the content of the ebook displayed on the second display panel. This greatly simplifies operation steps of the user, improves transaction processing efficiency of the mobile device, and further enhances user experience.
- When the processor 380 determines that the touch event is the pinch gesture, the processor 380 invokes an instruction corresponding to the pinch gesture.
- a result of executing the instruction may be zooming in, to a given multiple, text of specific content of the ebook displayed on the second display panel 342 , to enhance a display effect of the ebook.
- When the processor 380 determines that the touch event is the unpinch gesture, the processor 380 invokes an instruction corresponding to the unpinch gesture.
- a result of executing the instruction may be zooming out, to a given level, text of specific content of the ebook displayed on the second display panel 342 , to reduce a display effect of the ebook and further save power of the mobile device.
- the second touch panel 332 is further disposed on the rear face of the mobile device. The user may directly perform, on the second touch panel 332 , a simple touch operation on an icon displayed on the second display panel 342 .
- In some other embodiments, the mobile device (for example, a mobile phone) may cooperate with a wearable device (for example, a smartwatch).
- the third touch panel 333 may not be disposed on the mobile phone 300 , but a third touch panel is disposed on the smartwatch.
- the user may perform a touch gesture operation on the third touch panel of the smartwatch.
- the touch gesture operation is compiled into an instruction, and the instruction is sent to the mobile device by using a short-range communications protocol (for example, Wi-Fi or Bluetooth) or the like.
- After receiving the instruction, the mobile device executes the instruction, so that the second display panel 342 on the rear face of the mobile device displays the specific content of the ebook. It may be understood that the mobile device may execute different instructions based on different gestures on the smartwatch, as described in the foregoing embodiments. Details are not described herein.
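The watch-to-phone exchange described above can be sketched as follows. The JSON wire format and action names are illustrative assumptions; the disclosure states only that the gesture is compiled into an instruction and sent over a short-range protocol such as Wi-Fi or Bluetooth.

```python
import json

def encode_gesture_instruction(gesture: str, speed: float = 0.0) -> bytes:
    """Watch side: serialize a detected gesture into a compact
    instruction message. The JSON framing is an assumption; a real
    link would use the Bluetooth profile's own message format."""
    return json.dumps({"gesture": gesture, "speed": speed}).encode("utf-8")

def execute_instruction(payload: bytes) -> str:
    """Phone side: decode the received message and pick the display
    action for the second display panel. Action names are illustrative."""
    msg = json.loads(payload.decode("utf-8"))
    actions = {
        "flick": "fast_page_flip",
        "swipe": "return_to_catalogue",
        "pinch": "zoom_in_text",
        "unpinch": "zoom_out_text",
    }
    return actions.get(msg["gesture"], "ignore")
```

The gesture-to-action table mirrors the mappings in the foregoing embodiments (flick for fast page flip, swipe for returning to the catalogue, pinch/unpinch for text zoom).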
- FIG. 8 is a flowchart of a method in some embodiments.
- a mobile device receives a touch operation of a user on a first touch panel 331 (step S 801 ), where the touch operation may be an operation of tapping, by the user, an icon 401 displayed on a first display panel 341 .
- the mobile device determines whether the touch operation works on an application program (step S 802 ). When the mobile device determines that the touch operation does not work on the application program, the mobile device does not respond to the touch operation (step S 812 ); or when the mobile device determines that the touch operation works on the application program, the mobile device determines a type of the application program (step S 803 ).
- When the mobile device determines that the type of the application program is a reading type, the mobile device sends a display instruction to the second display panel 342 based on the touch operation (step S 804 ), where the display instruction may specifically instruct the second display panel 342 to display a GUI of the application program corresponding to the icon touched by the user.
- Otherwise, when the type of the application program is not a reading type, the mobile device may send a display instruction to the first display panel 341 based on the touch operation (step S 810 ), to display a GUI of the application program on the first display panel 341 (step S 811 ).
- the second display panel 342 of the mobile device displays the GUI of the application program according to the instruction in step 804 (step S 805 ).
- a third touch panel 333 is activated (step S 806 ).
- the third touch panel 333 of the mobile device receives a touch event of the user (step S 807 ), where the touch event may be any gesture in the foregoing embodiments.
- the mobile device determines an instruction corresponding to the touch event, and invokes and sends the instruction to the second display panel 342 (step S 808 ).
- the second display panel 342 displays, according to the instruction, a GUI corresponding to the instruction (step S 809 ). For example, if the instruction is a page flip instruction, a page flip GUI is displayed on the second display panel 342 ; or if the instruction is an instruction used for returning to a catalogue, a GUI for returning to the catalogue is displayed on the second display panel 342 .
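The flow of steps S801 to S812 can be sketched as the following dispatch function; the boolean/string parameters and return labels are illustrative, not the disclosed interface.

```python
def handle_front_touch(touch_on_app_icon: bool, app_type: str) -> str:
    """Sketch of the FIG. 8 flow (steps S801-S812): decide where the
    application's GUI is shown after a touch on the first touch panel.
    Return values are illustrative labels."""
    if not touch_on_app_icon:
        return "no_response"              # S812: touch not on an application
    if app_type == "reading":
        # S804-S806: the rear (second) display shows the GUI and the
        # third touch panel is activated for subsequent gestures.
        return "display_on_second_panel"
    return "display_on_first_panel"       # S810-S811: all other types
```

Steps S807 to S809 (receiving a touch event on the third touch panel and updating the rear GUI) would then run only in the "display_on_second_panel" branch.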
- an embodiment provides a mobile device 900 for performing data processing.
- the technical solutions in the foregoing embodiments may be implemented by the mobile device in this embodiment.
- the mobile device 900 includes a first display panel 901 and a first touch panel 902 that are disposed on a front face of the mobile device, and a second display panel 903 and a second touch panel 904 that are disposed on a rear face of the mobile device.
- the mobile device further includes a processor 905 , a memory 906 , and a power supply management system 907 .
- the foregoing hardware may be connected by using a communications bus 908 .
- the first display panel 901 is configured to display an icon, a graphical user interface, and a widget of an application program stored in the memory 906 .
- After the first touch panel 902 receives a touch operation of the user, the processor 905 determines whether the touch operation works on an application program displayed on the first display panel 901 .
- the processor 905 determines a type of the application program.
- When the processor 905 determines that the type of the application program is reading, the processor 905 instructs the second display panel 903 to display a first graphical user interface (first GUI) of the application program, and instructs the power supply management system 907 to turn off a power supply of the first display panel 901 and activate the second display panel 903 , where the first GUI is an interface, in the application program, for displaying content of an ebook.
- when the second touch panel 904 detects a touch event of the user, the processor 905 determines, based on a speed and a direction of the touch event, an instruction corresponding to the touch event, and executes the instruction.
- a result of executing the instruction is: displaying a second graphical user interface (second GUI) of the application program on the second display panel 903 based on the speed and the direction of the touch event, where the second GUI is used to display a catalogue of the ebook.
- the first display panel 901 may be a liquid crystal display, and the second display panel 903 may be an electronic ink screen.
- a prompt box may be displayed for the user to choose to display a graphical user interface of the application program on the first display panel 341 (as shown by an icon 1201 in FIG. 12 ), or display a graphical user interface of the application program on the second display panel 342 (as shown by an icon 1202 in FIG. 12 ), or display a graphical user interface of the application program on both the first display panel 341 and the second display panel 342 (as shown by an icon 1203 in FIG. 12 ).
- when the mobile device receives a selection instruction of the user, the mobile device performs displaying according to the selection of the user.
- when the selection of the user is starting the application program on the front face, the mobile device displays the GUI of the application program only on the first display panel 341 .
- when the selection of the user is starting the application program on the rear face, the mobile device displays the GUI of the application program only on the second display panel 342 , and the power supply of the first display panel 341 may be turned off.
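The prompt-box selection logic above (front face, rear face, or both, as shown by icons 1201 to 1203) can be sketched as a small routing table. The key names are illustrative assumptions:

```python
# Assumed sketch of the FIG. 12 prompt-box routing: the user's selection
# decides which panel(s) show the application GUI; selecting the rear face
# also allows the front panel's power supply to be turned off.

def route_display(selection: str) -> dict:
    """Return display targets and front-panel power state for a selection."""
    targets = {
        "front": {"first_panel": True,  "second_panel": False, "first_panel_power": True},
        "rear":  {"first_panel": False, "second_panel": True,  "first_panel_power": False},
        "both":  {"first_panel": True,  "second_panel": True,  "first_panel_power": True},
    }
    if selection not in targets:
        raise ValueError(f"unknown selection: {selection}")
    return targets[selection]
```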
- when the selection of the user is displaying on both the front face and the rear face, the graphical user interfaces shown in FIG. 10 A and FIG. 10 B may be displayed, that is, related controls are displayed on the front face (the first display panel 341 ) of the mobile phone, and specific content of the ebook is displayed on the rear face (the second display panel 342 ) of the mobile phone.
- the user may control displayed content of the ebook on the rear face by using some controls (shown in FIG. 10 A ) on the front face of the mobile phone or a third touch panel on a side face of the mobile phone, so that the user can share related content with another person.
- the term “if” used in the embodiments may be interpreted as a meaning of “when” or “after” or “in response to determining” or “in response to detecting”.
- the phrase “if it is determined that” or “if (a stated condition or event) is detected” may be interpreted as a meaning of “when it is determined that” or “in response to determining” or “when (a stated condition or event) is detected” or “in response to detecting (a stated condition or event)”.
- All or some of the foregoing embodiments may be implemented by using software, hardware, firmware, or any combination thereof.
- the embodiments may be implemented completely or partially in a form of a computer program product.
- the computer program product includes one or more computer instructions.
- the computer may be a general-purpose computer, a dedicated computer, a computer network, or another programmable apparatus.
- the computer instructions may be stored in a computer-readable storage medium or may be transmitted from a computer-readable storage medium to another computer-readable storage medium.
- the computer instructions may be transmitted from a website, computer, server, or data center to another website, computer, server, or data center in a wired (for example, a coaxial cable, an optical fiber, or a digital subscriber line) or wireless (for example, infrared, radio, or microwave) manner.
- the computer-readable storage medium may be any usable medium accessible by a computer, or a data storage device, such as a server or a data center, integrating one or more usable media.
- the usable medium may be a magnetic medium (for example, a floppy disk, a hard disk, or a magnetic tape), an optical medium (for example, a DVD), a semiconductor medium (for example, a solid state disk), or the like.
Abstract
Performing data processing on a mobile device, where the mobile device includes a first display, a second display, and a second touch panel. The first display is configured to: display a home screen that comprises an icon of a first application; and display iconic controls of the first application in response to detecting a first gesture on the icon of the first application. The second display is configured to display a first GUI of the first application in response to detecting the first gesture. The second touch panel is configured to detect a second gesture on the second touch panel, and a second GUI of the first application is displayed on the second display in response to detecting the second gesture, where the iconic controls of the first application remain displayed on the first display while the second GUI is displayed on the second display.
Description
- This is a continuation of U.S. patent application Ser. No. 17/130,317 filed on Dec. 22, 2020, which is a continuation of U.S. patent application Ser. No. 16/481,417 filed on Jul. 26, 2019, now U.S. Pat. No. 10,908,868, which is a national stage entry of Int'l Patent App. No. PCT/CN2017/075633, filed on Mar. 3, 2017, which claims priority to Int'l Patent App. No. PCT/CN2017/073173, filed on Feb. 9, 2017 and Chinese Patent App. No. 201710061746.7, filed on Jan. 26, 2017, all of which are incorporated by reference.
- The disclosure relates to the field of data processing, and in particular, to a data processing method and a mobile device.
- With popularity of the mobile Internet, a growing quantity of users are accustomed to reading ebooks on mobile devices (such as mobile phones, tablet computers, and ebook readers). When reading an ebook on a mobile device, a user needs to perform page flip. Page flip is usually implemented by the user by touching a screen of the mobile device with a finger, to turn to a next page or return to a previous page. To enable the ebook to accurately jump to a particular chapter that the user desires to read, the user needs to perform many complex operations. As shown in
FIG. 1 , in a Kindle ebook reader 100 manufactured by Amazon US, if a user needs to jump to a particular chapter, the user needs to touch and hold, by using a finger, a circle dot 101 on a progress bar 102 displayed on a screen 103 , and make the finger slide leftwards or rightwards along the progress bar 102 , so as to jump to the specific chapter required by the user. It can be learned from above that such page flip and jumping operations are very complex and it is difficult to accurately locate a chapter in a book. This greatly reduces transaction processing efficiency of the mobile device. - To resolve the foregoing technical problem, embodiments provide a data processing method and a mobile device, to reduce operation steps of a user, improve transaction processing efficiency of a mobile device, and further enhance user experience.
- According to a first aspect, an embodiment provides a method for performing data processing on a mobile device. The mobile device includes a touch display screen disposed on a front face of the mobile device, and a low power display screen and a touch panel that are disposed on a rear face of the mobile device, the touch display screen is a capacitive touchscreen, and the low power display screen is an electronic ink screen. The method includes: detecting, by the touch display screen, a touch operation of a user; determining, by the mobile device in response to the touch operation, whether the touch operation works on an application program of a reading type; when the mobile device determines that the touch operation works on the application program of a reading type, instructing, by the mobile device, the low power display screen to display a first graphical user interface (first GUI) of the application program, and instructing to turn off a power supply of the touch display screen and activate the touch panel, where the first GUI is an interface, in the application program, for displaying content of an ebook; and when the touch panel detects a touch event of the user, determining, by the mobile device based on a speed and a direction of the touch event, an instruction corresponding to the touch event, and executing the instruction, where a result of executing the instruction is: displaying a second graphical user interface (second GUI) of the application program on the low power display screen based on the speed and the direction of the touch event, where the second GUI is used to display a catalogue of the ebook. In this way, the low power display screen and the touch panel are added on the rear face of the mobile device, and the touch event of the user on the touch panel is detected to execute the corresponding instruction, so that different graphical user interfaces are displayed for the ebook on the low power display screen. 
This simplifies operation steps when the user reads the ebook, and also improves transaction processing efficiency of the mobile device.
- In some embodiments, the displaying a second graphical user interface of the application program on the low power display screen may be specifically: displaying a part of the catalogue of the ebook on the low power display screen based on the speed and the direction of the gesture.
- According to a second aspect, an embodiment provides a method for performing data processing on a mobile device. The mobile device includes a first touch display screen disposed on a front face of the mobile device, and a second touch display screen and a touch panel that are disposed on a rear face of the mobile device. The method includes: when the first touch display screen detects a touch operation of a user, determining, by the mobile device, whether the touch operation works on an application program; when determining that the touch operation works on the application program, determining, by the mobile device, a type of the application program; when the mobile device determines that the type of the application program is reading, instructing, by the mobile device, the second touch display screen to display a first graphical user interface of the application program; and when the touch panel detects a touch event of the user, determining, by the mobile device based on the touch event, an instruction corresponding to the touch event, and executing the instruction, where a result of executing the instruction is: displaying a second graphical user interface of the application program on the second touch display screen based on the touch event.
- In some embodiments, the determining, by the mobile device based on the touch event, an instruction corresponding to the touch event, and executing the instruction may be specifically: detecting, by the touch panel at different time points t1, t2, and t3, that positions of three touch points A, B, and C in the touch event are (x1, y1), (x2, y2), and (x3, y3); determining, by the mobile device, a speed and/or a direction of the touch event based on X-axis coordinate values x1, x2, and x3 of the three touch points A, B, and C and the time points t1, t2, and t3; and performing, by the mobile device based on the speed and/or the direction, a fast page flip operation on content of an ebook displayed on the first graphical user interface, and displaying the second graphical user interface of the fast page flip operation on the second touch display screen.
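The three-point speed and direction computation described above can be sketched as follows. The sample values, the pixels-per-page heuristic, and the function names are assumptions for illustration, not the patent's implementation:

```python
# Sketch: the touch panel samples three touch points A, B, C at times t1, t2,
# t3, and the device derives speed and direction from the X-axis coordinates,
# as described for the fast page flip operation.

def swipe_speed_and_direction(points, times):
    """points: [(x1, y1), (x2, y2), (x3, y3)]; times: [t1, t2, t3] in seconds."""
    (x1, _), _, (x3, _) = points
    dt = times[2] - times[0]
    if dt <= 0:
        raise ValueError("time points must be increasing")
    speed = abs(x3 - x1) / dt          # pixels per second along the X axis
    direction = "right" if x3 > x1 else "left"
    return speed, direction

def pages_to_flip(speed, pixels_per_page=200.0):
    """Map swipe speed to a fast-page-flip count (illustrative heuristic)."""
    return max(1, int(speed / pixels_per_page))
```

A faster or longer swipe yields a higher speed and therefore flips more pages at once; the direction decides whether the flip goes forward or backward.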
- In some embodiments, the second graphical user interface may be used to display a catalogue of the ebook.
- In some other embodiments, the touch event is a gesture of sliding on the touch panel by a single finger, and the displaying a second graphical user interface of the application program on the second touch display screen based on the touch event may be specifically: displaying a fast page flip graphical user interface of the application program on the second touch display screen based on a speed and a direction of the sliding gesture.
- In some other embodiments, a
processor 380 may alternatively instruct a first display panel 341 to display some graphical user interfaces of the application program, and other graphical user interfaces are displayed on a second display panel 342 . In this way, a user can focus more on content of interest, and a screen splitting function can be implemented. It may be understood that the graphical user interfaces displayed on the first display panel 341 may specifically be iconic controls of the application program, and the other graphical user interfaces displayed on the second display panel 342 may mainly be substantial content of the application program. For example, if the application program is “reading”, content of an ebook that is being read is displayed on the second display panel 342 , and some controls such as “page up” and “page down” that are not related to substantial content of the ebook are displayed on the first display panel 341 . - In some other embodiments, the mobile device further turns off a power supply of the first touch display screen while the mobile device instructs the second touch display screen to display the first graphical user interface of the application program.
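The screen-splitting arrangement described above can be sketched as a simple partition of interface elements. The element names and the "control"/"content" labels are illustrative assumptions:

```python
# Assumed sketch of the screen-splitting function: iconic controls go to the
# front panel, substantial ebook content goes to the rear panel.

def split_interface(elements):
    """elements: list of (name, kind) pairs, kind being 'control' or 'content'."""
    front = [name for name, kind in elements if kind == "control"]
    rear = [name for name, kind in elements if kind == "content"]
    return {"first_display_panel": front, "second_display_panel": rear}
```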
- According to a third aspect, an embodiment provides a mobile device for performing data processing. The mobile device includes a first display panel and a first touch panel that are disposed on a front face of the mobile device, and a second display panel and a second touch panel that are disposed on a rear face of the mobile device; and the mobile device further includes a processor, a memory, and a power supply management system. The first display panel is configured to display an icon, a graphical user interface, and a component of an application program; when the first touch panel detects a touch operation of a user, the processor determines whether the touch operation works on an application program displayed on the first display panel; when the processor determines that the touch operation works on the application program, the processor determines a type of the application program; when the processor determines that the type of the application program is reading, the processor instructs the second display panel to display a first graphical user interface of the application program, and instructs the power supply management system to turn off a power supply of the first display panel and activate the second display panel, where the first graphical user interface is an interface, in the application program, for displaying content of an ebook; and when the second touch panel detects a touch event of the user, the processor determines, based on a speed and a direction of the touch event, an instruction corresponding to the touch event, and executes the instruction, where a result of executing the instruction is: displaying a second graphical user interface of the application program on the second display panel based on the speed and the direction of the touch event, where the second graphical user interface is used to display a catalogue of the ebook.
- In some embodiments, the first display panel is a liquid crystal display, and the second display panel is an electronic ink screen. In this way, the first display panel may be mainly configured for daily operations of the user, and the second display panel is a display panel of a low power material, and therefore may be configured to read an ebook.
- In some other embodiments, the mobile device may be a mobile phone or a tablet computer.
- In the foregoing embodiments, the touch operation of the user on the touch panel is detected, to determine to perform processing (for example, page flip or returning to the catalogue) on the content of the ebook displayed on the second display panel. This greatly simplifies operation steps of the user, improves transaction processing efficiency of the mobile device, and further enhances user experience.
- According to a fourth aspect, an embodiment provides a method for performing data processing on a mobile device. The mobile device includes a first touch display screen disposed on a front face of the mobile device, and a second touch display screen and a touch panel that are disposed on a rear face of the mobile device, the first touch display screen may be a capacitive touchscreen, and the second touch display screen may be an electronic ink screen. The method may specifically include: detecting a touch operation of a user on the first touch display screen; displaying, in response to the touch operation, a control for an application program on the first touch display screen; and in response to a touch operation performed by the user on the control, displaying a first graphical user interface of the application program on the first touch display screen, and displaying a second graphical user interface of the application program on the second touch display screen, where the first graphical user interface includes a plurality of controls, the second graphical user interface displays content of the application program, and the plurality of controls are used to control the content of the application program.
- In some other embodiments, the application program is an application program of a reading type, and the content of the application program may be an ebook.
- According to a fifth aspect, an embodiment provides a method for performing data processing on a mobile device. The mobile device includes a first touch display screen disposed on a front face of the mobile device, and a second touch display screen and a touch panel that are disposed on a rear face of the mobile device, the first touch display screen is a capacitive touchscreen, and the second touch display screen is an electronic ink screen. The method includes: detecting a touch operation of a user on the first touch display screen; displaying, in response to the touch operation, a control for an application program on the first touch display screen; and in response to a touch operation performed by the user on the control, displaying a first graphical user interface of the application program on the first touch display screen, and displaying a second graphical user interface of the application program on the second touch display screen, where the first graphical user interface includes a plurality of controls, the second graphical user interface displays content of the application program, and the plurality of controls are used to control the content of the application program.
- In some other embodiments, the application program is an application program of a reading type, and the content of the application program is an ebook.
- It should be understood that descriptions about the technical features, technical solutions, and advantages or similar descriptions in this specification do not indicate that all the features and advantages may be implemented in any single embodiment. In contrast, it may be understood that descriptions about the features or advantages mean that a particular technical feature, technical solution, or advantage is included in at least one embodiment. Therefore, the descriptions about the technical features, technical solutions, or advantages in this specification do not necessarily indicate a same embodiment. Further, the technical features, technical solutions and advantages described in the following embodiments may be combined in any proper manner. A person skilled in the art should understand that an embodiment may be implemented without one or more particular technical features, technical solutions or advantages.
-
FIG. 1 is a schematic diagram of a user interface of a page flip or a page jump in the prior art; -
FIG. 2A is a schematic structural diagram of a front face of a mobile phone in some embodiments; -
FIG. 2B is a schematic structural diagram of a rear face of a mobile phone in some embodiments; -
FIG. 3 is a schematic diagram of a hardware structure of a mobile phone in some embodiments; -
FIG. 4 is a schematic diagram of a graphical user interface displayed by a first display panel 341 on a front face of a mobile phone in some embodiments; -
FIG. 5 is a schematic diagram of a graphical user interface displayed by a second display panel 342 on a rear face of a mobile phone in some embodiments; -
FIG. 6 is a schematic diagram of a track of a touch event on a third touch panel 333 in some embodiments; -
FIG. 7A to FIG. 7D are schematic diagrams of some graphical user interfaces displayed by a second display panel 342 on a rear face of a mobile device in some embodiments; -
FIG. 8 is a schematic flowchart of a method in some embodiments; -
FIG. 9 is a schematic structural diagram of a mobile device in some embodiments; -
FIG. 10A is a schematic diagram of a graphical user interface displayed by a first display panel on a front face of a mobile device in some embodiments; -
FIG. 10B is a schematic diagram of a graphical user interface displayed by a second display panel on a rear face of a mobile device in some embodiments; -
FIG. 11A and FIG. 11B are schematic diagrams of a position of a third touch panel in some other embodiments; and -
FIG. 12 is a schematic diagram of another graphical user interface displayed by a first display panel on a front face of a mobile device in some embodiments. - A mobile device in the following embodiments may be any device having a wireless communication function, for example, may be a wearable electronic device (for example, a smartwatch) having the wireless communication function, may be a
mobile phone 300 shown in FIG. 3 , or may be a tablet computer. No special limitation is imposed on a specific form of the mobile device in the following embodiments. - In the following embodiment, a mobile phone is used as an example to describe how a mobile device implements a specific technical solution in this embodiment. As shown in
FIG. 3 , the mobile device in this embodiment may be a mobile phone 300 . FIG. 2A and FIG. 2B are schematic appearance diagrams of the mobile phone 300 . FIG. 2A is a schematic diagram of a front face of the mobile phone 300 . FIG. 2B is a schematic diagram of a rear face of the mobile phone 300 . The following uses the mobile phone 300 as an example to specifically describe this embodiment. - It should be understood that the
mobile phone 300 in the figure is merely an example of the mobile device, and the mobile phone 300 may have more or fewer components than those shown in the figure, a combination of two or more components, or components disposed differently. The components shown in the figure may be implemented by hardware, software, or a combination of hardware and software, where the hardware, the software, or the combination of hardware and software includes one or more signal processing and/or application-specific integrated circuits. - As shown in
FIG. 3 , the mobile phone 300 includes components such as an RF (radio frequency) circuit 310 , a memory 320 , an input unit 330 , a display unit 340 , a sensor 350 , an audio frequency circuit 360 , a Wi-Fi module 370 , a processor 380 , and a power supply 390 . A person skilled in the art may understand that a structure of the mobile phone shown in FIG. 3 does not constitute a limitation on the mobile phone, and the mobile phone may include more or fewer components than those shown in the figure, a combination of some components, or components disposed differently. - The following specifically describes each component of the
mobile phone 300 with reference to FIG. 3 . - The
RF circuit 310 may be configured to receive and transmit information, or receive and send signals in a call process. The RF circuit 310 may receive downlink information of a base station and then provide the received downlink information for the processor 380 for processing, and send uplink data to the base station. Usually, the RF circuit includes but is not limited to devices such as an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, and a duplexer. In addition, the RF circuit 310 may further communicate with a network and another mobile device through wireless communication. The wireless communication may use any communications standard or protocol, including but not limited to a Global System for Mobile Communications, a general packet radio service, code division multiple access, wideband code division multiple access, Long Term Evolution, an email, a short message service, and the like. - The
memory 320 may be configured to store a software program and data. The processor 380 runs the software program and the data that are stored in the memory 320 , to perform various functions of the mobile phone 300 and process data. The memory 320 may mainly include a program storage area and a data storage area. The program storage area may store an operating system, an application program required by at least one function (for example, a sound playback function and an image playback function), and the like. The data storage area may store data (for example, audio data and a phone book) created based on use of the mobile phone 300 , and the like. In addition, the memory 320 may include a high-speed random access memory, and may further include a non-volatile memory, for example, at least one disk storage device, a flash memory device, or another non-volatile solid-state storage device. In the following embodiments, the memory 320 stores an operating system that enables the mobile phone 300 to run, for example, an iOS® operating system developed by Apple Inc., an Android® open source operating system developed by Google Inc., and a Windows® operating system developed by Microsoft Corporation. - The input unit 330 (namely, a touchscreen) may be configured to receive entered numeral or character information, and generate signal input related to user setting and function control of the
mobile phone 300 . Specifically, the input unit 330 may include a first touch panel 331 disposed on a front face of the mobile phone 300 . The first touch panel 331 is also referred to as a first touchscreen, and may collect a touch operation of a user on or near the first touch panel 331 (such as an operation performed by the user on the first touch panel 331 or near the first touch panel 331 by using a finger or any proper object or accessory such as a stylus), and drive a corresponding connection apparatus according to a preset program. Optionally, the first touch panel 331 may include two parts: a touch detection apparatus and a touch controller (not shown in the figure). The touch detection apparatus detects a touch orientation of the user, detects a signal brought by the touch operation, and sends the signal to the touch controller. The touch controller receives touch information from the touch detection apparatus, converts the touch information into touch point coordinates, and sends the touch point coordinates to the processor 380 , and can receive and execute an instruction sent by the processor 380 . In addition, the first touch panel 331 may be implemented in various types, such as a resistive type, a capacitive type, an infrared type, and a surface acoustic wave type. In addition to the first touch panel 331 , the input unit 330 may further include a second touch panel 332 (also referred to as a second touchscreen) disposed on a rear face of the mobile phone 300 . In other words, the first touch panel 331 and the second touch panel 332 are disposed on two opposite faces of the mobile phone 300 . Specifically, the second touch panel 332 may use a same structure as the first touch panel 331 . Details are not described herein again. - The display unit 340 (namely, a display screen) may be configured to display information entered by the user or information provided for the user, and graphical user interfaces (referred to as GUIs in the following) of various menus of the
mobile phone 300 . The display unit 340 may include a first display panel 341 (also referred to as a first display screen) disposed on the front face of the mobile phone 300 , and a second display panel 342 (also referred to as a second display screen) disposed on the rear face of the mobile phone 300 . In other words, the first display panel 341 and the second display panel 342 are disposed on the two opposite faces of the mobile phone 300 . The first display panel 341 may be configured by using a liquid crystal display, an organic light-emitting diode, or a like form. The second display panel 342 may be a screen made by using an electronic paper display technology or another low power display material, for example, an electronic ink screen. Therefore, the second display panel 342 may be used to read an ebook, a magazine, and the like. Certainly, the second display panel 342 may alternatively use a same display material as the first display panel 341 . - As shown in
FIG. 2A and FIG. 2B , in some embodiments, the mobile phone 300 includes the front face A and the rear face B. On the front face A, three optical touch keys are disposed, and the first touch panel 331 and the first display panel 341 are further disposed. The first touch panel 331 covers the first display panel 341 . After detecting a touch operation on or near the first touch panel 331 , the first touch panel 331 transfers the touch operation to the processor 380 , to determine a touch event. After that, the processor 380 provides corresponding visual output on the first display panel 341 based on a type of the touch event. Although the first touch panel 331 and the first display panel 341 are used as two separate components to implement input and output functions of the mobile phone 300 in FIG. 3 , in some embodiments, the first touch panel 331 and the first display panel 341 may be integrated to implement the input and output functions of the mobile phone 300 . The integrated first touch panel 331 and first display panel 341 may be referred to as a first touch display screen. On the rear face B, the second touch panel 332 and the second display panel 342 are disposed, and the second touch panel 332 covers the second display panel 342 . Functions of the second touch panel 332 and the second display panel 342 are similar to those of the first touch panel 331 and the first display panel 341 . In some embodiments, on the rear face B of the mobile phone 300 , a third touch panel 333 may be further included. The third touch panel 333 may not overlap with the second touch panel 332 or the second display panel 342 (as shown in FIG. 5 ). In some other embodiments, the third touch panel 333 may alternatively be configured on a side face of the mobile phone 300 , as shown in FIG. 11A and FIG. 11B . The third touch panel 333 may be in a strip shape and is applicable to the narrow side face. In this way, the display panel on the rear face of the mobile phone may be made larger. 
It may be understood that the third touch panel 333 on the side face may alternatively be integrated with a volume key. After detecting a touch operation on or near the third touch panel 333 , the third touch panel 333 transfers the touch operation to the processor 380 , to determine a type of a touch event. After that, the processor 380 provides corresponding visual output on the first display panel 341 and/or the second display panel 342 based on the touch event. In some embodiments, the second touch panel 332 and the second display panel 342 may be integrated to implement the input and output functions of the mobile phone 300 . The integrated second touch panel 332 and second display panel 342 may be referred to as a second touch display screen. - In some other embodiments, pressure sensors may be further disposed on the
first touch panel 331, the second touch panel 332, and the third touch panel 333. In this way, when the user performs a touch operation on the foregoing touch panels, the touch panels can further detect a pressure of the touch operation, so that the mobile phone 300 can detect the touch operation more accurately. - The
mobile phone 300 may further include at least one sensor 350, for example, a light sensor, a motion sensor, and another sensor. Specifically, the light sensor may include an ambient light sensor and a proximity sensor. As shown in FIG. 2A, an ambient light sensor 351 is capable of adjusting a luminance of the first display panel 341 and/or the second display panel 342 based on brightness of ambient light. A proximity sensor 352 is disposed on the front face of the mobile phone 300. When the mobile phone 300 moves to an ear, based on detection by the proximity sensor 352, the mobile phone 300 turns off a power supply of the first display panel 341, and may also turn off a power supply of the second display panel 342 at the same time. In this way, the mobile phone 300 may further save power. As one type of motion sensor, an accelerometer sensor is capable of detecting a magnitude of an acceleration in each direction (usually three axes), and may detect, in a static state, a magnitude and a direction of gravity. The accelerometer sensor may be applied to a posture recognition application (for example, screen switching between a landscape mode and a portrait mode, a related game, or magnetometer posture calibration), a vibration recognition-related function (for example, a pedometer and tapping), and the like. Other sensors that may further be disposed in the mobile phone 300, such as a gyroscope, a barometer, a hygrometer, a thermometer, or an infrared sensor, are not described herein. - The
audio frequency circuit 360, a speaker 361, and a microphone 362 may provide an audio interface between the user and the mobile phone 300. The audio frequency circuit 360 may transmit, to the speaker 361, a signal converted from received audio data, and the speaker 361 converts the signal into a sound signal for output. In addition, the microphone 362 converts a collected sound signal into an electrical signal, and the audio frequency circuit 360 receives the electrical signal, converts the electrical signal into audio data, and then outputs the audio data to the RF circuit 310 to send the audio data to, for example, another mobile phone, or outputs the audio data to the memory 320 for further processing. - Wi-Fi is a short-range wireless transmission technology. By using the Wi-Fi module 370, the mobile phone 300 may allow the user to receive and send an email, browse a web page, access streaming media, and the like. The Wi-Fi module 370 provides wireless broadband Internet access for the user. - The
processor 380 is a control center of the mobile phone 300, and is connected to all components of the entire mobile phone by using various interfaces and lines. The processor 380 runs or executes the software program stored in the memory 320 and invokes the data stored in the memory 320, to perform the functions of the mobile phone 300 and process data, so as to perform overall monitoring on the mobile phone. In some embodiments, the processor 380 may include one or more processing units. An application processor and a modem processor may further be integrated in the processor 380. The application processor primarily processes an operating system, a user interface, an application program, and the like, and the modem processor primarily processes wireless communication. It may be understood that the modem processor may not be integrated into the processor 380. - A
Bluetooth module 381 is configured to exchange information with another device by using Bluetooth, a short-range communications protocol. For example, the mobile phone 300 may establish, by using the Bluetooth module 381, a Bluetooth connection to a wearable electronic device (for example, a smartwatch) that also has a Bluetooth module, to exchange data. - The
mobile phone 300 further includes the power supply 390 (for example, a battery) that supplies power to each component. The power supply may be logically connected to the processor 380 by using a power supply management system, to implement functions such as charging management, discharging management, and power consumption management. It may be understood that, in the following embodiments, the power supply 390 may include two power supplies: one power supply is mainly configured to supply power to the first display panel 341 and the first touch panel 331, and the other power supply is mainly configured to supply power to the second display panel 342, the second touch panel 332, and the third touch panel 333. - The
mobile phone 300 may further include a front-facing camera 353, a rear-facing camera 354, and the like. Details are not described herein. - All methods in the following embodiments may be implemented in the
mobile phone 300 having the foregoing hardware structure. - As shown in
FIG. 4, there is a first display panel 341 on a front face of a mobile device 400 (for example, a mobile phone 300). The first display panel 341 displays icons of various application programs. For example, an icon 401 represents an application program named reading, an icon 403 represents system settings, and an icon 404 represents Facebook, a popular social application program. There is a second display panel 342 on a rear face of the mobile device 400. A display material used by the second display panel 342 may be different from that used by the first display panel 341. Specifically, the first display panel 341 may use a light-emitting diode material commonly used in prior-art mobile devices, and the second display panel 342 may be a low power-consumption electronic ink screen or the like. In some other embodiments, a second touch panel 332 may further be disposed on the mobile device. A material used by the second touch panel 332 may be the same as or similar to that used by a first touch panel 331, and the second touch panel 332 may also cover the second display panel 342. The mobile device 400 further includes a third touch panel 333. After detecting a touch operation on or near the third touch panel 333, the third touch panel 333 transfers the touch operation to a processor 380 to determine a gesture type of the touch operation. The processor 380 then provides corresponding visual output on the first display panel 341 and/or the second display panel 342 based on the gesture type. - In some embodiments, when a user touches or approaches the
icon 401 on the first display panel 341 by using a finger 402, the first touch panel 331 of the mobile device 400 detects a touch event on or near the first touch panel 331, and transfers the touch event to the processor 380 to determine an instruction corresponding to the touch event. The processor 380 then invokes, according to the instruction, an application program stored in a memory 320, and instructs the first display panel 341 to display a graphical user interface (GUI) of the application program (reading), so that the user can perform a specific operation. - It may be understood that the
processor 380 may also instruct the second display panel 342 to display the graphical user interface of the application program. In other words, when the user touches the icon 401 on the front face of the mobile device 400, the GUI of the application program (reading) corresponding to the icon 401 is displayed on the second display panel 342 on the rear face of the mobile device 400, so that the user performs reading on the low-power second display panel 342. This also saves power of the mobile device 400. In some other embodiments, the processor 380 may alternatively instruct the first display panel 341 to display some GUIs of the application program, while other GUIs are displayed on the second display panel 342. In this way, the user can focus more on content of interest, and a screen splitting function can be implemented. It may be understood that the GUIs displayed on the first display panel 341 may specifically be iconic controls of the application program, and the GUIs displayed on the second display panel 342 may mainly be substantial content of the application program. For example, as shown in FIG. 10A, when the user touches or approaches the icon 401 on the first display panel 341 by using the finger 402, if an application program corresponding to the icon is "reading", some controls such as "page up" and "page down" that are not related to substantial content of an ebook (for example, controls such as an icon 1001, an icon 1002, and an icon 1003 representing a catalogue of the ebook) are displayed on the first display panel 341. In addition, as shown in FIG. 10B, specific content 1004 of the ebook in the "reading" application program is displayed on the second display panel 342. - In some other embodiments, when the GUI of the application program is displayed on the
second display panel 342 on the rear face of the mobile device 400, the processor 380 may send, to a power supply management system in a power supply 390, an instruction for interrupting a power supply of the first display panel 341. After receiving the instruction, the power supply management system turns off the power supply of the first display panel 341, so that the first display panel 341 is in a screen-off state. Alternatively, the processor 380 may send, to the power supply management system, an instruction for putting the first display panel 341 to sleep. After receiving the instruction, the power supply management system puts the first display panel 341 to sleep, so that the first display panel 341 is in a screen-off state. This can further save power of the mobile device 400. - In some other embodiments, the mobile device may alternatively display a GUI of the application program on the
second display panel 342 only when determining that the application program is an application program for reading an ebook or the like. In other words, the second display panel 342 may allow displaying of only an application program of a specific type, for example, an application program of a reading type. The mobile device may determine a type of an application program in a plurality of manners, for example, based on a category attribute (for example, game, reading, or communication) of the application program. In this embodiment of this application, the mobile device displays a GUI of an application program on the second display panel 342 only when determining that the application program corresponding to an icon tapped by a user is of the reading type. If determining that the application program corresponding to the icon tapped by the user is not of the reading type, the mobile device displays the GUI of the application program on the first display panel 341, to facilitate a further operation of the user. - When there is content (for example, the GUI of the "reading" application program) displayed on the
second display panel 342, the third touch panel 333 is activated to receive a touch operation of the user. That the third touch panel 333 is activated may specifically mean that the third touch panel 333 is powered on by the power supply 390 and can detect the touch operation of the user on or near the third touch panel 333. It may be understood that the third touch panel 333 may alternatively be activated only when a particular GUI is displayed on the second display panel 342. For example, the third touch panel 333 is activated only when the second display panel 342 displays content of a particular chapter of an ebook (for example, the chapters displayed in FIG. 7A and FIG. 7B) in the "reading" application program, so that the mobile device saves more power. - As shown in
FIG. 5, when a touch event of a finger 501 of the user is detected on the third touch panel 333, the processor 380 invokes, based on the touch event, an instruction corresponding to the touch event, and executes the instruction. - Generally, the touch event may be a rightward (or leftward) sliding gesture or the like made by the user on the
third touch panel 333. After collecting the touch event, the third touch panel 333 sends data related to the touch event to the processor 380. Based on the received data, the processor 380 determines and invokes the instruction corresponding to the touch event. If the instruction corresponding to the touch event is an instruction used for fast page flip, a result of executing the instruction is fast turning the ebook displayed on the second display panel 342 to a particular page and displaying that page. In some embodiments, the processor 380 may further calculate a speed of the touch event based on the related data of the touch event, and then determine, based on the speed, a specific speed for flipping pages of the displayed ebook. For example, a faster sliding gesture leads to a faster page flip speed, in other words, more pages are flipped; conversely, a slower sliding gesture leads to a slower page flip speed, in other words, fewer pages are flipped. In some other embodiments, the processor 380 may alternatively calculate a sliding track length of the touch event on the third touch panel 333 based on the related data of the touch event, and determine, based on the calculated length, a specific speed for flipping pages of the displayed ebook. For example, a longer sliding track leads to a faster page flip speed, in other words, more pages are flipped. - As shown in
FIG. 6, the third touch panel 333 detects positions of three touch points A, B, and C of the finger at different time points t1, t2, and t3. Position coordinates of the three touch points on the third touch panel 333 may be respectively expressed as (x1, y1), (x2, y2), and (x3, y3). The processor 380 may determine a speed or an acceleration of the touch event in an X-axis direction based on the different time points and the position coordinates of the three touch points, so as to determine, based on the calculated speed or acceleration, the speed for flipping pages of the displayed ebook. It may be understood that a person skilled in the art may also determine a specific gesture of a touch event by using another technical solution. This is not limited in this embodiment. - In some other embodiments, the
processor 380 may first determine, based on the touch event, a gesture corresponding to the touch event, and then execute a corresponding instruction based on the determined gesture. - Specifically, the touch event may be a flick gesture, that is, slightly tapping a touch panel (for example, the
first touch panel 331 or the second touch panel 332) with a single finger, quickly sliding, and then quickly leaving the touch panel, for example, to scroll a screen up or down or switch to a left or right picture. The touch event may alternatively be a slide gesture, that is, slightly tapping a touch panel with a single finger, keeping the finger in contact with the touch panel, and then moving, for example, sliding to unlock. The touch event may alternatively be a swipe gesture, that is, touching a touch panel with a plurality of fingers, keeping the fingers in contact with the touch panel, and then moving, for example, pinching with three fingers to return to a home screen. The touch event may alternatively be a tap gesture, that is, slightly tapping a touch panel with a single finger and then leaving the touch panel immediately. The touch event may alternatively be a double tap gesture, that is, performing a tap gesture twice within an extremely short time. The touch event may alternatively be a touch & hold gesture, that is, slightly tapping a touch panel with a finger and keeping the finger resting on the touch panel. The touch event may alternatively be a drag gesture, that is, slightly tapping a touch panel with a finger and slowly moving the finger without leaving the touch panel (usually to a determined target position, for example, dragging a file to a trash box to delete the file). The touch event may alternatively be a pinch gesture, that is, pinching inward on a touch panel with two fingers (usually a thumb and an index finger). The touch event may alternatively be an unpinch gesture, that is, stretching outward on a touch panel with two fingers (usually a thumb and an index finger). It may be understood that, in addition to the foregoing listed gestures, the touch event may further be a gesture in another form. A form of the touch event is not limited in this embodiment. - In some other embodiments, a pressure sensor may be further disposed on the
third touch panel 333. In this way, the third touch panel 333 can detect a pressure imposed by the user on the touch panel, so that the processor 380 performs more complex and accurate processing. For example, the mobile device may determine a page flip direction based on a gesture direction, and then determine, based on a pressure value of a touch event, a specific quantity of pages that need to be flipped; it may also determine a change of a page flip speed based on a pressure value change, and display different animation effects on the second display panel 342. - The mobile device may determine the gesture of the touch event by using different technical solutions. Specifically, the
third touch panel 333 may collect a touch position of a finger on the third touch panel 333 once every 10 milliseconds (ms) after a touch by the user's finger is detected, and send the collected related data to the processor 380. The processor 380 determines a gesture of the touch event based on the collected related data, for example, based on the different touch positions of the finger collected at different time points. It may be understood that, if the pressure sensor is disposed on the third touch panel 333, the third touch panel 333 not only can detect touch positions of the user on the third touch panel 333 at different time points, but also can detect pressures imposed by the user on the third touch panel 333 at different time points. In this way, the mobile device 400 can more accurately determine the gesture of the touch event by using parameters collected by the third touch panel 333, such as a time point, a touch position, and a pressure. It may be understood that the processor 380 of the mobile device 400 may further determine a gesture speed based on the foregoing related data detected by the third touch panel 333. - The
processor 380 may invoke and execute different instructions based on different gestures. For example, when determining that the touch event is the flick gesture, the processor 380 invokes an instruction corresponding to the flick gesture. A result of executing the instruction is fast flipping pages of the ebook displayed on the second display panel 342 (referred to as a "fast page flip" operation), for example, fast flipping the ebook from page 1 to page 8. It may be understood that a specific page flip speed of the page flip operation may depend on a sliding speed of the flick gesture on the third touch panel 333. As shown in FIG. 7A and FIG. 7B, a rightward sliding track shown by a dotted arrow 701 is made on the third touch panel 333 by using the finger 501 of the user. According to the technical solution in the foregoing embodiments, the third touch panel 333 may send collected data (for example, a time point and a touch position) related to the sliding track to the processor 380. The processor 380 determines, based on the related data, that the touch event (namely, the sliding track) is the flick gesture, and calculates a speed and/or a direction of the gesture based on the related data. The related data may be the data shown in FIG. 6. According to descriptions in the foregoing embodiments, the third touch panel 333 detects positions of three touch points A, B, and C of the finger at different time points t1, t2, and t3. Position coordinates of the three touch points on the third touch panel 333 may be respectively expressed as (x1, y1), (x2, y2), and (x3, y3). The processor 380 may calculate the speed of the flick gesture based on changes of the three touch points A, B, and C on the X and Y axes. To improve efficiency of the processor 380, the speed and/or the direction (leftward or rightward) of the gesture may be calculated based on only the X-axis coordinate values x1, x2, and x3 of the three touch points and the time points t1, t2, and t3.
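The speed and direction calculation described above, using only the X-axis coordinate values x1, x2, and x3 and the time points t1, t2, and t3, can be sketched roughly as follows; the function names and the linear mapping from speed to a page count are illustrative assumptions, not taken from the embodiments:

```python
# Hypothetical sketch of the X-axis speed/direction calculation; names
# and thresholds are illustrative, not taken from the described device.

def flick_speed_and_direction(samples):
    """samples: chronological list of (t_ms, x) touch points.
    Returns (speed in px/ms, direction: 'right', 'left', or None)."""
    if len(samples) < 2:
        return 0.0, None
    (t0, x0), (tn, xn) = samples[0], samples[-1]
    if tn == t0:
        return 0.0, None
    velocity = (xn - x0) / (tn - t0)  # signed px/ms along the X axis
    direction = 'right' if velocity > 0 else 'left' if velocity < 0 else None
    return abs(velocity), direction

def pages_to_flip(speed, base_pages=1, pages_per_unit=10):
    """A faster flick turns more pages: a simple assumed linear mapping."""
    return base_pages + int(speed * pages_per_unit)

# Points A, B, C sampled at t1=0 ms, t2=10 ms, t3=20 ms:
speed, direction = flick_speed_and_direction([(0, 100), (10, 140), (20, 180)])
```

Here the intermediate point B only confirms the gesture; the endpoint difference already gives the average speed, matching the efficiency argument above.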
The processor 380 then fast turns a currently displayed page of an ebook, The Little Prince, forward to page 8 (as shown by an icon 702 in FIG. 7B), and displays the content of page 8 on the second display panel 342. It may be understood that an animation or a like form may be used in the fast page flip process, to inform the user that fast page flip is being performed. In some other embodiments, a progress bar shown by an icon 705 in FIG. 7B may be displayed on the second display panel 342, so that the user can visually and directly perceive a rough quantity of pages that have been flipped and the page flip speed. The progress bar 705 may further display a specific page flip speed. - In some embodiments, when the
processor 380 determines that the touch event is the swipe gesture, the processor 380 invokes an instruction corresponding to the swipe gesture. A result of executing the instruction is returning the ebook displayed on the second display panel 342 from a specific content page to a catalogue page. As shown in FIG. 7C, the user uses two fingers 704 to make a rightward sliding track shown by a dotted arrow 703 on the third touch panel 333. According to the technical solution in the foregoing embodiments, the third touch panel 333 may send collected data (for example, a time point and touch positions of the two fingers) related to the sliding track to the processor 380. The processor 380 determines, based on the related data, that the touch event (namely, the sliding track) is the swipe gesture. The processor 380 then switches currently displayed content of the ebook The Little Prince to a catalogue of the ebook (as shown by an icon 702 in FIG. 7D) according to the instruction corresponding to the gesture, and displays the catalogue of the ebook on the second display panel 342. It may be understood that an animation or a like form may be used in the process of switching to the catalogue, to inform the user that returning to the catalogue is being performed. In the foregoing embodiments, the touch operation of the user on the third touch panel is detected to determine the processing (for example, page flip or returning to the catalogue) to perform on the content of the ebook displayed on the second display panel. This greatly simplifies operation steps of the user, improves transaction processing efficiency of the mobile device, and further enhances user experience. - In some embodiments, when the
processor 380 determines that the touch event is the pinch gesture, the processor 380 invokes an instruction corresponding to the pinch gesture. A result of executing the instruction may be zooming in, to a given multiple, the text of specific content of the ebook displayed on the second display panel 342, to enhance the display effect of the ebook. Conversely, when the processor 380 determines that the touch event is the unpinch gesture, the processor 380 invokes an instruction corresponding to the unpinch gesture. A result of executing the instruction may be zooming out, to a given level, the text of specific content of the ebook displayed on the second display panel 342, to reduce the display effect of the ebook and further save power of the mobile device. - In some other embodiments, the
second touch panel 332 is further disposed on the rear face of the mobile device. The user may directly perform, on the second touch panel 332, a simple touch operation on an icon displayed on the second display panel 342. - In some other embodiments, the mobile device (for example, a mobile phone) may be used together with a wearable device (for example, a smartwatch) paired with the mobile device, to complete the technical solution in the foregoing embodiments. For example, the
third touch panel 333 may not be disposed on the mobile phone 300; instead, a third touch panel is disposed on the smartwatch. In this way, the user may perform a touch gesture operation on the third touch panel of the smartwatch. The touch gesture operation is compiled into an instruction, and the instruction is sent to the mobile device by using a short-range communications protocol (for example, Wi-Fi or Bluetooth). After receiving the instruction, the mobile device executes the instruction, so that the second display panel 342 on the rear face of the mobile device displays the specific content of the ebook. It may be understood that the mobile device may execute different instructions based on different gestures on the smartwatch, as described in the foregoing embodiments. Details are not described herein. -
FIG. 8 is a flowchart of a method in some embodiments. A mobile device receives a touch operation of a user on a first touch panel 331 (step S801), where the touch operation may be an operation of tapping, by the user, an icon 401 displayed on a first display panel 341. The mobile device determines whether the touch operation works on an application program (step S802). When the mobile device determines that the touch operation does not work on an application program, the mobile device does not respond to the touch operation (step S812); when the mobile device determines that the touch operation works on an application program, the mobile device determines a type of the application program (step S803). When the mobile device determines that the type of the application program is a reading type, the mobile device sends a display instruction to a second display panel 342 based on the touch operation (step S804), where the display instruction may specifically instruct the second display panel 342 to display a GUI of the application program corresponding to the icon touched by the user. When the mobile device determines that the type of the application program is not a reading type, the mobile device may send a display instruction to the first display panel 341 based on the touch operation (step S810), to display a GUI of the application program on the first display panel 341 (step S811). The second display panel 342 of the mobile device displays the GUI of the application program according to the instruction in step S804 (step S805). A third touch panel 333 is activated (step S806). The third touch panel 333 of the mobile device receives a touch event of the user (step S807), where the touch event may be any gesture in the foregoing embodiments. The mobile device determines an instruction corresponding to the touch event, and invokes and sends the instruction to the second display panel 342 (step S808).
The second display panel 342 displays, according to the instruction, a GUI corresponding to the instruction (step S809). For example, if the instruction is a page flip instruction, a page flip GUI is displayed on the second display panel 342; if the instruction is an instruction used for returning to a catalogue, a GUI for returning to the catalogue is displayed on the second display panel 342. - As shown in
FIG. 9, an embodiment provides a mobile device 900 for performing data processing. The technical solutions in the foregoing embodiments may be implemented by the mobile device in this embodiment. The mobile device 900 includes a first display panel 901 and a first touch panel 902 that are disposed on a front face of the mobile device, and a second display panel 903 and a second touch panel 904 that are disposed on a rear face of the mobile device. The mobile device further includes a processor 905, a memory 906, and a power supply management system 907. The foregoing hardware may be connected by using a communications bus 908. - The
first display panel 901 is configured to display an icon, a graphical user interface, and a widget of an application program stored in the memory 906. - When the
first touch panel 902 detects a touch operation of a user, the processor 905 determines whether the touch operation works on an application program displayed on the first display panel 901. - When the
processor 905 determines that the touch operation works on the application program, the processor 905 determines a type of the application program. - When the
processor 905 determines that the type of the application program is reading, the processor 905 instructs the second display panel 903 to display a first graphical user interface (first GUI) of the application program, and instructs the power supply management system 907 to turn off a power supply of the first display panel 901 and activate the second display panel 903, where the first GUI is an interface, in the application program, for displaying content of an ebook. - When the
second touch panel 904 detects a touch event of the user, the processor 905 determines, based on a speed and a direction of the touch event, an instruction corresponding to the touch event, and executes the instruction. - A result of executing the instruction is displaying a second graphical user interface (second GUI) of the application program on the
second display panel 903 based on the speed and the direction of the touch event, where the second GUI is used to display a catalogue of the ebook. - The
first display panel 901 may be a liquid crystal display, and the second display panel 903 may be an electronic ink screen.
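As a rough illustration of how the processor 905 might distinguish some of the gestures listed in the foregoing embodiments from the sampled touch data, the following sketch classifies a touch event by its duration and travel distance; the thresholds and function names are assumptions for the example only:

```python
# Hypothetical gesture classification from sampled (t_ms, x, y) touch
# points; the movement and hold thresholds are illustrative assumptions.

def classify_gesture(samples, move_thresh=10, hold_thresh_ms=500):
    """Classify one single-finger touch event from its first and last samples."""
    (t0, x0, y0), (tn, xn, yn) = samples[0], samples[-1]
    duration = tn - t0                                   # contact time in ms
    distance = ((xn - x0) ** 2 + (yn - y0) ** 2) ** 0.5  # travel in px
    if distance < move_thresh:
        # little movement: a long press is touch & hold, a short one a tap
        return "touch_and_hold" if duration >= hold_thresh_ms else "tap"
    # the finger moved: a brief contact is a flick, otherwise a slide
    return "flick" if duration < 300 else "slide"

g1 = classify_gesture([(0, 50, 50), (80, 52, 51)])    # brief, almost no movement
g2 = classify_gesture([(0, 50, 50), (100, 150, 55)])  # fast horizontal movement
```

A pressure channel, as in the embodiments with a pressure sensor, could be added as a third feature alongside duration and distance.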
FIG. 4 ), a prompt box may be displayed for the user to choose to display a graphical user interface of the application program on the first display panel 341 (as shown by anicon 1201 inFIG. 12 ), or display a graphical user interface of the application program on the second display panel 342 (as shown by anicon 1202 inFIG. 12 ), or display a graphical user interface of the application program on both thefirst display panel 341 and the second display panel 342 (as shown by anicon 1203 inFIG. 12 ). After the mobile device receives a selection instruction of the user, the mobile device performs displaying according to a selection of the user. When the selection of the user is starting the application program on the front face, the mobile device displays the GUI of the application program only on thefirst display panel 341. When the selection of the user is starting the application program on the rear face, the mobile device displays the GUI of the application program only on thesecond display panel 342, and the power supply of thefirst display panel 341 may be turned off. When related graphical user interfaces of the application program are displayed on both the first display panel and the second display panel, the graphical user interfaces shown inFIG. 10A andFIG. 10B may be displayed, that is, related controls are displayed on a front face (the first display panel) of a mobile phone, and specific content of an ebook is displayed on a rear face (the second display panel 342) of the mobile phone. In this way, when related content of the ebook is presented to another person, the user may control displayed content of the ebook on the rear face by using some controls (shown inFIG. 10A ) on the front face of the mobile phone or a third touch panel on a side face of the mobile phone, so that the user can share related content to another person. 
- According to the context, the term “if” used in the embodiments may be interpreted as a meaning of “when” or “after” or “in response to determining” or “in response to detecting”. Similarly, according to the context, the phrase “if it is determined that” or “if (a stated condition or event) is detected” may be interpreted as a meaning of “when it is determined that” or “in response to determining” or “when (a stated condition or event) is detected” or “in response to detecting (a stated condition or event)”.
- The terms used in the embodiments are merely for the purpose of illustrating specific embodiments, and are not intended to limit this application. The terms “a”, “said” and “the” of singular forms used in the embodiments and the appended claims are also intended to include plural forms, unless otherwise specified in the context clearly. It should also be understood that the term “and/or” used herein indicates and includes any or all possible combinations of one or more associated listed items.
- All or some of the foregoing embodiments may be implemented by using software, hardware, firmware, or any combination thereof. When software is used to implement the embodiments, the embodiments may be implemented completely or partially in a form of a computer program product. The computer program product includes one or more computer instructions. When the computer program instructions are loaded and executed on a computer, the procedures or functions according to the embodiments are all or partially generated. The computer may be a general-purpose computer, a dedicated computer, a computer network, or another programmable apparatus. The computer instructions may be stored in a computer-readable storage medium or may be transmitted from a computer-readable storage medium to another computer-readable storage medium. For example, the computer instructions may be transmitted from a website, computer, server, or data center to another website, computer, server, or data center in a wired (for example, a coaxial cable, an optical fiber, or a digital subscriber line) or wireless (for example, infrared, radio, or microwave) manner. The computer-readable storage medium may be any usable medium accessible by a computer, or a data storage device, such as a server or a data center, integrating one or more usable media. The usable medium may be a magnetic medium (for example, a floppy disk, a hard disk, or a magnetic tape), an optical medium (for example, a DVD), a semiconductor medium (for example, a solid state disk), or the like.
- For purposes of explanation, the foregoing descriptions are provided with reference to specific embodiments. However, the foregoing example discussion is not intended to be exhaustive, and is not intended to limit this application to the precise forms disclosed. In view of the foregoing teachings, many modifications and variations are possible. The embodiments are selected and described to fully illustrate the principles of the technical solutions and their practical application, so that other persons skilled in the art can make full use of the technical solutions and of the various embodiments, with various modifications, as suited to the specific use contemplated.
Claims (20)
1. An electronic device comprising:
a first part;
a second part;
a first display disposed on the first part;
a second display disposed on the second part;
a touch panel disposed on the second part;
one or more processors configured to cause the electronic device to:
display an icon of a first application on a home screen on the first display;
detect a first gesture on the icon;
display, in response to the first gesture, a first graphical user interface (GUI) of the first application on the second display;
display, in response to the first gesture, an iconic control on the first display, wherein the iconic control is configured to control the first GUI;
detect a second gesture on the touch panel; and
display, in response to the second gesture and while displaying the iconic control on the first display, a second GUI of the first application on the second display.
2. The electronic device of claim 1, wherein the one or more processors are further configured to cause the electronic device to:
detect, in the second gesture, coordinate position values of three touch points at time points;
determine a speed and a direction of the second gesture based on the coordinate position values and the time points;
perform, based on the speed and the direction, a fast page flip operation on content of the first application; and
display the fast page flip operation on the second display.
3. The electronic device of claim 1, wherein the first application is a preset application.
4. The electronic device of claim 1, wherein the second GUI is based on a speed and a direction of the second gesture.
5. The electronic device of claim 1, wherein the second GUI is a catalog of the first application.
6. The electronic device of claim 1, wherein the second gesture comprises touching the touch panel with fingers, keeping the fingers in contact with the touch panel, and moving the fingers away from contact with the touch panel.
7. The electronic device of claim 1, wherein the touch panel is disposed on the second display at a position of the second part.
8. The electronic device of claim 1, wherein the first display is a liquid-crystal display (LCD), and wherein the second display is an electronic ink display.
9. A method implemented by an electronic device and comprising:
displaying an icon of a first application on a home screen on a first display of the electronic device;
detecting a first gesture on the icon;
displaying, in response to the first gesture, a first graphical user interface (GUI) of the first application on a second display of the electronic device;
displaying, in response to the first gesture, an iconic control on the first display, wherein the iconic control is configured to control the first GUI;
detecting a second gesture on a touch panel which is disposed on the second display; and
displaying, in response to the second gesture and while displaying the iconic control on the first display, a second GUI of the first application on the second display.
10. The method of claim 9, further comprising:
detecting, in the second gesture, coordinate position values of three touch points at time points;
determining a speed and a direction of the second gesture based on the coordinate position values and the time points;
performing, based on the speed and the direction, a fast page flip operation on content of the first application; and
displaying the fast page flip operation on the second display.
11. The method of claim 9, wherein the first application is a preset application.
12. The method of claim 9, wherein the second GUI is based on a speed and a direction of the second gesture.
13. The method of claim 9, wherein the second GUI is a catalog of the first application.
14. The method of claim 9, wherein the second gesture comprises touching the touch panel with fingers, keeping the fingers in contact with the touch panel, and moving the fingers away from contact with the touch panel.
15. A computer program product comprising instructions that are stored on a computer-readable medium and that, when executed by a processor, cause an electronic device to:
display an icon of a first application on a home screen on a first display of the electronic device;
detect a first gesture on the icon;
display, in response to the first gesture, a first graphical user interface (GUI) of the first application on a second display of the electronic device;
display, in response to the first gesture, an iconic control on the first display, wherein the iconic control is configured to control the first GUI;
detect a second gesture on a touch panel which is disposed on the second display; and
display, in response to the second gesture and while displaying the iconic control on the first display, a second GUI of the first application on the second display.
16. The computer program product of claim 15, wherein the instructions, when executed by the processor, further cause the electronic device to:
detect, in the second gesture, coordinate position values of three touch points at time points;
determine a speed and a direction of the second gesture based on the coordinate position values and the time points;
perform, based on the speed and the direction, a fast page flip operation on content of the first application; and
display the fast page flip operation on the second display.
17. The computer program product of claim 15, wherein the first application is a preset application.
18. The computer program product of claim 15, wherein the second GUI is based on a speed and a direction of the second gesture.
19. The computer program product of claim 15, wherein the second GUI is a catalog of the first application.
20. The computer program product of claim 15, wherein the second gesture comprises touching the touch panel with fingers, keeping the fingers in contact with the touch panel, and moving the fingers away from contact with the touch panel.
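Claims 2, 10, and 16 recite determining a speed and a direction of the second gesture from coordinate position values of touch points sampled at time points, and performing a fast page flip based on them. One way such a computation could work is sketched below; the function names, units, and the page-count scaling are assumptions for illustration, not taken from the claims:

```python
import math

def gesture_speed_and_direction(points):
    """Estimate gesture speed and direction from sampled touch points.

    points: list of (x, y, t) tuples ordered by time; the claims recite
    three sampled touch points, though two suffice for a displacement.
    Returns (speed, direction_degrees) for the overall gesture.
    """
    (x0, y0, t0), (xn, yn, tn) = points[0], points[-1]
    dx, dy, dt = xn - x0, yn - y0, tn - t0
    if dt <= 0:
        raise ValueError("time points must be strictly increasing")
    speed = math.hypot(dx, dy) / dt                # distance per unit time
    direction = math.degrees(math.atan2(dy, dx))   # angle of travel
    return speed, direction

def pages_to_flip(speed, base_speed=1000.0):
    """Map gesture speed to a page count for a fast page flip
    (hypothetical linear scaling; a faster swipe flips more pages)."""
    return max(1, int(speed / base_speed))
```

The sign of `direction` (or of the horizontal displacement) could then select forward versus backward flipping, with the resulting page shown on the second display.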
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US18/149,838 US20240012605A1 (en) | 2017-01-26 | 2023-01-04 | Data Processing Method and Mobile Device |
Applications Claiming Priority (8)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201710061746 | 2017-01-26 | ||
CN201710061746.7 | 2017-01-26 | ||
WOPCT/CN2017/073173 | 2017-02-09 | ||
CN2017073173 | 2017-02-09 | ||
PCT/CN2017/075633 WO2018137276A1 (en) | 2017-01-26 | 2017-03-03 | Method for processing data and mobile device |
US201916481417A | 2019-07-26 | 2019-07-26 | |
US17/130,317 US11567725B2 (en) | 2017-01-26 | 2020-12-22 | Data processing method and mobile device |
US18/149,838 US20240012605A1 (en) | 2017-01-26 | 2023-01-04 | Data Processing Method and Mobile Device |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US17/130,317 Continuation US11567725B2 (en) | 2017-01-26 | 2020-12-22 | Data processing method and mobile device |
Publications (1)
Publication Number | Publication Date |
---|---|
US20240012605A1 true US20240012605A1 (en) | 2024-01-11 |
Family
ID=62977955
Family Applications (3)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/481,417 Active US10908868B2 (en) | 2017-01-26 | 2017-03-03 | Data processing method and mobile device |
US17/130,317 Active 2037-10-06 US11567725B2 (en) | 2017-01-26 | 2020-12-22 | Data processing method and mobile device |
US18/149,838 Pending US20240012605A1 (en) | 2017-01-26 | 2023-01-04 | Data Processing Method and Mobile Device |
Family Applications Before (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/481,417 Active US10908868B2 (en) | 2017-01-26 | 2017-03-03 | Data processing method and mobile device |
US17/130,317 Active 2037-10-06 US11567725B2 (en) | 2017-01-26 | 2020-12-22 | Data processing method and mobile device |
Country Status (3)
Country | Link |
---|---|
US (3) | US10908868B2 (en) |
CN (1) | CN109074124B (en) |
WO (1) | WO2018137276A1 (en) |
Families Citing this family (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2019222887A1 (en) * | 2018-05-21 | 2019-11-28 | 华为技术有限公司 | Display control method and terminal |
CN109634417B (en) * | 2018-12-13 | 2021-07-16 | 联想(北京)有限公司 | Processing method and electronic equipment |
CN112000408B (en) * | 2020-08-14 | 2022-07-05 | 青岛海信移动通信技术股份有限公司 | Mobile terminal and display method thereof |
CN112527152B (en) * | 2020-12-18 | 2023-01-06 | Oppo(重庆)智能科技有限公司 | Touch area control method and device, touch system and electronic equipment |
Family Cites Families (23)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR101320919B1 (en) | 2008-01-29 | 2013-10-21 | 삼성전자주식회사 | Method for providing GUI by divided screen and multimedia device using the same |
CN201628881U (en) | 2010-01-25 | 2010-11-10 | 张菊 | Double-screen display equipment |
US8749484B2 (en) * | 2010-10-01 | 2014-06-10 | Z124 | Multi-screen user interface with orientation based control |
WO2012081699A1 (en) | 2010-12-17 | 2012-06-21 | Necカシオモバイルコミュニケーションズ株式会社 | Mobile terminal device, display control method and program |
US8806369B2 (en) * | 2011-08-26 | 2014-08-12 | Apple Inc. | Device, method, and graphical user interface for managing and interacting with concurrently open software applications |
CN202383558U (en) * | 2011-09-18 | 2012-08-15 | 吴廷强 | Tablet computer with double-sided structure |
CN203490899U (en) | 2013-01-05 | 2014-03-19 | 刘遥 | Multi-screen mobile terminal |
KR102102157B1 (en) * | 2013-03-29 | 2020-04-21 | 삼성전자주식회사 | Display apparatus for executing plurality of applications and method for controlling thereof |
CN104182166A (en) * | 2013-05-28 | 2014-12-03 | 腾讯科技(北京)有限公司 | Control method and device of intelligent terminal application program |
US9891663B2 (en) * | 2014-02-10 | 2018-02-13 | Samsung Elctronics Co., Ltd. | User terminal device and displaying method thereof |
US9632614B2 (en) * | 2014-04-01 | 2017-04-25 | International Business Machines Corporation | Expanding touch zones of graphical user interface widgets displayed on a screen of a device without programming changes |
CN104536711A (en) | 2014-12-18 | 2015-04-22 | 深圳市金立通信设备有限公司 | Control method of terminal display |
CN104580705A (en) | 2014-12-18 | 2015-04-29 | 深圳市金立通信设备有限公司 | Terminal |
KR102317803B1 (en) * | 2015-01-23 | 2021-10-27 | 삼성전자주식회사 | Electronic device and method for controlling a plurality of displays |
US20160216930A1 (en) * | 2015-01-28 | 2016-07-28 | Kobo Incorporated | Method and system for deploying electronic personal device display screen |
CN106202119A (en) * | 2015-05-07 | 2016-12-07 | 上海玄霆娱乐信息科技有限公司 | SNB e-book storage organization and wiring method thereof and read method |
CN105511828A (en) * | 2015-12-09 | 2016-04-20 | 深圳市金立通信设备有限公司 | Double-screen display method and terminal equipment |
KR102168648B1 (en) * | 2016-01-07 | 2020-10-21 | 삼성전자주식회사 | User terminal apparatus and control method thereof |
CN106027791A (en) | 2016-06-29 | 2016-10-12 | 努比亚技术有限公司 | Mobile terminal and application service switching method |
CN106125845A (en) | 2016-06-30 | 2016-11-16 | 珠海格力电器股份有限公司 | Mobile terminal |
CN106155551A (en) | 2016-06-30 | 2016-11-23 | 努比亚技术有限公司 | Information processing method and terminal |
CN106210328B (en) | 2016-07-19 | 2019-10-29 | 努比亚技术有限公司 | Information display device and method |
KR102521333B1 (en) * | 2016-11-09 | 2023-04-14 | 삼성전자주식회사 | Method for displaying user interface related to user authentication and electronic device for the same |
-
2017
- 2017-03-03 CN CN201780010795.6A patent/CN109074124B/en active Active
- 2017-03-03 WO PCT/CN2017/075633 patent/WO2018137276A1/en active Application Filing
- 2017-03-03 US US16/481,417 patent/US10908868B2/en active Active
-
2020
- 2020-12-22 US US17/130,317 patent/US11567725B2/en active Active
-
2023
- 2023-01-04 US US18/149,838 patent/US20240012605A1/en active Pending
Also Published As
Publication number | Publication date |
---|---|
US10908868B2 (en) | 2021-02-02 |
CN109074124B (en) | 2021-05-14 |
CN109074124A (en) | 2018-12-21 |
US20200019366A1 (en) | 2020-01-16 |
WO2018137276A1 (en) | 2018-08-02 |
US11567725B2 (en) | 2023-01-31 |
US20210109699A1 (en) | 2021-04-15 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
AU2020201096B2 (en) | Quick screen splitting method, apparatus, and electronic device, display UI, and storage medium | |
US11054988B2 (en) | Graphical user interface display method and electronic device | |
US20200226007A1 (en) | Copying and pasting method, data processing apparatus, and user equipment | |
US20230068100A1 (en) | Widget processing method and related apparatus | |
EP4138368B1 (en) | User terminal device and control method thereof | |
US11567725B2 (en) | Data processing method and mobile device | |
US9395833B2 (en) | Method and apparatus for controlling lock or unlock in portable terminal | |
US8810535B2 (en) | Electronic device and method of controlling same | |
US20120096393A1 (en) | Method and apparatus for controlling touch screen in mobile terminal responsive to multi-touch inputs | |
WO2020134744A1 (en) | Icon moving method and mobile terminal | |
WO2020181955A1 (en) | Interface control method and terminal device | |
CN110007835B (en) | Object management method and mobile terminal | |
KR20200009164A (en) | Electronic device | |
CA3092598C (en) | Display method and mobile terminal | |
CN109407949B (en) | Display control method and terminal | |
WO2020001358A1 (en) | Icon sorting method and terminal device | |
US20230075464A1 (en) | Touch Operation Method and Device | |
US20130227463A1 (en) | Electronic device including touch-sensitive display and method of controlling same | |
EP3674867B1 (en) | Human-computer interaction method and electronic device | |
CN110874141A (en) | Icon moving method and terminal equipment | |
EP2584441A1 (en) | Electronic device and method of controlling same | |
CN111399626B (en) | Screen control method and device, storage medium and mobile terminal | |
EP2631755A1 (en) | Electronic device including touch-sensitive display and method of controlling same |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: HUAWEI TECHNOLOGIES CO., LTD., CHINA Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:LAN, JUAN;WEN, SHUANGXIONG;PENG, QIANG;SIGNING DATES FROM 20190826 TO 20190924;REEL/FRAME:062271/0520 |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: NON FINAL ACTION MAILED |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: FINAL REJECTION MAILED |