WO2017088131A1 - A method, apparatus, electronic device, display interface, and storage medium for quick split screen - Google Patents

A method, apparatus, electronic device, display interface, and storage medium for quick split screen

Info

Publication number
WO2017088131A1
Authority
WO
WIPO (PCT)
Prior art keywords: display, display area, interface, touch, application
Application number
PCT/CN2015/095564
Other languages
English (en)
French (fr)
Inventor
Wang Jin (王晋)
Xu Jie (徐杰)
Original Assignee
Huawei Technologies Co., Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huawei Technologies Co., Ltd.
Priority to PCT/CN2015/095564 priority Critical patent/WO2017088131A1/zh
Priority to AU2015415755A priority patent/AU2015415755A1/en
Priority to US15/779,039 priority patent/US10642483B2/en
Priority to EP15909043.0A priority patent/EP3370139A4/en
Priority to KR1020187016604A priority patent/KR102141099B1/ko
Priority to CN201580059602.7A priority patent/CN107077295A/zh
Priority to RU2018122637A priority patent/RU2687037C1/ru
Priority to JP2018526897A priority patent/JP6675769B2/ja
Publication of WO2017088131A1 publication Critical patent/WO2017088131A1/zh
Priority to PH12018501105A priority patent/PH12018501105A1/en
Priority to AU2020201096A priority patent/AU2020201096B2/en

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 - Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/04886 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 - Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/04883 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03 - Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/041 - Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F 3/0416 - Control or interface arrangements specially adapted for digitisers
    • G06F 3/04166 - Details of scanning methods, e.g. sampling time, grouping of sub areas or time sharing with display driving
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03 - Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/041 - Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F 3/044 - Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by capacitive means
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 - Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0484 - Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/04845 - Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range for image manipulation, e.g. dragging, rotation, expansion or change of colour
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 2203/00 - Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F 2203/048 - Indexing scheme relating to G06F3/048
    • G06F 2203/04803 - Split screen, i.e. subdividing the display area or the window area into separate subareas
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 2203/00 - Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F 2203/048 - Indexing scheme relating to G06F3/048
    • G06F 2203/04806 - Zoom, i.e. interaction techniques or interactors for controlling the zooming operation

Definitions

  • Embodiments of the present invention relate to a method, an apparatus, an electronic device, a display interface, and a storage medium for quickly splitting a screen; and more particularly, to a method of dividing a display interface of a display having a touch-sensitive surface into at least two display areas using a touch gesture.
  • Terminal equipment has gradually become an indispensable part of people's daily lives, and the screens of terminal devices keep growing in size.
  • As terminal devices become able to process multiple tasks concurrently, the need to work on several tasks at once is becoming increasingly urgent.
  • When a terminal device supports simultaneous processing of multiple tasks, it is also desirable, in pursuit of a better user experience, to display the display areas of multiple applications on the same display interface at the same time.
  • In the prior art, a split-screen button or a virtual switch together with a touch gesture is required to perform the split-screen operation, and the window size and the number of split-screen areas after the split are fixed.
  • To address this, the embodiments of the present invention provide a technical solution for fast split screen.
  • In a first aspect, a method of fast split screen is provided for use on a portable electronic device, the electronic device comprising a display having a touch-sensitive surface. The method comprises: when a joint touch gesture acting on the touch-sensitive surface is detected, dividing the display interface of the display into at least two display areas in response to the joint touch gesture.
  • The joint touch gesture consists of joint touch actions. If the grid capacitance values of the touch-sensitive surface generated by a touch action on the touch-sensitive surface are within a first preset capacitance value range, the number of grids with non-zero capacitance values is less than a preset value, and the Z-axis acceleration signal is within a first preset acceleration range, then the touch action is a joint touch action, and a gesture consisting of joint touch actions is a joint touch gesture.
  • The fast split-screen function is thus realized by the joint touch gesture, expanding the set of gestures. Because the judgment conditions for a joint touch gesture differ from those for a finger touch gesture, the split-screen interface is not easily entered by mistake, which makes the split-screen operation more convenient and improves the user experience.
  • After a joint touch gesture acting on the touch-sensitive surface is detected, it may be determined whether the movement distance of the joint touch gesture is greater than a preset distance threshold.
  • It may also be determined whether the trajectory of the joint touch gesture matches a preset trajectory.
  • It may also be determined whether the angle between the trajectory of the joint touch gesture and the horizontal or vertical axis is smaller than a preset angle.
  • In one implementation, dividing the display interface of the display into at least two display areas includes dividing the display interface according to a preset boundary line.
  • In another implementation, the display interface of the display is divided into at least two display areas according to the trajectory of the touch gesture.
  • In a further implementation, the display interface of the display is divided into at least two display areas according to the direction of the touch gesture and the coordinates of the starting position of the touch gesture.
  • In that case, the position of the boundary line is related to the coordinates of the starting position of the touch gesture, so the user can deliberately choose the starting position according to the running interface of the application displayed before the split. By adjusting the starting position and the direction of the touch gesture, the user determines the position of the dividing line and thereby the position and/or size of the at least two display areas.
  • When the display interface of the display shows the running interface of a first application, dividing the display interface into at least two display areas may include: reducing the size of the running interface of the first application, displaying the reduced running interface of the first application in a first display area, and generating at least one display area in the display area outside the first display area, where the generated display area displays the identifiers of one or more applications related to the first application, or the running interface of an application related to the first application.
  • When an operation instruction moving an identifier displayed in a display area other than the first display area to the first display area is detected, the first display area may display the running interface of the application corresponding to the identifier, or display the identifier itself, or the identifier may be embedded in the reduced running interface of the first application; and when an operation instruction moving content displayed in another display area to the first display area is detected, the moved content is embedded in the reduced running interface of the first application.
  • The gesture that triggers the split-screen operation may also be a pressure touch gesture. Pressure touch brings a 3D tactile experience and is increasingly popular; at the same time, compared with an ordinary finger touch, it is less likely to be triggered by mistake, which simplifies the split-screen operation steps and improves the split-screen operation experience.
  • It may also be determined whether the starting position of the touch gesture is adjacent to a predetermined area of a first edge of the touch-sensitive display unit, and/or whether the end position of the touch gesture is outside the touch-sensitive display unit or adjacent to a predetermined area of a second edge of the touch-sensitive display unit.
  • The boundary line between the at least two display areas can be adjusted to change the size and/or position of the display areas. This makes it convenient for the user to adjust the size and/or position of each display area after the split according to the content displayed in it, improving the user experience; a minimal sketch of this adjustment is given below.
  • The at least two display areas can also be merged back into one display area. Through a gesture operation, the split display areas are merged conveniently and quickly, so exiting the split screen is simple and fast and the user experience is improved.
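  • As a non-authoritative illustration of this adjustment and merge logic, the following Kotlin sketch models two display areas separated by a horizontal boundary line; the types SplitState, moveBoundary, and mergeAreas, and the minimum-height clamp, are assumptions of this sketch rather than anything defined in the patent.

```kotlin
// Minimal sketch of boundary-line adjustment and merging of split-screen areas.
// All types and thresholds here are illustrative assumptions.
data class Rect(val left: Int, val top: Int, val right: Int, val bottom: Int)

data class SplitState(
    val screen: Rect,    // full display interface
    val boundaryY: Int?  // y coordinate of a horizontal boundary line; null = not split
) {
    // Derive the display areas from the boundary position.
    fun areas(): List<Rect> = when (boundaryY) {
        null -> listOf(screen)
        else -> listOf(
            Rect(screen.left, screen.top, screen.right, boundaryY),    // first display area
            Rect(screen.left, boundaryY, screen.right, screen.bottom)  // second display area
        )
    }
}

// Dragging the boundary line adjusts the size/position of both display areas.
fun moveBoundary(state: SplitState, newY: Int, minHeight: Int = 100): SplitState {
    if (state.boundaryY == null) return state
    val clamped = newY.coerceIn(state.screen.top + minHeight, state.screen.bottom - minHeight)
    return state.copy(boundaryY = clamped)
}

// A merge gesture collapses the split back into a single display area.
fun mergeAreas(state: SplitState): SplitState = state.copy(boundaryY = null)

fun main() {
    var s = SplitState(Rect(0, 0, 1080, 1920), boundaryY = 960)
    s = moveBoundary(s, 600)  // drag boundary upward: first area shrinks
    println(s.areas())
    s = mergeAreas(s)         // exit split screen
    println(s.areas())
}
```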
  • In a second aspect, a portable electronic device for fast split screen is provided, having the functions needed to implement the above method.
  • The functions may be implemented by hardware, or by hardware executing corresponding software.
  • The hardware or software includes one or more modules corresponding to the functions described above.
  • In one possible design, the portable electronic device includes a display, a memory, an acceleration sensor, and a processor. The display has a touch-sensitive surface; the touch-sensitive surface is used to receive touch gestures, and the display is further used to present the display interface. The memory is used to store instructions; the acceleration sensor is used to acquire the acceleration signal in the Z-axis direction and transmit it to the processor. The processor invokes the instructions stored in the memory to implement the solutions in the method design of the first aspect above; repeated parts are not described again.
  • In a third aspect, a device for quick split screen is provided, comprising: a detecting unit configured to detect a touch gesture acting on the touch-sensitive surface; a determining unit configured to determine whether the touch gesture matches a preset gesture; and a split-screen unit configured to divide the display interface of the display into at least two display areas when the touch gesture matches the preset gesture.
  • In a fourth aspect, a display interface of a portable electronic device is provided, the portable electronic device comprising a display having a touch-sensitive surface, a memory, and a processor for executing instructions stored in the memory. When a touch gesture acting on the touch-sensitive surface is detected, it is determined whether the touch gesture matches a preset gesture; when the touch gesture is determined to match the preset gesture, the display interface of the display is divided into at least two display areas in response to the touch gesture. The method of the first aspect is implemented when the processor executes the instructions stored in the memory.
  • In a fifth aspect, a non-transitory computer-readable storage medium storing one or more programs is provided. The one or more programs include instructions that, when executed by a portable electronic device including a display having a touch-sensitive surface, cause the portable electronic device to execute the solution in the method design of the first aspect above; repeated parts are not described again.
  • Embodiments of the present invention disclose a technical solution for dividing a display interface of a display into at least two display areas by a joint touch gesture.
  • When a user wishes to perform a split-screen operation while using an electronic device having a touch-sensitive display unit, the user only needs to perform a joint touch gesture on the touch-sensitive display unit to trigger the split-screen function.
  • The electronic device can divide the display interface of the electronic device into at least two display areas according to the detected joint touch gesture.
  • The present invention therefore lets the user realize the split-screen function more conveniently through the joint touch gesture, and can present different display interfaces in the resulting split-screen areas. The technical solution of the split-screen operation provided by the present invention optimizes the prior-art split-screen flow, simplifies the split-screen operation steps, and improves the user experience.
  • FIG. 1 is a schematic diagram of an internal structure of a portable electronic device 100 according to an embodiment of the present invention
  • FIG. 2 is a schematic diagram of an external structure of a portable electronic device 100 according to an embodiment of the present invention.
  • FIG. 3 is a schematic diagram of a display interface of a split screen according to an embodiment of the present invention.
  • FIG. 4 is an exemplary display interface after splitting the screen according to a touch gesture according to an embodiment of the present invention.
  • FIG. 5 is another exemplary display interface after splitting the screen according to a touch gesture according to an embodiment of the present invention.
  • FIG. 6 is still another exemplary display interface after splitting the screen according to a touch gesture according to an embodiment of the present invention.
  • FIG. 7 is still another exemplary display interface after splitting the screen according to a touch gesture according to an embodiment of the present invention.
  • FIG. 8 is an exemplary display interface for adjusting a position of a boundary line according to a touch gesture according to an embodiment of the present invention
  • FIG. 9 is an exemplary display interface for replacing a split screen display interface according to a touch gesture according to an embodiment of the present invention.
  • FIG. 10 is an exemplary display interface for dividing a split screen interface into more display areas according to a touch gesture.
  • Embodiments of the present invention are exemplified by a portable electronic device 100 that includes a touch-sensitive display unit.
  • Such a portable electronic device may also be referred to as User Equipment (UE), a Mobile Station (MS), Terminal Equipment, and so on.
  • The electronic device 100 can support a variety of applications, for example multimedia applications such as video and audio, text applications (email, blog, web browsing, and the like), and instant messaging applications.
  • The touch-sensitive display unit of the electronic device 100 can intuitively present the display interface of an application, and the user can operate various applications through the touch-sensitive display unit of the electronic device 100.
  • FIG. 1 is a schematic diagram of an internal structure of a portable electronic device 100 according to an embodiment of the present invention.
  • The electronic device 100 may include a touch-sensitive display unit 130, an acceleration sensor 151, a pressure sensor 196, a proximity light sensor 152, an ambient light sensor 153, a memory 120, a processor 190, a radio frequency unit 110, an audio circuit 160, a speaker 161, a microphone 162, a WiFi (wireless fidelity) module 170, a Bluetooth module 180, a power supply 193, an external interface 197, and other components.
  • Those skilled in the art can understand that FIG. 1 is merely an example of a portable electronic device and does not constitute a limitation; the device may include more or fewer components than illustrated, combine some components, or use different components.
  • the touch-sensitive display unit 130 is sometimes referred to as a "touch screen" for convenience, and may also be referred to as or referred to as a touch-sensitive display system, and may also be referred to as a display having a touch-sensitive surface.
  • The display having a touch-sensitive surface includes a touch-sensitive surface and a display screen; it can display the screen interface and can also receive touch actions.
  • the touch sensitive display unit 130 provides an input interface and an output interface between the device and the user.
  • The touch-sensitive display unit 130 can collect touch operations by the user on or near it, for example operations performed by the user with a finger 202, a joint 400, a stylus, or the like on or near the touch-sensitive display unit 130.
  • the touch sensitive display unit 130 may detect touch information and transmit the touch information to the processor 190.
  • the touch information may include a touch action, a grid capacitance value of the touch-sensitive surface, and contact coordinates.
  • the touch sensitive display unit can receive commands from the processor 190 and execute them.
  • the touch sensitive display unit 130 displays a visual output.
  • Visual output can include graphics, text, logos, video, and any combination thereof (collectively referred to as "graphics"). In some embodiments, some visual output or all of the visual output may correspond to a display interface object.
  • Touch sensitive display unit 130 may use LCD (Liquid Crystal Display) technology, LPD (Light Emitting Polymer Display) technology, or LED (Light Emitting Diode) technology, although other display technologies may be used in other embodiments.
  • Touch-sensitive display unit 130 may detect contact and any movement or interruption thereof using any of a variety of touch-sensing technologies now known or later developed, as well as other proximity sensor arrays or other elements for determining one or more points of contact with the touch-sensitive display unit 130.
  • the various touch sensing technologies include, but are not limited to, capacitive, resistive, infrared, and surface acoustic wave technologies. In an exemplary embodiment, a projected mutual capacitance sensing technique is used.
  • the user can contact the touch sensitive display unit 130 using any suitable object or add-on such as a stylus, finger, joint, or the like.
  • In some embodiments, the display interface is designed to work with joint contacts and gestures; in some embodiments, with finger touches and gestures; and in some embodiments, with pressure touches and gestures.
  • the device translates the rough input of the touch into an accurate pointer/cursor position or command to perform the action desired by the user.
  • In addition to the touch-sensitive display unit, device 100 can include a touchpad (not shown) for activating or deactivating specific functions.
  • The touchpad is a touch-sensitive area of the device; unlike the touch-sensitive display unit, it does not display visual output.
  • the touchpad can be a touch-sensitive surface that is separate from the touch-sensitive display unit 130, or an extension of the touch-sensitive surface formed by the touch-sensitive display unit.
  • The acceleration sensor 151 can acquire the magnitude of acceleration in each direction (typically three axes). It can also detect the magnitude and direction of gravity when the terminal is stationary, and can be used for applications that recognize the posture of the phone (such as landscape/portrait switching, related games, and magnetometer attitude calibration) and for vibration-recognition related functions (such as a pedometer or tapping). In the embodiment of the present invention, the acceleration sensor 151 is configured to acquire the gravity acceleration signal in the Z-axis direction generated by the user's touch action on the touch-sensitive display unit.
  • The pressure sensor 196 can detect whether pressure is applied to the electronic device 100, determine the magnitude of that pressure, and pass the detected pressure value to the processor 190.
  • The pressure sensor 196 can be installed in any portion of the electronic device 100 where pressure needs to be detected. If the pressure sensor 196 is installed in the display module, touch input and pressure touch actions can be distinguished based on the signal output by the pressure sensor 196.
  • The signal output by the pressure sensor 196 can also indicate the magnitude of the pressure applied to the display. Illustratively, if the magnitude of the pressure applied to the display by an ordinary touch input is 1, a pressure greater than 2 acting on the display is identified as a pressure touch action.
  • The electronic device 100 may also include one or more proximity light sensors 152 for turning off the display and disabling the touch function of the touch-sensitive surface when the electronic device 100 is close to the user (for example, near the ear during a call), to avoid misoperation of the touch-sensitive display unit.
  • The electronic device 100 may further include one or more ambient light sensors 153 for keeping the touch-sensitive display unit off when the electronic device 100 is in the user's pocket or another dark area, to prevent unnecessary battery consumption or misoperation while the device is locked.
  • the proximity light sensor and the ambient light sensor can be integrated into one component or as two separate components.
  • Although FIG. 1 shows the proximity light sensor and the ambient light sensor, it can be understood that they are not essential components of the electronic device 100 and may be omitted as needed without changing the essence of the invention.
  • the memory 120 can be used to store instructions and data.
  • the memory 120 may mainly include a storage instruction area and a storage data area.
  • the storage data area can store preset trajectory information, preset features such as position, shape, and color of the boundary line, and correlations between applications.
  • the storage instruction area can store an operating system, instructions required for at least one function, and the like.
  • The instructions may cause the processor 190 to perform the following method: when a touch gesture acting on the touch-sensitive surface is detected and the touch gesture is determined to be a joint touch gesture, dividing the display interface of the display into at least two display areas in response to the joint touch gesture.
  • The processor 190 is the control center of the electronic device 100. It connects the various parts of the entire device using various interfaces and lines, and performs the various functions of the electronic device 100 and processes data by running or executing instructions stored in the memory 120 and calling data stored in the memory 120, thereby monitoring the device as a whole.
  • processor 190 can include one or more processing units.
  • the processor 190 can integrate an application processor and a modem processor.
  • the application processor mainly processes an operating system, a display interface, an application, and the like, and the modem processor mainly processes wireless communication. It will be appreciated that the above described modem processor may also not be integrated into the processor 190.
  • In some embodiments, the processor and the memory can be implemented on a single chip.
  • The processor 190 is further configured to invoke the instructions in the memory so that, when a touch gesture acting on the touch-sensitive surface is detected and the touch gesture is determined to be a joint touch gesture, the display interface of the display is divided into at least two display areas in response to the joint touch gesture.
  • The radio frequency unit 110 can be used for receiving and transmitting information, or for receiving and transmitting signals during a call; in particular, it receives downlink information from the base station and passes it to the processor 190 for processing, and transmits uplink data to the base station.
  • RF circuits include, but are not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a Low Noise Amplifier (LNA), a duplexer, and the like.
  • the radio unit 110 can also communicate with network devices and other devices through wireless communication.
  • The wireless communication may use any communication standard or protocol, including but not limited to Global System for Mobile Communications (GSM), General Packet Radio Service (GPRS), Code Division Multiple Access (CDMA), Wideband Code Division Multiple Access (WCDMA), Long Term Evolution (LTE), e-mail, Short Messaging Service (SMS), and so on.
  • the audio circuit 160, the speaker 161, and the microphone 162 can provide an audio interface between the user and the electronic device 100.
  • The audio circuit 160 can convert audio data into an electrical signal and transmit it to the speaker 161, which converts it into a sound signal for output.
  • Conversely, the microphone 162 converts collected sound signals into electrical signals, which the audio circuit 160 receives and converts into audio data.
  • The audio data is then processed by the processor 190 and, for example, transmitted to another terminal via the radio frequency unit 110, or output to the memory 120 for further processing.
  • the audio circuit can also include a headphone jack 163 for providing a connection interface between the audio circuit and the earphone.
  • WiFi is a short-range wireless transmission technology
  • the electronic device 100 can help users to send and receive emails, browse web pages, and access streaming media through the WiFi module 170, which provides wireless broadband Internet access for users.
  • FIG. 1 shows the WiFi module 170, it can be understood that it does not belong to the essential configuration of the electronic device 100, and may be omitted as needed within the scope of not changing the essence of the invention.
  • Bluetooth is a short-range wireless communication technology.
  • The use of Bluetooth technology can effectively simplify communication between mobile terminal devices such as handheld computers, notebook computers, and mobile phones, and can also simplify communication between these devices and the Internet.
  • Through the Bluetooth module 180, data transmission between the electronic device 100 and the Internet becomes faster and more efficient, broadening the possibilities for wireless communication.
  • Bluetooth technology is an open solution for wireless transmission of voice and data.
  • Although FIG. 1 shows the Bluetooth module 180, it can be understood that it is not an essential component of the electronic device 100 and may be omitted as needed without changing the essence of the invention.
  • The electronic device 100 also includes a power source 193 (such as a battery) that powers the various components.
  • Preferably, the power source can be logically coupled to the processor 190 via a power management system 194, so that charging, discharging, and power consumption management functions are realized through the power management system 194.
  • The electronic device 100 also includes an external interface 197, which may be a standard Micro USB interface or a multi-pin connector. It can be used to connect the electronic device 100 to other devices for communication, and can also be used to connect a charger to charge the electronic device 100.
  • Although not shown, the electronic device 100 may further include a camera, a flash, and the like, which are not described here.
  • the following describes the method for implementing fast split screen by taking the electronic device 100 as an example.
  • FIG. 2 is a schematic diagram of the appearance of an electronic device 100 according to an embodiment of the present invention, including a touch-sensitive display unit 130, a switch button 133, and a volume control button 132.
  • the position of each physical button in Fig. 2 is only an example, and the position of the physical button in the actual product can be arbitrarily changed.
  • the electronic device 100 may further include an acceleration sensor 151, a microphone 162, a speaker 161, an external interface 197, a headphone jack 163, and a pressure sensor 196.
  • FIG. 2 illustrates a touch-sensitive display unit display interface 200 of the electronic device 100. As shown in FIG. 2, the display interface 200 can display information such as virtual keys, battery power, and time.
  • FIG. 2 is merely an exemplary display interface; other information may be displayed, which is not specifically limited by the present invention.
  • FIG. 2 is an example of an electronic device equipped with the Android operating system, and the present invention can also be applied to electronic devices equipped with other operating systems such as iOS and Windows.
  • the touch-sensitive display unit 130 can receive a touch input of a user.
  • By using the touch-sensitive display unit 130 as the main input or control device for operating the electronic device 100, the number of physical input or control devices on the electronic device 100 can be reduced.
  • The display interface 200 can be presented by the touch-sensitive display unit.
  • In some embodiments, the "menu button" may be presented by the touch-sensitive display unit; in other embodiments, the "menu button" can be a physical button or another physical input or control device.
  • the pressure sensor 196 can detect whether pressure is applied to the electronic device 100 and can determine the magnitude of the pressure applied to the electronic device 100.
  • the pressure sensor 196 can be integrated in the display in a stacked form, or it can be a stand-alone device.
  • the pressure applied to the display can be identified based on the pressure sensor 196 as a finger touch input or a pressure touch action.
  • the acceleration sensor 151 is configured to acquire a gravity acceleration signal of a Z-axis of a touch action of a user on the touch-sensitive display unit.
  • the power of the electronic device 100 can be turned on or off by pressing and holding the switch button in a depressed state for a predetermined time interval.
  • the locking of the electronic device 100 can be achieved by depressing the switch button and releasing it before a predetermined time interval.
  • voice input for activating some functions may also be received via microphone 162.
  • FIG. 3 is a schematic diagram of a split screen display interface according to an embodiment of the present invention.
  • The split-screen display interface 200 is composed of a boundary line 300, a first display area 301, and a second display area 302.
  • The first display area 301 and the second display area 302 are respectively located on the two sides of the boundary line 300.
  • the first display area 301 and the second display area 302 may be arranged in a top-bottom arrangement or a left-right arrangement. The specific arrangement manner is not specifically limited in the present invention.
  • the method of splitting the screen provided by the embodiment of the present invention can be performed on a portable electronic device (for example, the electronic device 100 in FIG. 1 or FIG. 2).
  • the electronic device 100 includes a touch sensitive display unit.
  • the touch sensitive display unit is also referred to as a display having a touch sensitive surface.
  • some of the methods may be combined, and/or the order of some operations may vary.
  • Embodiments of the present invention provide a method for implementing fast split screen.
  • the split screen may be to divide the display interface of the display into at least two display areas.
  • The method helps the user split the display interface simply and quickly with fewer steps, simplifying the split-screen operation and improving the user experience.
  • Step 1: When a touch gesture acting on the touch-sensitive surface is detected, determine whether the touch gesture matches a preset gesture.
  • Step 2: When the touch gesture is determined to match the preset gesture, divide the display interface of the display into at least two display areas in response to the touch gesture. (A sketch of this two-step flow follows below.)
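  • Read together, the two steps form a simple detect-match-split pipeline. The Kotlin sketch below is only one possible interpretation; the PresetGesture type, the Display.splitInto call, and the half-width track threshold are assumptions of this sketch, standing in for the joint, pressure, and multi-touch checks described later.

```kotlin
// Hypothetical overall flow: detect a touch gesture, match it against a preset
// gesture, and split the display interface when it matches.
enum class PresetGesture { JOINT_SWIPE, PRESSURE_SWIPE, MULTI_FINGER_SWIPE }

data class TouchGesture(val kind: PresetGesture?, val trackLength: Float)

interface Display {
    fun splitInto(areas: Int)  // placeholder for dividing the display interface
}

class SplitScreenController(private val display: Display, private val preset: PresetGesture) {
    // Step 1: decide whether the detected gesture matches the preset gesture.
    private fun matchesPreset(g: TouchGesture): Boolean =
        g.kind == preset && g.trackLength > MIN_TRACK_LENGTH

    // Step 2: respond to a matching gesture by dividing the display interface.
    fun onTouchGesture(g: TouchGesture) {
        if (matchesPreset(g)) display.splitInto(2)
    }

    companion object { const val MIN_TRACK_LENGTH = 540f } // e.g. half the display width
}
```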
  • the touch gesture can be composed of touch actions.
  • a tap gesture consists of pressing and lifting two touch actions
  • a swipe gesture consists of pressing, moving, and lifting three touch actions.
  • the touch sensitive display unit receives a touch action on the touch sensitive surface
  • the touch information is communicated to the processor.
  • the touch information may include contact coordinates, grid capacitance information of the touch-sensitive surface, one or more signals of the touch action.
  • the touch action may include actions such as pressing, moving, and lifting.
  • the terminal device periodically detects whether a touch action acts on the touch-sensitive surface.
  • the preset gesture may be a joint touch gesture, a multi-touch gesture, or a pressure touch gesture.
  • the preset gesture may be saved in the memory 120 in advance.
  • The touch gesture composed of the detected touch actions is compared with the preset gesture to determine whether the touch gesture matches the preset gesture (Step 1).
  • One implementation of determining whether the touch gesture matches a preset gesture is to determine whether the touch gesture is a joint touch gesture; whether a touch action is a joint touch action may be determined based on the touch information and the Z-axis acceleration signal generated by the touch action.
  • a gesture consisting of a joint touch action is a joint touch gesture.
  • a joint click gesture consists of pressing and lifting two joint touch actions; the joint swipe gesture consists of pressing, moving, and lifting three joint touch actions.
  • the movement trajectory between pressing to lifting is the trajectory of the joint touch gesture.
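  • To make the composition concrete, a gesture can be modeled as a press, optional moves, and a lift, with its trajectory being the path from press to lift. The following Kotlin sketch is an assumed model for illustration only, not code taken from the patent.

```kotlin
// Assumed model of how touch actions compose a gesture and yield its trajectory.
enum class ActionType { PRESS, MOVE, LIFT }

data class TouchAction(val type: ActionType, val x: Float, val y: Float)

data class Gesture(val actions: List<TouchAction>) {
    // A tap is press + lift; a swipe is press + move(s) + lift.
    val isTap get() = actions.size == 2 &&
        actions.first().type == ActionType.PRESS && actions.last().type == ActionType.LIFT
    val isSwipe get() = actions.size > 2 && actions.any { it.type == ActionType.MOVE }

    // The trajectory is the sequence of contact points from press to lift.
    fun trajectory(): List<Pair<Float, Float>> = actions.map { it.x to it.y }
}
```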
  • If the grid capacitance values of the touch-sensitive surface generated by a touch action on the touch-sensitive surface are within a first preset capacitance value range, the number of grids with non-zero capacitance values is less than a preset value, and, within a preset time, the Z-axis acceleration signal is within a first preset acceleration range, it may be determined that the touch action is a joint touch action.
  • If the grid capacitance values of the touch-sensitive surface generated by a touch action on the touch-sensitive surface are within a second preset capacitance value range, the number of grids with non-zero capacitance values is greater than or equal to the preset value, and, within a preset time, the Z-axis acceleration signal is within a second preset acceleration range, it may be determined that the touch action is a finger touch action.
  • Illustratively, if the grid capacitance values of the touch-sensitive surface generated by the touch action indicate a maximum capacitance within the first preset capacitance value range (e.g., less than or equal to 0.42 pF), the number of grids with non-zero capacitance values is less than 7, and the Z-axis acceleration condition is met, it can be judged that the touch action is a joint touch action.
  • If the grid capacitance values indicate a maximum capacitance within the second preset capacitance value range (e.g., greater than 0.42 pF and less than or equal to 0.46 pF), the number of grids with non-zero capacitance values is greater than or equal to 7, and, within a preset time, the Z-axis acceleration signal is within the second preset acceleration range (for example, within 5 ms the acceleration signal is less than 2 g, where g is the gravitational acceleration), it can be judged that the touch action is a finger touch action.
  • The finger touch action in the embodiments of the present invention is not necessarily triggered by a finger; a touch on the touch-sensitive display unit 130 by another object may also be regarded as a finger touch action of the embodiments of the present invention, as long as the judgment conditions for a finger touch action are satisfied.
  • the joint touch action in the embodiment of the present invention is not necessarily triggered by the finger joint, and other objects may tap or touch the touch-sensitive display unit 130 at a fast speed, as long as the judgment condition of the joint touch action is satisfied.
  • the gesture consisting of the joint touch action is a joint touch gesture.
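  • Putting these judgment conditions together, a classifier might look like the following Kotlin sketch. The concrete thresholds (0.42 pF, 0.46 pF, 7 grids, the 5 ms window, 2 g) come from the illustrative values above; the data structures and the lower acceleration bound used for the joint case are assumptions of this sketch, not values given in the patent.

```kotlin
// Hedged sketch: classify a touch action as a joint touch, finger touch, or neither,
// using the grid capacitance of the touch-sensitive surface and the Z-axis acceleration.
enum class TouchClass { JOINT, FINGER, UNKNOWN }

data class TouchSample(
    val gridCapacitancesPf: List<Float>, // capacitance of each grid cell, in pF
    val zAccelPeakG: Float               // peak Z-axis acceleration within the preset time (e.g. 5 ms), in g
)

object TouchClassifier {
    // Illustrative thresholds from the description; a real device would calibrate these.
    private const val JOINT_MAX_CAP_PF = 0.42f
    private const val FINGER_MAX_CAP_PF = 0.46f
    private const val GRID_COUNT_LIMIT = 7
    private const val FINGER_MAX_ACCEL_G = 2f
    private const val JOINT_MIN_ACCEL_G = 2f  // assumption: a joint tap produces a sharper Z-axis peak

    fun classify(s: TouchSample): TouchClass {
        val nonZero = s.gridCapacitancesPf.count { it > 0f }
        val maxCap = s.gridCapacitancesPf.maxOrNull() ?: 0f
        return when {
            // Joint touch: small, stiff contact -> few grids, lower capacitance, sharp Z acceleration.
            maxCap <= JOINT_MAX_CAP_PF && nonZero < GRID_COUNT_LIMIT && s.zAccelPeakG >= JOINT_MIN_ACCEL_G ->
                TouchClass.JOINT
            // Finger touch: larger, softer contact -> more grids, higher capacitance, gentle Z acceleration.
            maxCap > JOINT_MAX_CAP_PF && maxCap <= FINGER_MAX_CAP_PF &&
                nonZero >= GRID_COUNT_LIMIT && s.zAccelPeakG < FINGER_MAX_ACCEL_G ->
                TouchClass.FINGER
            else -> TouchClass.UNKNOWN
        }
    }
}
```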
  • Another implementation manner of determining whether the touch gesture matches a preset gesture determines whether the touch gesture is a pressure touch gesture.
  • Whether a touch action is a pressure touch action may be determined based on the touch information and the pressure applied to the touch-sensitive display unit; the touch gesture is composed of such touch actions.
  • For example, the pressure touch gesture may be composed of a pressure press-and-hold action, a touch movement action, and a touch lift action; or of a pressure press-and-hold action, a pressure touch movement action, and a touch lift action.
  • the movement trajectory between pressing to lifting is the trajectory of the pressure touch gesture.
  • When the pressure value applied to the display by the touch action is within a first preset pressure value range, it may be determined that the touch action is a pressure touch action.
  • the pressure value applied to the display by the touch action satisfies the second preset pressure value range, it may be determined that the touch action is a finger touch action.
  • The pressure value may represent the actual value of the pressure, or it may represent a relative magnitude of the pressure.
  • Illustratively, a touch action applied to the display with a relative pressure magnitude greater than or equal to 2 is recognized as a pressure touch action.
  • For example, if the pressure value applied to the display by a finger touch action is about 70 g, a touch action applied to the display with a pressure value greater than 140 g is a pressure touch action. It can be understood that this relationship between the pressure value and pressure touch is only one implementation of the embodiment of the present invention, and can be adjusted appropriately according to design requirements.
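  • The illustrative numbers in this passage (a finger touch of about 70 g and a pressure touch at twice that, above 140 g) reduce to a small threshold check. The Kotlin sketch below is merely an interpretation of that example; the constant names are invented for illustration.

```kotlin
// Assumed pressure-based classification, using the illustrative values from the text:
// a finger touch applies roughly 70 g, and twice that (>= 140 g) counts as a pressure touch.
const val FINGER_BASELINE_GRAMS = 70f
const val PRESSURE_TOUCH_FACTOR = 2f

fun isPressureTouch(pressureGrams: Float): Boolean =
    pressureGrams >= FINGER_BASELINE_GRAMS * PRESSURE_TOUCH_FACTOR

fun main() {
    println(isPressureTouch(75f))   // false: ordinary finger touch
    println(isPressureTouch(150f))  // true: pressure touch action
}
```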
  • A further implementation of determining whether the touch gesture matches a preset gesture is to determine, based on the touch information, whether the touch gesture is a multi-touch gesture: one or more contacts touching the touch-sensitive surface simultaneously or sequentially are identified from the touch information, and from this it is determined whether the touch gesture is a multi-touch gesture.
  • The multi-finger touch can be two, three, four, five, or even more fingers touching the touch-sensitive surface simultaneously or sequentially.
  • The trajectory of the multi-touch gesture can be the trajectory along which the center point of the plurality of contacts touching the touch-sensitive surface simultaneously or sequentially moves on the touch-sensitive surface.
  • The preset distance threshold can be, for example, 1/2 of the width of the display.
  • The preset trajectory may be a straight line or an approximately straight trajectory, or any curve or approximately curved trajectory that the electronic device is able to detect, such as an S shape, a Z shape, or an X shape.
  • It may also be determined whether the starting position of the touch gesture is adjacent to a predetermined area of a first edge of the touch-sensitive display unit, and/or whether the end position of the touch gesture is outside the touch-sensitive display unit or adjacent to a predetermined area of a second edge of the touch-sensitive display unit.
  • The first edge and the second edge may be two oppositely disposed edges located on the two sides of the touch-sensitive display unit; for example, the first edge is the left edge of the touch-sensitive display unit and the second edge is the right edge, or the first edge is the upper edge and the second edge is the lower edge.
  • An end position of the touch gesture outside the screen means that the trajectory of the touch gesture slides out of the effective area of the touch-sensitive display unit. (These checks are combined in the sketch below.)
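  • The checks above (movement distance against a threshold, the trajectory's angle to the horizontal or vertical axis, and start/end positions relative to the display edges) can be combined into a single matching routine. The following Kotlin sketch, including the computation of the multi-touch center point, is a hedged illustration; the edge margin, angle limit, and helper names are assumptions of this sketch.

```kotlin
import kotlin.math.abs
import kotlin.math.atan2
import kotlin.math.hypot

data class Point(val x: Float, val y: Float)

// Center point of several simultaneous contacts (used as the multi-touch trajectory point).
fun centerOf(contacts: List<Point>): Point =
    Point(contacts.map { it.x }.average().toFloat(), contacts.map { it.y }.average().toFloat())

class TrajectoryMatcher(
    private val displayWidth: Float,
    private val edgeMargin: Float = 48f,  // assumed size of the "predetermined area" near an edge
    private val maxAngleDeg: Float = 20f  // assumed preset angle to the horizontal axis
) {
    // Movement distance must exceed the preset threshold (e.g. half the display width).
    private fun farEnough(start: Point, end: Point) =
        hypot(end.x - start.x, end.y - start.y) >= displayWidth / 2f

    // Angle between the trajectory and the horizontal axis must be below the preset angle.
    private fun nearlyHorizontal(start: Point, end: Point): Boolean {
        val deg = Math.toDegrees(atan2(abs(end.y - start.y), abs(end.x - start.x)).toDouble())
        return deg < maxAngleDeg
    }

    // Start near the first (left) edge; end near the second (right) edge or off the screen.
    private fun startsAtFirstEdge(p: Point) = p.x <= edgeMargin
    private fun endsAtSecondEdgeOrOutside(p: Point) =
        p.x >= displayWidth - edgeMargin || p.x < 0f

    fun matches(trajectory: List<Point>): Boolean {
        if (trajectory.size < 2) return false
        val start = trajectory.first()
        val end = trajectory.last()
        return farEnough(start, end) && nearlyHorizontal(start, end) &&
            startsAtFirstEdge(start) && endsAtSecondEdgeOrOutside(end)
    }
}
```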
  • the display interface of the display is divided into at least two display regions in response to the touch gesture (step 2).
  • the display interface of the display is divided into at least two display areas, wherein at least one of the display areas displays an operation interface of the first application running on the display interface before the split screen. It can be understood that the size of the running interface of the first application is reduced.
  • The embodiment of the present invention takes dividing the display interface of the display into two display areas as an example; the two display areas may be arranged left and right or up and down.
  • It can be understood that the display interface can also be divided into three, four, or more display areas according to the split-screen method of the embodiment of the present invention, which are not listed here one by one.
  • the implementation of dividing the display interface into two display areas may divide the display interface into a first display area and a second display area according to a preset boundary line. It can be understood that the size of the running interface of the first application running on the display interface before the split screen is reduced by a predetermined size.
  • Features such as the position, shape, and color of the preset boundary line may be preset and stored in the memory 120.
  • the preset boundary line preferably divides the display interface of the display into two display areas.
  • the preset dividing line may be a horizontal dividing line, and the display interface is evenly divided into upper and lower display areas having the same area.
  • the display area above the boundary line is the first display area, and the display area below the boundary line is the second display area.
  • the preset boundary line may also be a vertical dividing line, and the display interface is evenly divided into two display areas with the same area on the left and the right. It can be understood by those skilled in the art that the position of the preset boundary line can be located at any position of the display interface, which is not specifically limited by the present invention.
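  • A split along a preset boundary line amounts to cutting the display rectangle at a fixed position, for example halfway. The Kotlin sketch below assumes a simple rectangle type of its own; it is an illustration, not the patent's implementation.

```kotlin
// Hypothetical split of the display interface along a preset boundary line.
data class Area(val left: Int, val top: Int, val right: Int, val bottom: Int)

enum class Orientation { HORIZONTAL, VERTICAL }

// A preset horizontal line splits the interface into equal upper/lower areas;
// a preset vertical line splits it into equal left/right areas.
fun splitByPresetLine(display: Area, orientation: Orientation): Pair<Area, Area> = when (orientation) {
    Orientation.HORIZONTAL -> {
        val midY = (display.top + display.bottom) / 2
        Area(display.left, display.top, display.right, midY) to      // first display area (above)
            Area(display.left, midY, display.right, display.bottom)  // second display area (below)
    }
    Orientation.VERTICAL -> {
        val midX = (display.left + display.right) / 2
        Area(display.left, display.top, midX, display.bottom) to     // first display area (left)
            Area(midX, display.top, display.right, display.bottom)   // second display area (right)
    }
}
```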
  • Another implementation manner of dividing the display interface into two display areas may divide the display interface into a first display area and a second display area according to the trajectory of the touch gesture.
  • the implementation includes dividing a display interface of the display into a first display area and a second display area by using a trajectory of the touch gesture as a boundary line.
  • The trajectory of the touch gesture may be a straight line or a gesture trajectory close to a straight line, or any curve or approximately curved gesture trajectory that the electronic device is able to detect, such as an S shape, a Z shape, or an X shape.
  • In the embodiment of the present invention, the trajectory of the touch gesture is preferably a straight line.
  • The direction in which the touch gesture moves may be horizontal, vertical, or any other direction that the electronic device is capable of detecting.
  • the boundary line is a straight line in the horizontal direction.
  • the display area above the boundary line is the first display area
  • the display area below the boundary line is the second display area.
  • the boundary line is a straight line in the vertical direction.
  • The display area to the left of the dividing line is the first display area, and the display area to the right of the dividing line is the second display area.
  • a further implementation manner of dividing the display interface into two display areas may perform a split screen operation according to the direction of the touch gesture and the coordinates of the start position of the touch gesture.
  • In this implementation, the direction in which the touch gesture moves is the direction of the boundary line, and the position of the boundary line is determined according to the coordinates of the starting position of the touch gesture.
  • When the boundary line is horizontal, the y-axis coordinate of the boundary line is the y-axis coordinate of the starting position of the touch gesture; the display area above the boundary line is the first display area, and the display area below the boundary line is the second display area.
  • When the boundary line is vertical, the x-axis coordinate of the boundary line is the x-axis coordinate of the starting position of the touch gesture; the display area to the left of the boundary line is the first display area, and the display area to the right of the boundary line is the second display area.
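  • In this implementation the boundary line inherits the gesture's direction and passes through the gesture's starting coordinate. A minimal Kotlin sketch under those assumptions (the rectangle type and function names are invented for illustration):

```kotlin
// Hypothetical split driven by the touch gesture's direction and starting position.
data class DisplayRect(val left: Int, val top: Int, val right: Int, val bottom: Int)

enum class GestureDirection { HORIZONTAL, VERTICAL }

fun splitByGesture(
    display: DisplayRect,
    direction: GestureDirection,
    startX: Int,
    startY: Int
): Pair<DisplayRect, DisplayRect> = when (direction) {
    // Horizontal gesture: the boundary is horizontal, its y coordinate is the gesture's start y.
    GestureDirection.HORIZONTAL ->
        DisplayRect(display.left, display.top, display.right, startY) to      // first area (above)
            DisplayRect(display.left, startY, display.right, display.bottom)  // second area (below)
    // Vertical gesture: the boundary is vertical, its x coordinate is the gesture's start x.
    GestureDirection.VERTICAL ->
        DisplayRect(display.left, display.top, startX, display.bottom) to     // first area (left)
            DisplayRect(startX, display.top, display.right, display.bottom)   // second area (right)
}
```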
  • the display interfaces shown in FIGS. 4-7 can also be implemented according to multi-finger touch gestures and pressure touch gestures.
  • the trajectory of the joint touch gesture is a boundary line.
  • the boundary line may also be a preset boundary line, and the boundary line may be determined according to the direction of the touch gesture and the coordinates of the start position of the touch gesture.
  • the direction of the arrow is the direction in which the joint touch gesture moves.
  • the dotted line is the joint touch gesture movement trajectory.
  • The running interface of the first application is displayed on the display interface before the split; after the split, the display interface is divided into two display areas.
  • The two display areas include a first display area and a second display area.
  • The first display area may display the running interface of the first application that was running before the split; it can be understood that the size of this running interface is reduced, and the second display area is generated in the display area outside the reduced running interface of the first application.
  • The second display area may display the identifiers of one or more applications associated with the first application for the user to select.
  • When the user selects the identifier of a second application, the second application is executed and the running interface of the second application is displayed in the second display area.
  • the correlation between the respective applications can be set in advance and stored in the memory 120.
  • the identifier of the application may be an icon of the application.
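  • The associations between applications stored in the memory can be modeled as a map from an application to its related identifiers, which the second display area then presents for selection. The application names below are taken from the example that follows; the map-based lookup itself is an assumed illustration, not the patent's data structure.

```kotlin
// Assumed lookup of applications related to the first application, whose identifiers
// (e.g. icons) are shown in the second display area after the split.
data class AppId(val name: String)

// Correlations between applications, set in advance and stored in memory (per the description).
val relatedApps: Map<AppId, List<AppId>> = mapOf(
    AppId("e-book") to listOf(AppId("memo"), AppId("dictionary"), AppId("text editing"))
)

// Identifiers to display in the second display area for the user to select.
fun identifiersForSecondArea(firstApp: AppId): List<AppId> =
    relatedApps[firstApp].orEmpty()

fun main() {
    println(identifiersForSecondArea(AppId("e-book")))  // prints the identifiers related to "e-book"
}
```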
  • FIG. 4 is an exemplary display interface after splitting the screen according to a touch gesture according to an embodiment of the present invention.
  • "e-book”, “memo”, “dictionary”, and “text editing” are set in advance as related applications, and the correlation between the respective applications is stored in the memory 120.
  • the running interface of the "e-book” is displayed on the display interface before the split screen. After splitting the screen, the first display area is displayed.
  • the operation interface of the "e-book” may be displayed, and the second display area may display an application identifier related to the "e-book” such as "memo", "dictionary”, “text editing".
  • the display interface of the first display area reduces the size of the running interface of the "e-book" before the split screen.
  • a scroll bar 400 is displayed on the right side of the first display area. It is also understood by those skilled in the art that when the user selects any identifiers of "memo”, “dictionary”, and “text editing", the application corresponding to the selected identifier is executed, and the second display area is displayed. The running interface of the application corresponding to the selected identifier.
  • Alternatively, after the split, the running interface of the first application may be displayed in the first display area and the running interface of the second application in the second display area; that is, after the split, the second application is automatically run and displayed in the second display area, saving the user from manually selecting the second application.
  • FIG. 5 is another exemplary display interface after splitting the screen according to a touch gesture according to an embodiment of the present invention.
  • "e-book” and “memo” are set in advance as related applications.
  • the running interface of the "e-book” is displayed on the display interface before the split screen.
  • the first display area displays the running interface of the "e-book”
  • the second display area displays the running interface of the "memo", which makes it convenient for the user to take study notes on the "e-book" and perform other related operations in the "memo" running interface.
  • the first display area displays an operation interface of the first application.
  • the second display area can display a main menu interface, such as a home interface.
  • the main menu interface displays application identifiers for the user to select. After it is detected that the user selects the identifier of a second application displayed in the second display area, the second application is executed, and the running interface of the second application is displayed in the second display area.
  • FIG. 6 is still another exemplary display interface of splitting the screen according to a touch gesture, provided in an embodiment of the present invention.
  • the running interface of the "e-book” is displayed on the display interface before the split screen.
  • after the split screen, the first display area displays the running interface of the "e-book".
  • the second display area displays a main menu interface, and the main menu interface includes application identifiers for the user to select, for example the identifiers of applications such as "camera", "information", and "video". Understandably, after it is detected that the user selects the identifier of any application, the selected application is executed, and the running interface of the selected application is displayed in the second display area.
  • in still another implementation of the display interface after the split screen, the second display area may also display the identifiers of the applications that the user has recently executed (hereinafter referred to as "history program identifiers") for the user to select. After it is detected that the user selects a history program identifier, the selected application is executed, and the running interface of the selected application is displayed in the second display area. As can be understood by those skilled in the art, the second display area may also display thumbnails of the running interfaces of the history programs.
  • FIG. 7 is yet another exemplary display interface of splitting the screen according to a touch gesture, provided in an embodiment of the present invention.
  • the running interface of the "e-book” is displayed on the display interface before the split screen.
  • assume that the user has executed the "information", "camera", "album", and "phone" applications before the split screen.
  • after the split screen, the first display area displays the running interface of the "e-book".
  • the second display area displays the identifiers of the "information", "camera", "album", and "phone" applications for the user to select. It can be understood that after the user selects any application identifier, the application corresponding to the selected identifier is executed, and the running interface of the application corresponding to the selected identifier is displayed in the second display area.
  • optionally, when an operation instruction to move an identifier displayed in the second display area to the first display area is detected, the moved identifier or the running interface of the application corresponding to the moved identifier is displayed in the first display area.
  • for example, when the first display area displays the running interface of the "e-book", the second display area displays the identifiers of applications such as "camera", "information", and "video", and an operation instruction to move the "camera" identifier to the first display area is detected, the running interface of "camera" is displayed in the first display area, replacing the running interface of the "e-book".
  • when the first display area displays the main menu interface, the second display area displays the identifiers of applications such as "camera", "information", and "video", and an operation instruction to move the "camera" identifier to the first display area is detected, the "camera" identifier is moved to the main menu interface.
  • optionally, when an operation instruction to move content displayed in a display area other than the first display area to the first display area is detected, the moved content is embedded in the running interface of the first application.
  • for example, when the first display area displays the running interface of "text editing", and an operation instruction to move the identifier of an application such as "video" displayed in the second display area to the first display area is detected, the running interface or identifier of "video" is embedded in the "text editing" interface.
  • when the first display area displays the running interface of "text editing", and an operation instruction to move a picture displayed in the second display area to the first display area is detected, the picture displayed in the second display area is embedded in the "text editing" interface (a rough sketch of this move-and-embed handling is given below).
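The move-and-embed handling described above could be sketched as follows; the `FirstArea` model and the function names are assumptions made for illustration only and are not part of the patent.

```kotlin
/** Content currently shown in the first display area (hypothetical modelling). */
sealed class FirstArea {
    data class AppInterface(
        val appId: String,
        val embedded: MutableList<String> = mutableListOf()   // identifiers, interfaces, or pictures embedded so far
    ) : FirstArea()
    data class MainMenu(val icons: MutableList<String>) : FirstArea()
}

/** Apply a "move an identifier from the second area to the first area" instruction. */
fun moveIdentifier(first: FirstArea, movedId: String): FirstArea = when (first) {
    is FirstArea.AppInterface -> FirstArea.AppInterface(movedId)   // e.g. replace "e-book" with "camera"
    is FirstArea.MainMenu -> first.also { it.icons.add(movedId) }  // drop the icon into the main menu
}

/** Apply a "move content into the first area" instruction by embedding it in the running interface. */
fun embedContent(first: FirstArea.AppInterface, content: String): FirstArea.AppInterface =
    first.also { it.embedded.add(content) }                        // e.g. embed a picture into "text editing"
```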
  • the position of the boundary line can be adjusted after the split screen.
  • when the position of the dividing line on the touch-sensitive display unit is changed, the size and/or position of the display areas after the split screen change accordingly.
  • when a long press on the boundary line is detected for longer than a predetermined time, the boundary line is highlighted to prompt the user that its position can be adjusted. The user can then adjust the position of the dividing line by touching the touch-sensitive display unit and moving the dividing line.
  • FIG. 8 is an exemplary display interface of adjusting the position of the boundary line according to a touch gesture, provided in an embodiment of the present invention.
  • as shown in FIG. 8, when a long press on the boundary line exceeding the predetermined time is detected and the boundary line is then touched and moved, the position of the boundary line is adjusted. It can be understood that when the touch gesture for adjusting the position of the boundary line moves from the first display area toward the second display area, the size of the display interface of the first display area is increased, and the size of the display interface of the second display area is reduced (see the sketch below).
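A minimal sketch of the boundary-adjustment step, assuming a long-press threshold and a hit tolerance that the patent does not specify:

```kotlin
const val LONG_PRESS_MS = 800L           // assumed threshold; the text only says "a predetermined time"
const val BOUNDARY_HIT_SLOP_PX = 24f     // assumed touch tolerance around the boundary line

/** True when a press close to the boundary has lasted long enough to start adjusting it. */
fun boundarySelected(pressDurationMs: Long, distanceToBoundaryPx: Float): Boolean =
    pressDurationMs >= LONG_PRESS_MS && distanceToBoundaryPx <= BOUNDARY_HIT_SLOP_PX

/** Boundary position along the axis perpendicular to the line, clamped to the display bounds. */
data class SplitLayout(val boundaryPos: Float, val minPos: Float, val maxPos: Float)

/** Dragging the boundary toward the second area enlarges the first area and shrinks the second. */
fun dragBoundary(layout: SplitLayout, dragTo: Float): SplitLayout =
    layout.copy(boundaryPos = dragTo.coerceIn(layout.minPos, layout.maxPos))
```

Clamping the dragged position keeps both display areas within the display bounds while their sizes change with the boundary.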
  • further, when the display interface has been divided into a first display area and a second display area, a touch gesture acting on the two display areas is detected, and the trajectory of the touch gesture matches a preset graphical trajectory, the positions or display content of the first display area and the second display area are swapped.
  • the preset graphical trajectory may be a clockwise or counterclockwise curved rotational trajectory or any other form of trajectory that the electronic device can detect.
  • FIG. 9 is an exemplary display interface of swapping the split-screen display content according to a touch gesture, provided in an embodiment of the present invention. As shown in FIG. 9, when a touch gesture acting on the first display area and the second display area is detected and the trajectory of the touch gesture is a counterclockwise curved rotation trajectory matching the preset graphical trajectory, the display content of the first display area and the second display area is swapped (a rough matcher is sketched below).
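One possible, deliberately rough way to recognize such a curved rotation trajectory and swap the areas is sketched below; the quarter-turn threshold is an assumption, and a production matcher would more likely compare the trajectory against stored template trajectories.

```kotlin
import kotlin.math.PI
import kotlin.math.abs
import kotlin.math.atan2

/** Accumulated signed turning angle along the trajectory (the sign depends on the y-axis convention). */
fun totalTurning(points: List<Pair<Float, Float>>): Double {
    var total = 0.0
    for (i in 1 until points.size - 1) {
        val (x0, y0) = points[i - 1]
        val (x1, y1) = points[i]
        val (x2, y2) = points[i + 1]
        val a1 = atan2((y1 - y0).toDouble(), (x1 - x0).toDouble())
        val a2 = atan2((y2 - y1).toDouble(), (x2 - x1).toDouble())
        var d = a2 - a1
        while (d > PI) d -= 2 * PI          // normalize the turn to (-pi, pi]
        while (d < -PI) d += 2 * PI
        total += d
    }
    return total
}

/** Very rough rotation matcher: roughly a quarter turn or more counts as a match (assumed threshold). */
fun matchesRotationTrajectory(points: List<Pair<Float, Float>>): Boolean =
    abs(totalTurning(points)) >= PI / 2

/** Swap the display content of the two areas once the rotation gesture is recognized. */
fun <T> swapAreas(first: T, second: T): Pair<T, T> = second to first
```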
  • further, after the display interface has been divided into the first display area and the second display area, when an exit-split-screen gesture is detected, the first display area and the second display area are merged into one display area, thereby exiting the split-screen interface.
  • exiting the split-screen interface may move the second application of the second display area to run in the background.
  • after the second application is moved to the background, if the user resumes the split-screen operation, the running interface of the second application may be displayed again in the corresponding split-screen area.
  • one implementation of the exit-split-screen gesture may be a touch gesture that slides from the first display area toward the second display area.
  • the starting position of the touch gesture exiting the split screen interface may be in the first display area; and the ending position of the touch gesture exiting the split screen interface may be in the second display area.
  • in another implementation of the exit-split-screen gesture, two or more fingers may contact the touch-sensitive display unit simultaneously or sequentially, and the distance between the fingers gradually decreases or increases after the contact.
  • the specific implementation of exiting the split-screen interface can be adapted to specific design requirements and is not specifically limited in the present invention (a rough pinch-detection sketch follows below).
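A small sketch of the two exit gestures mentioned above, with assumed thresholds and an assumed input format:

```kotlin
import kotlin.math.abs
import kotlin.math.hypot

/** One sampled frame of a two-finger contact (hypothetical input format). */
data class TwoFingerFrame(val x1: Float, val y1: Float, val x2: Float, val y2: Float) {
    val spread: Float get() = hypot(x2 - x1, y2 - y1)
}

/** A clearly shrinking or growing finger spread is treated as the exit gesture (assumed threshold). */
fun isExitPinchOrSpread(first: TwoFingerFrame, last: TwoFingerFrame, minChangePx: Float = 120f): Boolean =
    abs(last.spread - first.spread) >= minChangePx

/** Alternative exit gesture: a slide that starts in the first display area and ends in the second. */
fun isExitSlide(startsInFirstArea: Boolean, endsInSecondArea: Boolean): Boolean =
    startsInFirstArea && endsInSecondArea
```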
  • after exiting the split screen and merging the first display area and the second display area into one display area, when a resume-split-screen touch gesture acting on the merged display area is detected, the merged display area is split back into the split-screen areas that existed before the merge, so as to restore the split-screen interface; the resume-split-screen touch gesture may be a touch gesture that acts on the merged display area.
  • the resume-split-screen touch gesture may be a joint touch gesture, a multi-finger touch gesture, a pressure touch gesture, or another gesture set according to design requirements, which is not specifically limited in the present invention.
  • further, after a touch gesture matching the preset gesture is detected acting on the split-screen interface, the split-screen interface may be divided into more display areas, for example by adding a third display area.
  • the split screen interface includes a first display area and a second display area.
  • the third display area may display the main menu interface, and may also display an identifier or an operation interface of the application related to the application displayed by the first display area and/or the second display area, which is not specifically limited by the present invention.
  • FIG. 10 is an exemplary display interface for dividing a split screen interface into more display areas according to a touch gesture.
  • as shown in FIG. 10, the split-screen interface includes a first display area and a second display area.
  • when a joint touch gesture matching the preset gesture is received acting on the second display area, the second display area is divided, in response to the joint touch gesture, into two display areas with the trajectory of the joint touch gesture as the boundary line.
  • the area above the boundary line displays the display interface of the original second display area, which can be understood as reducing the size of the display interface of the original second display area.
  • the display area below the boundary line is the third display area, and it may display the main menu interface.
  • An embodiment of the present invention further provides an electronic device that implements fast split screen.
  • the electronic device includes a touch sensitive display unit 130, a memory 120, an acceleration sensor 151, and a processor 190.
  • the touch sensitive display unit 130 can be a display having a touch sensitive surface, the touch sensitive display unit 130 including a touch sensitive surface and a display screen.
  • the touch-sensitive display unit 130 is configured to present a display interface, and is further configured to receive touch information applied to the touch-sensitive surface and transmit the touch information to the processor 190; the touch information may include one or more of contact coordinates, grid capacitance values of the touch-sensitive surface, and a touch action; the touch action may include pressing, moving, and lifting actions.
  • the memory 120 stores instructions.
  • the acceleration sensor is configured to acquire an acceleration signal in the Z-axis direction and transmit the acquired acceleration signal in the Z-axis direction to the processor 190.
  • the processor 190 invokes the instructions stored in the memory 120 to implement the corresponding functions in the methods described above. For example, when a touch gesture acting on the touch-sensitive surface is detected, it is determined whether the touch gesture matches a preset gesture; when it is determined that the touch gesture matches the preset gesture, the display interface of the display is divided into at least two display areas in response to the touch gesture, and/or functions corresponding to the other methods described herein are implemented (a minimal sketch of the gesture-match-then-split flow is given below).
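The overall flow the processor implements can be summarized in a short sketch; the class and interface names are illustrative and not defined by the patent.

```kotlin
/** Minimal model of the control flow; names are invented for illustration. */
interface GestureMatcher {
    fun matches(trajectory: List<Pair<Float, Float>>): Boolean
}

class SplitScreenController(
    private val matcher: GestureMatcher,
    private val splitDisplay: () -> Unit   // divides the display interface into at least two areas
) {
    fun onTouchGesture(trajectory: List<Pair<Float, Float>>) {
        if (matcher.matches(trajectory)) {
            splitDisplay()                 // respond to the matching gesture by splitting the screen
        }
        // a non-matching gesture is handled as an ordinary touch input
    }
}
```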
  • the electronic device further includes a pressure sensor 196 for detecting pressure applied to the electronic device and transmitting the detected pressure value to the processor 190.
  • based on the same inventive concept, because the problem-solving principle of the electronic device is similar to that of the split-screen method in the method embodiments of the present invention, the implementation of the electronic device may refer to the implementation of the method, and repeated descriptions are not repeated here.
  • Embodiments of the present invention also provide an apparatus for implementing fast split screen.
  • the device comprises a detecting unit, a judging unit and a split screen unit.
  • the detection unit is for detecting a touch gesture acting on the touch-sensitive surface.
  • the determining unit is configured to determine whether the touch gesture matches a preset gesture.
  • the split-screen unit is configured to divide the display interface of the display into at least two display areas when the touch gesture matches the preset gesture (the three units are sketched as interfaces below).
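For illustration only, the detecting unit, judging unit, and split-screen unit could be modelled as plain interfaces wired together as follows (all names assumed):

```kotlin
/** The three functional units of the apparatus, sketched as plain interfaces. */
interface DetectingUnit { fun latestGesture(): List<Pair<Float, Float>>? }
interface JudgingUnit { fun matchesPresetGesture(trajectory: List<Pair<Float, Float>>): Boolean }
interface SplitScreenUnit { fun divideDisplayInterface(areaCount: Int) }

/** One pass through the units: detect, judge, and split into at least two display areas. */
fun handleTouch(detect: DetectingUnit, judge: JudgingUnit, split: SplitScreenUnit) {
    val gesture = detect.latestGesture() ?: return
    if (judge.matchesPresetGesture(gesture)) split.divideDisplayInterface(2)
}
```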
  • based on the same inventive concept, because the principle by which the apparatus solves the problem is similar to that of the split-screen method in the method embodiments of the present invention, the implementation of the apparatus may refer to the implementation of the method, and repeated descriptions are not repeated here.
  • the technical solutions adopted by the quick screen-splitting method, the electronic device, and the apparatus of the embodiments of the present invention disclose dividing the display interface of the display into at least two display areas by means of a touch gesture.
  • the electronic device may divide the display interface of the electronic device into at least two display areas according to the recognized touch gesture that matches the preset gesture.
  • compared with the prior art, the present invention enables the user to implement the split-screen function more conveniently by means of a touch gesture, and different display interfaces can be presented separately in the provided split-screen areas. Therefore, the technical solution for the split-screen operation provided by the present invention can perform the split-screen operation simply and quickly, simplifies the operation steps of splitting the screen, and thereby improves the user experience.
  • the split screen solution may adopt any combination of the foregoing solutions, which is not specifically limited in this embodiment of the present invention.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)
  • Telephone Function (AREA)
  • Position Input By Displaying (AREA)
  • Digital Computer Display Output (AREA)

Abstract

Embodiments of the present invention disclose a quick screen-splitting method, apparatus, electronic device, display interface, and storage medium, and relate to the field of electronics. The method includes: when a touch gesture acting on a touch-sensitive surface is detected, determining whether the touch gesture matches a preset gesture; and when it is determined that the touch gesture matches the preset gesture, dividing a display interface of the display into at least two display areas in response to the touch gesture.

Description

一种快速分屏的方法、装置、电子设备、显示界面以及存储介质 技术领域
本发明实施例涉及一种快速分屏的方法、装置、电子设备、显示界面以及存储介质;特别涉及使用触摸手势将具有触敏表面的显示器的显示界面划分为至少两个显示区域的方法。
背景技术
近年来,随着通信技术和电子产业的飞速发展,以数据、语音、视频为基础的新业务发展迅猛。微电子技术、计算机软硬件技术的快速发展,使得终端具备越来越强大的功能。用户对终端设备有迫切的需求,希望终端设备的功能更强大、更灵活、更便捷,为用户提供更便捷的用户体验。
终端设备已经逐渐成为人们日常生活不可缺少的物品。为了使人们更方便地使用终端设备以及给人们提供更好的视觉体验,终端设备屏幕的尺寸在不断扩大,随着能够支持众多应用的大屏幕触控终端设备的发展,对终端设备能够协同处理多个任务的需求也越来越迫切。
然而,在用户希望终端设备支持同时处理多个任务的同时,为了追求更好的用户体验,还希望在同一个显示界面同时呈现多个应用程序的显示区域。现有的终端设备启动分屏应用时,需要触发分屏按键或虚拟开关与触摸手势的配合执行分屏操作,且分屏后的分屏区域的窗口大小和数量是固定的。基于目前用户对在同一显示界面同时呈现多个应用程序的显示区域的迫切需求,以及当前分屏操作步骤繁杂的缺陷。因此,有必要提出相应的技术方案,将终端设备显示界面进行灵活划分,为运行新的应用程序、同时执行多任务提供便利。
发明内容
为了改进现有技术分屏操作的用户体验,本发明实施例提供了一种快速分屏的技术方案。
上述目标和其他目标将通过独立权利要求中的特征来达成。进一步的实现方式在从属权利要求、说明书和附图中体现。
第一方面,提供了一种快速分屏的方法,应用于一种便携式电子设备上,所述电子设备包括具有触敏表面的显示器,所述方法包括:当检测到作用于所述触敏表面的关节触摸手势时,响应所述关节触摸手势,将所述显示器的显示界面划分为至少两个显示区域。通过预先设置触发分屏的关节触摸手势,当检测到作用于所述触敏表面的关节触摸手势时,则执行分屏操作。通过一个手势则可以实现分屏操作,简单快捷,优化了现有技术分屏的操作流程,简化了分屏的操作步骤,进而改进了用户体验。
根据第一方面,在所述快速分屏的第一种可能的实现方式中,所述关节触摸手势由关节触摸动作组成;当作用于所述触敏表面的触摸动作产生的触敏表面网格电容值在第一预设电容值范围,非零电容值的网格个数小于预设值,且Z轴方向加速度信号在第一预设加速度范围内时,则所述触摸动作是所述关节触摸动作,由所述关节触摸动作组成的手势是所述关节触摸手势。由于电子设备手指触摸手势定义比较普遍,通过关节触摸手势实现快速分屏功能,拓展了一种新的手势,且由于关节触摸手势的判断条件与手指触摸手势的判断条件不同,不容易因误操作进入分屏界面,使分屏操作更便捷,提升了用户体验。
根据第一方面的实现方式,在所述快速分屏的第二种可能的实现方式中,所述当检测到作用于所述触敏表面的关节触摸手势后,判断所述关节触摸手势的移动距离是否大于预设距离阈值。通过限定触发分屏操作的触摸手势的移动距离,降低了因误操作触发分屏操作的概率,提升了用户体验。
根据第一方面的实现方式,在所述快速分屏的第三种可能的实现方式中,所述当检测到作用于所述触敏表面的关节触摸手势后,判断所述关节触摸手势的轨迹是否与预设轨迹匹配。通过限定触发分屏操作的触摸手势的轨迹,降低了因误操作触发分屏操作的概率,提升了用户体验。
根据第一方面的实现方式,在所述快速分屏的第四种可能的实现方式中,所述当检测到作用于所述触敏表面的关节触摸手势后,判断所述关节触摸手势的轨迹与水平轴向或垂直轴向所成夹角的角度是否小于预设角度。
根据第一方面的实现方式,在所述快速分屏的第五种可能的实现方式中,所述将所述显示器的显示界面划分为至少两个显示区域,包括:根据预设分界线将所述显示器的显示界面划分为至少两个显示区域。通过预设分界线的位置,可以结合显示器的尺寸,预先将分界线位置设置在相对合理的位置。
根据第一方面的实现方式,在所述快速分屏的第六种可能的实现方式中,所述将所述显示器的显示界面划分为至少两个显示区域,包括:根据所述触摸手势的轨迹将所述显示器的显示界面划分为至少两个显示区域。通过以触摸手势的轨迹作为分界线,可以随用户触摸手势的轨迹灵活调整分界线位置以及分界线形状,提供更灵活多样化的分屏显示界面。
根据第一方面的实现方式,在所述快速分屏的第七种可能的实现方式中,所述将所述显示器的显示界面划分为至少两个显示区域,包括:根据所述触摸手势的方向和所述触摸手势起始位置的坐标将所述显示器的显示界面划分为至少两个显示区域。分界线的位置与所述触摸手势的起始位置的坐标相关,因此用户可以根据分屏前所显示的应用程序的运行界面,有目的的调整触摸手势的起始位置,用户可以通过调整触摸手势的起始位置和触摸手势的方向决定分界线的位置,进而根据所述触摸手势的起始位置和方向调整所述至少两个显示区域的位置和/或尺寸。提升了分屏操作的用户体验。
根据第一方面或以上第一方面的任意一种可能的实现方式,在所述快速分屏的第八种可能的实现方式中,所述将所述显示器的显示界面划分为至少两个显示区域,包括:当所述显示器的显示界面显示第一应用程序的运行界面 时,缩小所述第一应用程序的运行界面的尺寸,在第一显示区域显示缩小的第一应用程序的运行界面;并在第一显示区域以外的显示区域生成至少一个显示区域,在所生成的显示区域显示与所述第一应用程序相关的一个或多个应用程序的标识;或当所述显示器的显示界面显示第一应用程序的运行界面时,缩小所述第一应用程序的运行界面的尺寸,在第一显示区域显示缩小的第一应用程序的运行界面;并在第一显示区域以外的显示区域生成至少一个显示区域,在所生成的显示区域显示与所述第一应用程序相关的应用程序的运行界面;或当所述显示器的显示界面显示第一应用程序的运行界面时,缩小所述第一应用程序的运行界面的尺寸,在第一显示区域显示缩小的第一应用程序的运行界面;并在第一显示区域以外的显示区域生成至少一个显示区域,在所生成的显示区域显示主菜单界面;或当所述显示器的显示界面显示第一应用程序的运行界面时,缩小所述第一应用程序的运行界面的尺寸,在第一显示区域显示缩小的第一应用程序的运行界面;并在第一显示区域以外的显示区域生成至少一个显示区域,在所生成的显示区域显示历史程序标识;或当所述显示器的显示界面显示第一应用程序的运行界面时,缩小所述第一应用程序的运行界面的尺寸,在第一显示区域显示缩小的第一应用程序的运行界面;并在第一显示区域以外的显示区域生成至少一个显示区域,在所生成的显示区域显示历史程序的运行界面的缩略图。分屏后显示界面的一个显示区域显示第一应用程序的运行界面,分屏后第一应用程序的运行界面以外的显示区域所显示的内容提升了用户操作的便利性。进而提升了用户体验。
根据第一方面,在所述快速分屏的第九种可能的实现方式中,当检测到将第一显示区域以外的显示区域所显示的标识移动到第一显示区域的操作指令时,在所述第一显示区域显示所述标识对应的应用程序的运行界面;或当检测到将第一显示区域以外的显示区域所显示的标识移动到第一显示区域的操作指令时,在所述第一显示区域显示所述标识;或当检测到将第一显示区域以外的显示区域所显示的标识移动到第一显示区域的操作指令时,将所述标识嵌入所述缩小的第一应用程序的运行界面;或当检测到将第一显示区域 以外的显示区域所显示的内容移动到第一显示区域的操作指令时,将移动的内容嵌入所述缩小的第一应用程序的运行界面。分屏后显示区域之间的互动操作,使电子设备操作更便捷,提升了用户体验。
在一种可能的设计中,所述触发分屏操作的手势还可以是压力触摸手势。由于压力触摸手势给人们带来3D的触觉体验,越来越受人们的喜爱,同时与手指触摸相比,不容易被误触发,简化了分屏的操作步骤,改进了分屏的操作体验。
在一种可能的设计中,当判断所述触摸手势与预设手势匹配后,判断所述触摸手势的起始位置是否临近触敏显示单元的第一边缘的预定区域,和/或所述触摸手势的终止位置是否在所述触敏显示单元以外或临近所述触敏显示单元第二边缘的预定区域。通过限定触发分屏操作的判断条件,降低了因误操作触发分屏操作的概率,提升了用户体验。
在一种可能的设计中,将所述显示器的显示界面划分为至少两个显示区域后,通过调整至少两个显示区域之间的分界线的位置以调整至少两个显示区域的尺寸和/或位置。方便用户调整分屏后显示区域的尺寸和/或位置,便于用户根据各显示区域的显示界面的显示内容适应性的调整各显示区域的尺寸和/或位置。提升了用户体验。
在一种可能的设计中,将所述显示器的显示界面划分为至少两个显示区域后,当检测到作用于至少两个显示区域的触摸手势,且所述触摸手势的轨迹与预设图形轨迹匹配,置换所述至少两个显示区域的位置或显示内容。方便用户通过触摸手势调整显示区域到便于操控的显示区域。尤其对显示器尺寸比较大的电子设备而言,并不是显示器的所有显示区域都便于触摸操控。提升了用户体验。
在一种可能的设计中,将所述显示器的显示界面划分为至少两个显示区域后,当检测到退出分屏手势时,将所述至少两个显示区域合并成一个显示区域。通过手势操作,方便快捷的将分屏后的至少两个显示区域合并成一个显示区域,使得退出分屏操作方便快捷,提升了用户体验。
第二方面,提供了一种快速分屏的便携式电子设备,所述电子设备具有实现上述方法所对应的功能。所述功能可以通过硬件实现,也可以通过硬件执行相应的软件实现。所述硬件或软件包括一个或多个与上述功能相对应的模块。所述便携式电子设备包括显示器、存储器、加速度传感器和处理器;所述显示器具有触敏表面;所述触敏表面用于接收触摸手势;所述显示器还用于显示界面;所述存储器用于存储指令;所述加速度传感器用于获取Z轴方向的加速度信号并将所获取到的加速度信号传递给处理器;所述处理器调用存储在所述存储器中的指令以实现上述第一方面的方法设计中的方案,重复之处不再赘述。
第三方面,提供了一种快速分屏的装置,所述装置包括检测单元用于检测作用于触敏表面的触摸手势;判断单元用于判断所述触摸手势是否与预设手势匹配;分屏单元用于当所述触摸手势与预设手势匹配时,将所述显示器的显示界面划分为至少两个显示区域。基于同一发明构思,由于该装置解决问题的原理与第一方面的方法设计中的方案对应,因此该装置的实施可以参见方法的实施,重复之处不再赘述。
第四方面,提供了一种便携式电子设备的显示界面,所述便携式电子设备包括显示器、存储器以及用于执行存储在所述存储器中的指令的处理器,其中,所述显示器具有触敏表面:当检测到作用于所述触敏表面的触摸手势时,判断所述触摸手势是否与预设手势匹配;当判断所述触摸手势与预设手势匹配时,则响应所述触摸手势,将所述显示器的显示界面划分为至少两个显示区域,当所述处理器执行存储在所述存储器中的指令时实现上述第一方 面的方法设计中的方案,并在所述便携式电子设备的显示界面显示所述处理器执行所述方案时生成的显示界面。
第五方面,提供了一种存储一个或多个程序的非易失性计算机可读存储介质,所述一个或多个程序包括指令,所述指令当被包括具有触敏表面的显示器的便携式电子设备执行时使所述便携式电子设备执行上述第一方面的方法设计中的方案,重复之处不再赘述。
本发明实施例公开了通过关节触摸手势将显示器的显示界面划分为至少两个显示区域的技术方案。当用户在使用具有触敏显示单元的电子设备的过程中希望执行分屏操作时,只需要在触敏显示单元上执行关节触摸手势即可触发分屏功能。相应地,电子设备可以根据所检测到的关节触摸手势将电子设备的显示界面划分为至少两个显示区域。与现有技术相比,本发明可以使用户通过关节触摸手势更便捷的实现分屏功能,并且可以在所提供的分屏区域分别呈现不同的显示界面。因此,本发明提供的分屏操作的技术方案优化了现有技术分屏的操作流程,简化了分屏的操作步骤,进而改进了用户体验。
附图说明
为了更清楚地说明本发明实施例的技术方案,下面将对实施例描述中所需要使用的附图作简单地介绍,显而易见地,下面描述中的附图仅仅是本发明的一些实施例,对于本领域普通技术人员来讲,在不付出创造性劳动的前提下,还可以根据这些附图获得其他的附图。
图1为本发明实施例提供的便携式电子设备100的内部结构示意图;
图2为本发明实施例提供的便携式电子设备100的外部结构示意图;
图3为本发明实施例提供的分屏的显示界面的示意图;
图4为本发明实施例提供的一种根据触摸手势分屏的示例性显示界面;
图5为本发明实施例提供的另一种根据触摸手势分屏的示例性显示界面;
图6为本发明实施例提供的又一种根据触摸手势分屏的示例性显示界面;
图7为本发明实施例提供的再一种根据触摸手势分屏的示例性显示界面;
图8为本发明实施例提供的根据触摸手势调整分界线位置的示例性显示界面;
图9为本发明实施例提供的根据触摸手势置换分屏显示界面的示例性显示界面;
图10为根据触摸手势将分屏界面划分为更多显示区域的示例性显示界面。
具体实施方式
下面将结合本发明实施例中的附图,对本发明实施例中的技术方案进行清楚、完整地描述,显然,所描述的实施例仅仅是本发明的一部分实施例,而不是全部的实施例。基于本发明中的实施例,本领域普通技术人员在没有做出创造性劳动前提下所获得的所有其他实施例,都属于本发明保护的范围。
为便于说明,本发明中的实施例以包括触敏显示单元的便携式电子设备100作示例性说明。本领域技术人员可以理解的,本发明中的实施例同样适用于其他装置。例如手持设备、车载设备、可穿戴设备、计算设备、以及各种形式的用户设备(User Equipment,UE)、移动台(Mobile station,MS)、终端(terminal)、终端设备(Terminal Equipment)等等。
所述电子设备100可以支持多种应用。例如视频、音频等多媒体应用、文本应用(电子邮件应用,博客应用,网页浏览应用等)、网页浏览、即时通讯应用等。所述电子设备100的触敏显示单元可以直观的呈现所述应用的显示界面。用户能够通过所述电子设备100的触敏显示单元执行各种应用。
图1为本发明实施例提供的便携式电子设备100的内部结构示意图。所述电子设备100可以包括触敏显示单元130、加速度传感器151、压力传感器196、接近光传感器152、环境光传感器153、存储器120、处理器190、射频单元110、音频电路160、扬声器161、麦克风162、WiFi(wireless fidelity,无线保真)模块170、蓝牙模块180、电源193、外部接口197等部件。
本领域技术人员可以理解,图1仅仅是便携式电子设备的举例,并不构成对便携式电子设备的限定。可以包括比图示更多或更少的部件,或者组合 某些部件,或者不同的部件。
所述触敏显示单元130有时为了方便被称为“触摸屏”,并且也可被称为是或者被叫做触敏显示器系统,也可以被称为具有触敏表面(touch-sensitive surface)的显示器。所述具有触敏表面的显示器包括触敏表面和显示屏;可以显示屏幕界面、也可以接收触摸动作。
触敏显示单元130提供设备与用户之间的输入接口和输出接口。所述触敏显示单元130可收集用户在其上或附近的触摸操作,例如用户使用手指202、关节400、触笔等任何适合的物体在触敏显示单元130上或在触敏显示单元130附近的操作。触敏显示单元130可以检测触摸信息,并将所述触摸信息发送给所述处理器190。所述触摸信息可以包括触摸动作、触敏表面的网格电容值、触点坐标。所述触敏显示单元能接收所述处理器190发来的命令并加以执行。触敏显示单元130显示视觉输出。视觉输出可包括图形、文本、标识、视频及它们的任何组合(统称为“图形”)。在一些实施例中,一些视觉输出或全部的视觉输出可对应于显示界面对象。
触敏显示单元130可使用LCD(液晶显示器)技术、LPD(发光聚合物显示器)技术、或LED(发光二极管)技术,但是在其他实施例中可使用其他显示技术。触敏显示单元130可以利用现在已知的或以后将开发出的多种触摸感测技术中的任何技术,以及其他接近传感器阵列或用于确定与所述触敏显示单元130接触的一个或多个点的其他元件来检测接触及其任何移动或中断。该多种触摸感测技术包括但不限于电容性的、电阻性的、红外线的、和表面声波技术。在一示例性实施例中,使用投射式互电容感测技术。
用户可以利用任何合适的物体或附加物诸如触笔、手指、关节等与触敏显示单元130接触。在一些实施例中,显示界面被设计为基于关节的接触和手势一起工作。在另一些实施例中,显示界面被设计为基于手指触摸和手势一起工作。还有一些实施例中,显示界面被设计为基于压力触摸和手势一起工作。在一些实施例中,设备将触摸的粗略输入翻译为精确的指针/光标位置或命令,以执行用户所期望的动作。
在一些实施例中,除了触敏显示单元之外,设备100可包括用于激活或 解除激活特定功能的触控板(未示出)。在一些实施例中,触控板是设备的触敏区域。触控板与触敏显示单元不同。触控板不显示视觉输出。触控板可以是与触敏显示单元130分开的触敏表面,或者是由触敏显示单元形成的触敏表面的延伸部分。
所述加速度传感器151可获取各个方向上(一般为三轴)加速度的大小。同时,所述加速度传感器151还可用于检测终端静止时重力的大小及方向。可用于识别手机姿态的应用(比如横竖屏切换、相关游戏、磁力计姿态校准),振动识别相关功能(比如计步器、敲击)等。在本发明实施例中,所述加速度传感器151用于获取用户的触摸动作接触触敏显示单元在Z轴方向的重力加速度信号。
压力传感器196可检测是否有向电子设备100施加了压力。并可以确定施加于电子设备100上的压力的量值。并将所检测到的压力值传递给处理器190。可以将压力传感器193安装在电子设备100中需要检测压力的部分中。如果将压力传感器196安装显示模块中,则可以基于由压力传感器196输出的信号区分触摸输入和压力触摸动作。由压力传感器196输出的信号还可以指示施加于显示器上的压力。示例性地,假设触摸输入作用于显示器上的压力的量值为1,将作用于显示器上的大于2的压力识别为压力触摸动作。
电子设备100还可以包括一个或多个接近光传感器152,用于当所述电子设备100距用户较近时(例如当用户正在打电话时靠近耳朵)关闭并禁用触敏表面的触摸功能,以避免用户对触敏显示单元的误操作。电子设备100还可以包括一个或多个环境光传感器153,用于当电子设备100位于用户口袋里或其他黑暗区域时保持触敏显示单元关闭,以防止电子设备100在锁定状态时消耗不必要的电池功耗或被误操作。在一些实施例中,接近光传感器和环境光传感器可以集成在一颗部件中,也可以作为两个独立的部件。至于电子设备100还可配置陀螺仪、气压计、湿度计、温度计、红外线传感器等其他传感器,在此不再赘述。虽然图1示出了接近光传感器和环境光传感器,但是可以理解的是,其并不属于电子设备100的必须构成,完全可以根据需要在不改变发明的本质的范围内而省略。
所述存储器120可用于存储指令和数据。存储器120可主要包括存储指令区和存储数据区。存储数据区可以存储预设轨迹信息,预设分界线的位置、形状以及颜色等特征,应用程序之间的相关关系等。存储指令区可存储操作系统、至少一个功能所需要的指令等。所述指令可使处理器190执行以下方法,具体方法包括:当检测到作用于所述触敏表面的触摸手势,且判断所述触摸手势为关节触摸手势时,响应所述关节触摸手势,将所述显示器的显示界面划分为至少两个显示区域。
处理器190是电子设备100的控制中心。利用各种接口和线路连接整个手机的各个部分,通过运行或执行存储在存储器120内的指令以及调用存储在存储器120内的数据,执行电子设备100的各种功能和处理数据,从而对手机进行整体监控。可选的,处理器190可包括一个或多个处理单元。优选的,处理器190可集成应用处理器和调制解调处理器。其中,应用处理器主要处理操作系统、显示界面和应用程序等,调制解调处理器主要处理无线通信。可以理解的是,上述调制解调处理器也可以不集成到处理器190中。在一些实施例中,处理器、存储器、可以在单一芯片上实现。在一些实施例中,他们也可以在独立的芯片上分别实现。在本发明实施例中,处理器190还用于调用存储器中的指令以实现当检测到作用于所述触敏表面的触摸手势,且判断所述触摸手势为关节触摸手势时,响应所述关节触摸手势,将所述显示器的显示界面划分为至少两个显示区域。
所述射频单元110可用于收发信息或通话过程中信号的接收和发送。特别地,将基站的下行信息接收后,传递给处理器190处理。另外,将设计上行的数据发送给基站。通常,RF电路包括但不限于天线、至少一个放大器、收发信机、耦合器、低噪声放大器(Low Noise Amplifier,LNA)、双工器等。此外,射频单元110还可以通过无线通信与网络设备和其他设备通信。所述无线通信可以使用任一通信标准或协议,包括但不限于全球移动通讯系统(Global System of Mobile communication,GSM)、通用分组无线服务(General Packet Radio Service,GPRS)、码分多址(Code Division Multiple Access,CDMA)、宽带码分多址(Wideband Code Division Multiple Access, WCDMA)、长期演进(Long Term Evolution,LTE)、电子邮件、短消息服务(Short Messaging Service,SMS)等。
音频电路160、扬声器161、麦克风162可提供用户与电子设备100之间的音频接口。音频电路160可将接收到的音频数据转换后的电信号,传输到扬声器161,由扬声器161转换为声音信号输出。另一方面,麦克风162将收集的声音信号转换为电信号。由音频电路160接收后转换为音频数据。再将音频数据输出处理器190处理后,经射频单元110以发送给比如另一终端,或者将音频数据输出至存储器120以便进一步处理。音频电路也可以包括耳机插孔163,用于提供音频电路和耳机之间的连接接口。
WiFi属于短距离无线传输技术,电子设备100通过WiFi模块170可以帮助用户收发电子邮件、浏览网页和访问流式媒体等,它为用户提供了无线的宽带互联网访问。虽然图1示出了WiFi模块170,但是可以理解的是,其并不属于电子设备100的必须构成,完全可以根据需要在不改变发明的本质的范围内而省略。
蓝牙是一种短距离无线通讯技术。利用蓝牙技术,能够有效地简化掌上电脑、笔记本电脑和手机等移动通信终端设备之间的通信。也能够成功地简化以上这些设备与因特网(Internet)之间的通信。电子设备100通过蓝牙模块180使电子设备100与因特网之间的数据传输变得更加迅速高效,为无线通信拓宽道路。蓝牙技术是能够实现语音和数据无线传输的开放性方案。虽然图1示出了WiFi模块170,但是可以理解的是,其并不属于电子设备100的必须构成,完全可以根据需要在不改变发明的本质的范围内而省略。
电子设备100还包括给各个部件供电的电源193(比如电池)。优选的,电源可以通过电源管理系统194与处理器190逻辑相连。从而通过电源管理系统194实现管理充电、放电、以及功耗管理等功能。
电子设备100还包括外部接口197,所述外部接口可以是标准的Micro USB接口,也可以使多针连接器。可以用于连接电子设备100与其他装置进行通信,也可以用于连接充电器为电子设备100充电。
尽管未示出,电子设备100还可以包括摄像头、闪光灯等,在此不再赘 述。
以下以电子设备100为例说明实现快速分屏的方法。
图2为本发明实施例提供的一种电子设备100的外观示意图,包括触敏显示单元130、开关按键133,音量控制按键132。图2中的各个物理按键的位置仅为示例,实际产品中物理按键的位置可以任意变化。在本实施例中,电子设备100还可以包括加速度传感器151、麦克风162、扬声器161、外部接口197、耳机插孔163、压力传感器196。图2示意了电子设备100的触敏显示单元显示界面200,如图2所示,显示界面200可以显示虚拟按键、电池电量,时间等信息。本领域技术人员可以理解的,图2所示的显示界面仅仅是一种示例性显示界面,还可以显示其他信息,本发明不做具体限定。图2以搭载了Android操作系统的电子设备为例,本发明还可以应用于搭载了iOS、Windows等其他操作系统的电子设备。
在本实施例中,所述触敏显示单元130可以接收用户的触摸输入。通过使用触敏显示单元130作为操作电子设备100的主输入或控制装置,可以减少电子设备100上的物理输入或控制装置的数量。如图2所示,通过触敏显示单元可呈现显示界面200。在本实施例中,触敏显示单元可以被称为“菜单按钮”。在一些其他实施例中,“菜单按钮”可以是物理按钮或其他物理输入或控制装置。所述压力传感器196可检测是否有向电子设备100施加了压力,并可以确定施加于电子设备100上的压力的量值。在一些实施例中,压力传感器196可以以叠层形式集成在显示器中,也可以是独立的器件。可以基于压力传感器196识别施加于显示器上的压力为手指触摸输入或压力触摸动作。所述加速度传感器151用于获取用户在触敏显示单元上的触摸动作在Z轴的重力加速度信号。通过压下并保持所述开关按键在被压下状态达预定时间间隔,可以实现打开或关闭电子设备100的电源。通过压下所述开关按键并在预定时间间隔之前释放,可以实现锁定电子设备100。在其他实施例中,还可以通过麦克风162接收用于激活一些功能的语音输入。
图3为本发明实施例提供的分屏后显示界面的示意图。分屏后显示界面200由分界线300、第一显示区域301和第二显示区域302组成。第一显示区 域301和第二显示区域302分别位于分界线300的两侧。本领域技术人员可以理解的,第一显示区域301和第二显示区域302可以是上下排布,也可以是左右排布。具体排布方式,本发明不做具体限定。
本发明实施例提供的分屏的方法可在一种便携式电子设备(例如,图1或图2中的电子设备100)上被执行。所述电子设备100包括触敏显示单元。所述触敏显示单元也被称为具有触敏表面的显示器。在一些实施例中,方法中的一些操作可以被组合,和/或一些操作的次序可以改变。
本发明实施例提供了一种实现快速分屏的方法。所述分屏可以是将显示器的显示界面划分为至少两个显示区域。该方法帮助用户通过较少的操作步骤,能够简单、快捷地对显示界面分屏。简化了分屏的操作步骤,改进了用户体验。
本发明实施例提供的实现分屏的方法可以包括以下步骤:
步骤1:当检测到作用于触敏表面的触摸手势,判断所述触摸手势是否与预设手势匹配;
步骤2:当判断所述触摸手势与预设手势匹配时,则响应所述触摸手势,将所述显示器的显示界面划分为至少两个显示区域。
本领域技术人员可以理解的,触摸手势可以由触摸动作组成。例如,点击手势由按下和抬起两个触摸动作组成;滑动手势由按下、移动和抬起三个触摸动作组成。当触敏显示单元接收到作用于触敏表面的触摸动作后,将触摸信息传递给处理器。所述触摸信息可以包括触点坐标、触敏表面的网格电容信息、触摸动作中的一种或多种信号。所述触摸动作可以包括按下、移动以及抬起等动作。作为一种实施方式,终端设备周期性检测是否有触摸动作作用于触敏表面。
所述预设手势可以为关节触摸手势、多点触摸手势或压力触摸手势。可以预先将预设手势保存在存储器120中。
当检测到作用于触敏表面的触摸动作时,将所检测到的触摸动作组成的 触摸手势与所述预设手势做对比,判断所述触摸手势是否与所述预设手势匹配(步骤1)。
判断所述触摸手势是否与预设手势匹配的一种实现方式,判断所述触摸手势是否是关节触摸手势。可以基于所述触摸信息以及触摸动作产生的Z轴方向加速度信号判断所述触摸动作是否是关节触摸动作。由关节触摸动作组成的手势为关节触摸手势。例如:关节点击手势由按下和抬起两个关节触摸动作组成;关节滑动手势由按下、移动和抬起三个关节触摸动作组成。在按下至抬起之间的移动轨迹为所述关节触摸手势的轨迹。
以下介绍判断所述触摸动作是否是关节触摸动作的实施方式:
当作用于所述触敏表面的触摸动作产生的触敏表面的网格电容值在第一预设电容值范围,非零电容值的网格个数小于预设值,且在预设时间内所述Z轴方向加速度信号在第一预设加速度范围时,可以判断所述触摸动作是关节触摸动作。当作用于所述触敏表面的触摸动作产生的触敏表面的触敏表面的网格电容值在第二预设电容值范围,非零电容值的网格个数大于或等于预设值,且在预设时间内所述Z轴方向加速度信号在第二预设加速度范围时,可以判断所述触摸动作是手指触摸动作。
例如,当作用于所述触敏表面的触摸动作产生的触敏表面的网格电容值指示最大电容值在第一预设电容值范围(比如小于或等于0.42pF),分布有非零电容值的网格个数小于7,在预设时间内,Z轴方向加速度信号在第一预设加速度范围(比如,在5ms内,加速度信号大于3g)时,可以判断该触摸动作为关节触摸动作。当作用于所述触敏表面的触摸动作产生的触敏表面的网格电容值指示最大电容值在第二预设电容值范围(比如大于0.42pF、小于或等于0.46pF),分布有非零电容值的网格个数大于或等于7,且在预设时间内,Z轴方向加速度信号在第二预设加速度范围(比如,在5ms内,加速度信号小于2g,g为重力加速度)时,可以判断该触摸动作为手指触摸动作。
可以理解的是,本发明实施例中的手指触摸动作并非一定由手指触发,也可以是其他物体触摸触敏显示单元130,只要满足上述手指触摸动作的判断 条件均可称为本发明实施例的手指触摸动作。本发明实施例中的关节触摸动作并非一定由手指关节触发,也可以是其他物体以很快的速度敲击或触摸触敏显示单元130,只要满足上述关节触摸动作的判断条件均可称为本发明实施例的关节触摸动作。由所述关节触摸动作组成的手势为关节触摸手势。
判断所述触摸手势是否与预设手势匹配的另一种实现方式,判断所述触摸手势是否是压力触摸手势。可以基于所述触摸信息和施加于触敏显示单元上的压力,判断所述触摸动作是否是压力触摸动作。由于触摸手势可以由触摸动作组成。在本发明实施例中,所述压力触摸手势可以由压力触摸按下动作后保持按下状态、手指触摸移动动作、触摸抬起动作组成;也可以由压力触摸按下动作后保持按下状态、压力触摸移动动作、触摸抬起动作组成。在按下至抬起之间的移动轨迹为所述压力触摸手势的轨迹。
当触摸动作施加于显示器上的压力值满足第一预设压力值范围时,可以判断所述触摸动作是压力触摸动作。当触摸动作施加于显示器上的压力值满足第二预设压力值范围时,可以判断所述触摸动作是手指触摸动作。
本实施例中,压力值,既可表示压力的数值;也可表示压力的量值。
示例性的,假设手指触摸动作施加于显示器上的压力的量值为1,将施加于显示器上的压力量值大于等于2的触摸动作识别为压力触摸动作。
例如,手指触摸动作施加于显示器上的压力值为70g左右。施加于显示器上的压力值大于140g的触摸动作为压力触摸动作。可以理解的是,本发明实施例中的压力值以及压力的大小关系只是本发明实施例的一种实现,压力值以及压力的大小关系可以根据设计的需要进行适当的调整。
判断所述触摸手势是否与预设手势匹配的又一种实现方式,可以基于所述触摸信息判断所述触摸手势是否是多点触摸手势。基于所述触摸信息识别同时或先后接触触敏表面的一个或多个触点,进而判断所述触摸手势是否是多点触摸手势。本领域技术人员可以理解的,多指触摸可以为双指、三指、四指、五指甚至更多手指同时或先后接触触敏表面。多点触摸手势的轨迹可 以为同时或先后与触敏表面接触的多个触点的中心点在触敏表面上的移动轨迹。
可选的,当判断所述触摸手势与预设手势匹配后,判断所述触摸手势的移动距离是否大于预设距离阈值,和/或所述触摸手势的轨迹是否与预设轨迹匹配。例如,预设距离阈值可以为显示器宽度的1/2。所述预设轨迹可以是直线或近似于直线的轨迹,也可以是任何形式的电子设备能够检测到的曲线或近似于曲线的轨迹,例如S型、Z型、X型等。
可选的,当判断所述触摸手势与预设手势匹配后,判断所述触摸手势的起始位置是否临近触敏显示单元的第一边缘的预定区域,和/或所述触摸手势的终止位置是否在触敏显示单元以外或临近触敏显示单元第二边缘的预定区域。第一边缘和第二边缘可以是两个相对设置的不同边缘,分别位于触敏显示单元的两侧,例如第一边缘为触敏显示单元的左边缘,第二边缘为触敏显示单元的右边缘。或第一边缘为触敏显示单元的上边缘,第二边缘为触敏显示单元的下边缘。所述触摸手势的终止位置在屏幕以外是指触摸手势的轨迹朝向触敏显示单元有效区域以外的方向滑动。
可选的,当判断所述触摸手势与预设手势匹配后,判断所述触摸手势的轨迹与水平轴向或垂直轴向所成夹角的角度是否小于预设角度。
当判断所述触摸手势与预设手势匹配后,则响应所述触摸手势,将所述显示器的显示界面划分为至少两个显示区域(步骤2)。
将显示器的显示界面划分为至少两个显示区域,其中至少一个显示区域显示分屏前在显示界面所运行的第一应用程序的运行界面。可以理解为缩小了第一应用程序的运行界面的尺寸。
为便于说明,本发明实施例以将显示器的显示界面划分为两个显示区域为例。所述两个显示区域可以是左右排布,也可以是上下排布的。本领域技术人员可以理解的,可以根据本发明实施例的分屏方法将显示界面划分为三个、四个或更多个显示区域。此处不一一列举。
所述将显示界面划分为两个显示区域的一种实现方式,可以根据预设分界线将显示界面划分为第一显示区域和第二显示区域。可以理解为按预定尺寸缩小了分屏前在显示界面所运行的第一应用程序的运行界面的尺寸。所述预设分界线的位置、形状以及颜色等特征可以为预先设置并存储在存储器120中。所述预设分界线优选将显示器的显示界面均匀划分为两个显示区域。所述预设分界线可以是一条水平分界线,将显示界面均匀划分为上下两个面积相同的显示区域。分界线以上的显示区域为第一显示区域,分界线以下的显示区域为第二显示区域。所述预设分界线也可以是一条垂直分界线,将显示界面均匀划分为左右两个面积相同的显示区域。本领域技术人员可以理解的,所述预设分界线的位置可以位于显示界面的任意位置,本发明不做具体限定。
将显示界面划分为两个显示区域的另一种实现方式,可以根据所述触摸手势的轨迹将显示界面划分为第一显示区域和第二显示区域。所述实现方式包括以触摸手势的轨迹作为分界线,将显示器的显示界面划分为第一显示区域和第二显示区域。所述触摸手势的轨迹可以是直线或近似于直线的手势轨迹,也可以是任何形式的电子设备能够检测到的曲线或近似于曲线的手势轨迹,例如S型、Z型、X型等。所述触摸手势的轨迹优选为直线。触摸手势移动的方向可以是水平方向、垂直方向或者任何形式的电子设备能够检测到的方向。当所述触摸手势的轨迹为水平方向的直线时,则分界线为水平方向的直线。分界线以上的显示区域为第一显示区域,分界线以下的显示区域为第二显示区域。当所述触摸手势的轨迹为垂直方向的直线时,则分界线为垂直方向的直线。分界线以左的显示区域为第一显示区域,分界线以右的显示区域为第二显示区域。
将显示界面划分为两个显示区域的又一种实现方式,可以根据所述触摸手势的方向和所述触摸手势起始位置的坐标执行分屏操作。根据所述触摸手势的方向和所述触摸手势起始位置的坐标将显示界面划分为第一显示区域和第二显示区域。所述实现方式包括以所述触摸手势移动的方向为所述分界线的方向,根据所述触摸手势的起始位置的坐标确定所述分界线的坐标。例如: 当所述触摸手势移动的方向为水平方向时,则分界线为水平方向;所述分界线的y轴坐标值为触摸手势的起始位置的y轴坐标值。所述分界线以上的显示区域为第一显示区域,分界线以下的显示区域为第二显示区域。当所述触摸手势移动的方向为垂直方向时,则分界线为垂直方向;所述分界线的x轴坐标值为触摸手势的起始位置的x轴坐标值。分界线以左的显示区域为第一显示区域,分界线以右的显示区域为第二显示区域。
图4-图7为根据关节触摸手势分屏的示例性显示界面。本领域技术人员可以理解的,图4-图7所示的显示界面也可以根据多指触摸手势和压力触摸手势实现。如图4-图7所示,以所述关节触摸手势的轨迹为分界线。其中,所述分界线也可以是预设分界线,也可以根据所述触摸手势的方向和所述触摸手势起始位置的坐标确定分界线。其中,图示箭头方向为关节触摸手势移动的方向。虚线为所述关节触摸手势移动轨迹。分屏前在显示界面上显示第一应用程序的运行界面。分屏后显示界面的一种表现形式,将显示界面划分为两个显示区域。所述两个显示区域包括第一显示区域和第二显示区域。第一显示区域可以显示分屏前所运行的第一应用程序的运行界面。可以理解为缩小了分屏前第一应用程序的运行界面的尺寸。并在缩小的第一应用程序的运行界面以外的显示区域生成第二显示区域。
分屏后显示界面的一种实施方式,第二显示区域可以显示与所述第一应用程序相关的一个或多个应用程序的标识供用户选择。当接收到用户选中第二显示区域所显示的第二应用程序的标识时,执行所述第二应用程序,并在第二显示区域显示第二应用程序的运行界面。各个应用程序之间的相关关系可以预先设置并存储在存储器120中。
本发明实施例中,所述应用程序的标识,可以为应用程序的图标。
图4为本发明实施例提供的一种根据触摸手势分屏的示例性显示界面。如图4所示,假设预先设置了“电子书”、“备忘录”、“词典”、“文本编辑”为相关应用,并将各应用程序之间的相关关系保存在存储器120中。分屏前在显示界面上显示“电子书”的运行界面。分屏后,第一显示区域显 示“电子书”的运行界面,第二显示区域可以显示“备忘录”、“词典”、“文本编辑”等与“电子书”相关的应用标识。本领域技术人员可以理解的,第一显示区域的显示界面缩小了分屏前“电子书”的运行界面的尺寸。可选地,在第一显示区域的右侧显示滚动条400。本领域技术人员还可以理解的,当接收到用户选中“备忘录”、“词典”、“文本编辑”任意标识时,执行所述被选中的标识所对应的应用程序,在第二显示区域显示被选中的标识所对应的应用程序的运行界面。
分屏后显示界面的另一种实施方式,如果只存在一个第二应用程序与第一应用程序相关,可以在分屏后的第一显示区域显示第一应用程序的运行界面,第二显示区域显示第二应用程序的运行界面。即:分屏后自动运行第二应用程序并显示于第二显示区域,节省了用户手动选择第二应用程序的步骤。
图5为本发明实施例提供的另一种根据触摸手势分屏的示例性显示界面。如图5所示,假设预先设置了“电子书”与“备忘录”为相关应用。分屏前在显示界面上显示“电子书”的运行界面。分屏后,第一显示区域显示“电子书”的运行界面,第二显示区域显示“备忘录”的运行界面。方便用户在“备忘录”运行界面对“电子书”做学习笔记等相关操作。
分屏后显示界面的又一种实施方式,第一显示区域显示第一应用程序的运行界面。第二显示区域可以显示主菜单界面,如Home界面。所述主菜单界面显示应用程序标识供用户选择,当检测到用户选中第二显示区域所显示的第二应用程序的标识后,执行第二应用程序,在第二显示区域显示第二应用程序的运行界面。
图6为本发明实施例提供的又一种根据触摸手势分屏的示例性显示界面。如图6所示,分屏前在显示界面上显示“电子书”的运行界面。分屏后,第一显示区域显示“电子书”的运行界面,第二显示区域显示主菜单界面,所述主菜单界面包括应用程序标识供用户选择,例如:“camera”、“信息”、“视频”等应用程序的标识。可以理解的,当检测到用户选中任意应用程序 的标识后,执行所述被选中的应用程序,在第二显示区域显示被选中的应用程序的运行界面。
分屏后显示界面的再一种实施方式,第二显示区域也可以显示用户最近执行过的应用程序的标识(以下称为“历史程序标识”)供用户选择。当检测用户选中历史程序标识后,执行被选中的应用程序,在第二显示区域显示被选中的应用程序的运行界面。本领域技术人员可以理解的,第二显示区域也可以显示历史程序的运行界面的缩略图。
图7为本发明实施例提供的再一种根据触摸手势分屏的示例性显示界面。如图7所示,分屏前在显示界面上显示“电子书”的运行界面。假设用户在分屏前执行过“信息”、“camera”、“相册”、“电话”应用程序。分屏后,第一显示区域显示“电子书”的运行界面,第二显示区域显示“信息”、“camera”、“相册”、“电话”应用程序的标识供用户选择。可以理解的,当接收到用户选中任意应用标识后,执行所述被选中的标识对应的应用程序,在第二显示区域显示被选中的标识对应的应用程序的运行界面。
可选的,当检测到将第二显示区域所显示的标识移动到第一显示区域的操作指令时,在所述第一显示区域显示所移动的标识或所移动的标识对应的应用程序的运行界面。例如,当第一显示区域显示“电子书”的运行界面,第二显示区域显示“camera”、“信息”、“视频”等应用程序的标识,且检测到将“camera”标识移动到第一显示区域的操作指令时,在第一显示区域显示“camera”的运行界面,替换“电子书”的运行界面。当第一显示区域显示主菜单界面,第二显示区域显示“camera”、“信息”、“视频”等应用程序的标识,且检测到将“camera”标识移动到第一显示区域的操作指令时,将“camera”标识移动到主菜单界面。
可选的,当检测到将第二显示区域所显示的内容移动到第一显示区域的操作指令时,将所移动的内容嵌入第一应用程序的运行界面中。例如,当第一显示区域显示“文本编辑”的运行界面,且检测到将第二显示区域显示的 “视频”等应用程序的标识移动到第一显示区域的操作指令时,在“文本编辑”界面嵌入“视频”的运行界面或标识。当第一显示区域显示“文本编辑”的运行界面,且检测到将第二显示区域显示的图片移动到第一显示区域的操作指令时,在“文本编辑”界面嵌入第二显示区域所显示的图片。
进一步地,分界线的位置可以在分屏后进行调整。当改变分界线在触敏显示单元上的位置后,分屏后显示区域的尺寸和/或位置随之发生改变。当检测到长按分界线超过预定时间后,分界线以高亮方式提示用户调整分界线的位置。用户可以通过触摸触敏显示单元并移动分界线以调整分界线的位置。
图8为本发明实施例提供的根据触摸手势调整分界线位置的示例性显示界面。如图8所示,当检测到长按分界线超过预定时间,且触摸并移动分界线,进而调整分界线位置。可以理解的,当调整分界线位置的触摸手势是由第一显示区域移向第二显示区域时,第一显示区域的显示界面尺寸增大,同时缩小了第二显示区域的显示界面的尺寸。
进一步地,当显示界面被划分为第一显示区域和第二显示区域,且检测到作用于两个显示区域的触摸手势且所述触摸手势的轨迹与预设图形轨迹匹配后,置换所述第一显示区域和所述第二显示区域的位置或显示内容。所述预设图形轨迹可以是顺时针或逆时针的弧形旋转轨迹或其他任何形式的电子设备能够检测到的轨迹。
图9为本发明实施例提供的根据触摸手势置换分屏显示界面的示例性显示界面。如图9所示,当检测到作用于第一显示区域和第二显示区域的触摸手势,所述触摸手势的轨迹为与预设图形轨迹匹配的逆时针弧形旋转轨迹,置换所述第一显示区域和所述第二显示区域的显示内容。
进一步地,当显示界面被划分为第一显示区域和第二显示区域后,检测到退出分屏界面手势,第一显示区域和第二显示区域合并成一个显示区域,从而退出分屏界面。所述退出分屏界面,可以是将第二显示区域的第二应用 程序转为后台运行。当第二应用程序转为后台运行后,如果用户恢复分屏操作,可以将第二应用程序的运行界面显示在分屏后的分屏区域中。所述退出分屏界面手势的一种实施方式,可以是从第一显示区域朝向第二显示区域滑动的触摸手势。所述退出分屏界面的触摸手势的起始位置可以在第一显示区域;且所述退出分屏界面的触摸手势的终止位置可以在第二显示区域。所述退出分屏界面手势的另一种实施方式,也可以是两根或两根以上手指同时或先后接触触敏显示单元,接触之后手指之间的距离逐渐缩小或增大。所述退出分屏界面的具体实现方式,可以根据具体设计需求做适应性调整,本发明不做具体限定。
当退出分屏界面,第一显示区域和第二显示区域合并成一个显示区域之后,检测到恢复分屏触摸手势作用于合并后的显示区域时,将合并后的显示区域拆分为合并前的分屏区域,以恢复分屏界面。所述恢复分屏触摸手势可以是作用于合并后的显示区域的触摸手势。所述恢复分屏触摸手势可以是与预设手势匹配的关节触摸手势、多指触摸手指、压力触摸手势,也可以是根据设计需要所设置的其他手势,本发明不做具体限定。
进一步地,当检测到与预设手势匹配的触摸手势作用于分屏界面后,可以将分屏界面划分为更多显示区域,如新增第三显示区域。所述分屏界面包括第一显示区域和第二显示区域。所述第三显示区域可以显示主菜单界面,也可以显示与第一显示区域和/或第二显示区域所显示的应用程序相关的应用程序的标识或运行界面,本发明不做具体限定。
图10为根据触摸手势将分屏界面划分为更多显示区域的示例性显示界面。如图10所示,所述分屏界面包括第一显示区域和第二显示区域。当接收到与预设手势匹配的关节触摸手势作用于第二显示区域,响应所述关节触摸手势,以所述关节触摸手势的轨迹作为分界线,将所述第二显示区域划分为两个显示区域,分界线以上显示区域显示原第二显示区域的显示界面,可以理解为缩小了原第二显示区域的显示界面的尺寸。分界线以下显示区域为第三显示区域,可以显示主菜单界面。
本发明实施例还提供了一种实现快速分屏的电子设备。
所述电子设备包括:触敏显示单元130、存储器120、加速度传感器151、处理器190。
所述触敏显示单元130可以为具有触敏表面的显示器,所述触敏显示单元130包括触敏表面和显示屏。所述触敏显示单元130用于呈现显示界面,还用于接收作用于触敏表面的触摸信息,并将触摸信息传递给处理器190;所述触摸信息可以包括触点坐标、触敏表面的网格电容值、触摸动作中的一种或多种信号;所述触摸动作可以包括按下、移动以及抬起等动作。
所述存储器120存储指令。
所述加速度传感器用于获取Z轴方向的加速度信号并将获取到的Z轴方向的加速度信号传递给处理器190。
所述处理器190调用存储在所述存储器120中的指令以实现上述方法中相应的功能。例如当检测到作用于触敏表面的触摸手势时,判断所述触摸手势是否与预设手势匹配;当判断所述触摸手势与预设手势匹配时,则响应所述触摸手势,将所述显示器的显示界面划分为至少两个显示区域。和/或与本发明所描述的其他方法所对应的功能。
在一些实施例,所述电子设备还包括压力传感器196,所述压力传感器196用于检测施加于所述电子设备的压力并将所检测到的压力值传递给处理器190。
基于同一发明构思,由于该电子设备解决问题的原理与本发明方法实施例中的分屏方法相似,因此该电子设备的实施可以参见方法的实施,重复之处不再赘述。
本发明实施例还提供了一种实现快速分屏的装置。
所述装置包括检测单元、判断单元、分屏单元。
所述检测单元用于检测作用于触敏表面的触摸手势。
所述判断单元用于判断所述触摸手势是否与预设手势匹配。
所述分屏单元用于当所述触摸手势与预设手势匹配时,将所述显示器的显示界面划分为至少两个显示区域。
基于同一发明构思,由于该装置解决问题的原理与本发明方法实施例中的分屏方法相似,因此该装置的实施可以参见方法的实施,重复之处不再赘述。
本发明实施例的快速分屏的方法、电子设备以及装置所采用的技术方案公开了通过触摸手势将显示器的显示界面划分为至少两个显示区域。在用户在使用具有触敏显示单元的电子设备的过程中,当用户希望执行分屏操作时,只需要在触敏显示单元上执行预设触摸手势即可触发分屏功能。相应地,电子设备可以根据所识别的与预设手势匹配的触摸手势将电子设备的显示界面划分为至少两个显示区域。与现有技术比,本发明可以使用户通过触摸手势更便捷的实现分屏功能,并且可以在所提供的分屏区域分别呈现不同的显示界面。因此,本发明提供的分屏操作的技术方案能够简便快捷的执行分屏操作,简化了分屏的操作步骤,进而改进了用户体验。
需要说明的是,在实际应用中,所述分屏方案可以采用上述方案中的任意方案组合,本发明实施例对此不进行具体限定。
在本申请所提供的实施例中,本领域普通技术人员可以理解实现上述实施例的全部或部分步骤仅仅是示意性的,可以通过硬件来完成,也可以通过程序来指令相关的硬件完成,当通过程序指令相关的硬件完成时,所述的程序可以存储在一个非易失性(non-transitory)计算机可读取存储介质中。基于这样的理解,本发明的技术方案本质上或者说对现有技术做出贡献的部分或者该技术方案的全部或部分可以以软件产品的形式体现出来,该计算机软件产品存储在一个存储介质中,包括若干指令用以使得一台计算机设备(可以是个人计算机,服务器,或者网络设备等)执行本发明各个实施例所述方法的全部或部分步骤。而前述的存储介质包括:U盘、移动硬盘、只读存储器(ROM,Read-Only Memory)、随机存取存储器(RAM,Random Access Memory)、磁 碟或者光盘等各种可以存储程序代码的介质。
以上所述,以上实施例仅用以说明本发明的技术方案,而非对其限制;尽管参照前述实施例对本发明进行了详细的说明,本领域的普通技术人员应当理解:其依然可以对前述各实施例所记载的技术方案进行修改,或者对其中部分技术特征进行等同替换;而这些修改或者替换,并不使相应技术方案的本质脱离本发明各实施例技术方案的精神和范围。

Claims (23)

  1. 一种快速分屏的方法,应用于一种便携式电子设备上,所述电子设备包括具有触敏表面的显示器,其特征在于,所述方法包括:
    当检测到作用于所述触敏表面的关节触摸手势时,响应所述关节触摸手势,将所述显示器的显示界面划分为至少两个显示区域。
  2. 如权利要求1所述的方法,其特征在于,所述关节触摸手势由关节触摸动作组成;当作用于所述触敏表面的触摸动作产生的触敏表面网格电容值在第一预设电容值范围,非零电容值的网格个数小于预设值,且Z轴方向加速度信号在第一预设加速度范围内时,则所述触摸动作是所述关节触摸动作,由所述关节触摸动作组成的手势是所述关节触摸手势。
  3. 如权利要求1所述的方法,其特征在于,所述当检测到作用于所述触敏表面的关节触摸手势后,判断所述关节触摸手势的移动距离是否大于预设距离阈值。
  4. 如权利要求1所述的方法,其特征在于,所述当检测到作用于所述触敏表面的关节触摸手势后,判断所述关节触摸手势的轨迹是否与预设轨迹匹配。
  5. 如权利要求1所述的方法,其特征在于,所述当检测到作用于所述触敏表面的关节触摸手势后,判断所述关节触摸手势的轨迹与水平轴向或垂直轴向所成夹角的角度是否小于预设角度。
  6. 如权利要求1所述的方法,其特征在于,所述将所述显示器的显示界面划分为至少两个显示区域,包括:
    根据预设分界线将所述显示器的显示界面划分为至少两个显示区域。
  7. 如权利要求1所述的方法,其特征在于,所述将所述显示器的显示界面划分为至少两个显示区域,包括:
    根据所述关节触摸手势的轨迹将所述显示器的显示界面划分为至少两个显示区域。
  8. 如权利要求1所述的方法,其特征在于,所述将所述显示器的显示界面划分为至少两个显示区域,包括:
    根据所述关节触摸手势的方向和所述关节触摸手势起始位置的坐标将所述显示器的显示界面划分为至少两个显示区域。
  9. 如权利要求1-8中任一项所述的方法,其特征在于,所述将所述显示器的显示界面划分为至少两个显示区域,包括:
    当所述显示器的显示界面显示第一应用程序的运行界面时,缩小所述第一应用程序的运行界面的尺寸,在第一显示区域显示缩小的第一应用程序的运行界面;并在第一显示区域以外的显示区域生成至少一个显示区域,在所生成的显示区域显示与所述第一应用程序相关的一个或多个应用程序的标识;或
    当所述显示器的显示界面显示第一应用程序的运行界面时,缩小所述第一应用程序的运行界面的尺寸,在第一显示区域显示缩小的第一应用程序的运行界面;并在第一显示区域以外的显示区域生成至少一个显示区域,在所生成的显示区域显示与所述第一应用程序相关的应用程序的运行界面;或
    当所述显示器的显示界面显示第一应用程序的运行界面时,缩小所述第一应用程序的运行界面的尺寸,在第一显示区域显示缩小的第一应用程序的运行界面;并在第一显示区域以外的显示区域生成至少一个显示区域,在所生成的显示区域显示主菜单界面;或
    当所述显示器的显示界面显示第一应用程序的运行界面时,缩小所述第一应用程序的运行界面的尺寸,在第一显示区域显示缩小的第一应用程序的运行界面;并在第一显示区域以外的显示区域生成至少一个显示区域,在所生成的显示区域显示历史程序标识;或
    当所述显示器的显示界面显示第一应用程序的运行界面时,缩小所述第一应用程序的运行界面的尺寸,在第一显示区域显示缩小的第一应用程序的运行界面;并在第一显示区域以外的显示区域生成至少一个显示区域,在所生成的显示区域显示历史程序的运行界面的缩略图。
  10. 如权利要求9所述的方法,其特征在于,当检测到将第一显示区域以外的显示区域所显示的标识移动到第一显示区域的操作指令时,在所述第一显示区域显示所述标识对应的应用程序的运行界面;或
    当检测到将第一显示区域以外的显示区域所显示的标识移动到第一显示区域的操作指令时,在所述第一显示区域显示所述标识;或
    当检测到将第一显示区域以外的显示区域所显示的标识移动到第一显示区域的操作指令时,将所述标识嵌入所述缩小的第一应用程序的运行界面;或
    当检测到将第一显示区域以外的显示区域所显示的内容移动到第一显示区域的操作指令时,将移动的内容嵌入所述缩小的第一应用程序的运行界面。
  11. 一种快速分屏的便携式电子设备,其特征在于,所述便携式电子设备包括显示器、存储器、加速度传感器和处理器;
    所述显示器具有触敏表面;所述触敏表面用于接收触摸手势;所述显示器还用于显示界面;
    所述存储器用于存储指令;
    所述加速度传感器用于获取Z轴方向的加速度信号并将所获取到的加速度信号传递给处理器;
    所述处理器调用存储在所述存储器中的指令以实现:
    当检测到作用于所述触敏表面的关节触摸手势时,响应所述关节触摸手
    势,将所述显示器的显示界面划分为至少两个显示区域。
  12. 如权利要求11所述的电子设备,其特征在于,所述关节触摸手势由关节触摸动作组成;当作用于所述触敏表面的触摸动作产生的触敏表面网格电容值在第一预设电容值范围,非零电容值的网格个数小于预设值,且Z轴方向加速度信号在第一预设加速度范围内时,则所述触摸动作是所述关节触摸动作,由所述关节触摸动作组成的手势是所述关节触摸手势。
  13. 如权利要求11所述的电子设备,其特征在于,所述当检测到作用于所述触敏表面的关节触摸手势后,判断所述关节触摸手势的移动距离是否大于预设距离阈值。
  14. 如权利要求11所述的电子设备,其特征在于,所述当检测到作用于所述触敏表面的关节触摸手势后,判断所述关节触摸手势的轨迹是否与预设轨迹匹配。
  15. 如权利要求11所述的电子设备,其特征在于,所述当检测到作用于所述触敏表面的关节触摸手势后,判断所述关节触摸手势的轨迹与水平轴向或垂直轴向所成夹角的角度是否小于预设角度。
  16. 如权利要求11所述的电子设备,其特征在于,所述将所述显示器的显示界面划分为至少两个显示区域,包括:
    根据预设分界线将所述显示器的显示界面划分为至少两个显示区域。
  17. 如权利要求11所述的电子设备,其特征在于,所述将所述显示器的显示界面划分为至少两个显示区域,包括:
    根据所述关节触摸手势的轨迹将所述显示器的显示界面划分为至少两个显示区域。
  18. 如权利要求11所述的电子设备,其特征在于,所述将所述显示器的显示界面划分为至少两个显示区域,包括:
    根据所述关节触摸手势的方向和所述关节触摸手势起始位置的坐标将所述显示器的显示界面划分为至少两个显示区域。
  19. 如权利要求11-18中任一项所述的电子设备,其特征在于,所述将所述显示器的显示界面划分为至少两个显示区域,包括:
    当所述显示器的显示界面显示第一应用程序的运行界面时,缩小所述第一应用程序的运行界面的尺寸,在第一显示区域显示缩小的第一应用程序的运行界面;并在第一显示区域以外的显示区域生成至少一个显示区域,在所生成的显示区域显示与所述第一应用程序相关的一个或多个应用程序的标识;或
    当所述显示器的显示界面显示第一应用程序的运行界面时,缩小所述第一应用程序的运行界面的尺寸,在第一显示区域显示缩小的第一应用程序的运行界面;并在第一显示区域以外的显示区域生成至少一个显示区域,在所生成的显示区域显示与所述第一应用程序相关的应用程序的运行界面;或
    当所述显示器的显示界面显示第一应用程序的运行界面时,缩小所述第一应用程序的运行界面的尺寸,在第一显示区域显示缩小的第一应用程序的运行界面;并在第一显示区域以外的显示区域生成至少一个显示区域,在所 生成的显示区域显示主菜单界面;或
    当所述显示器的显示界面显示第一应用程序的运行界面时,缩小所述第一应用程序的运行界面的尺寸,在第一显示区域显示缩小的第一应用程序的运行界面;并在第一显示区域以外的显示区域生成至少一个显示区域,在所生成的显示区域显示历史程序标识;或
    当所述显示器的显示界面显示第一应用程序的运行界面时,缩小所述第一应用程序的运行界面的尺寸,在第一显示区域显示缩小的第一应用程序的运行界面;并在第一显示区域以外的显示区域生成至少一个显示区域,在所生成的显示区域显示历史程序的运行界面的缩略图。
  20. 如权利要求19所述的电子设备,其特征在于,当检测到将第一显示区域以外的显示区域所显示的标识移动到第一显示区域的操作指令时,在所述第一显示区域显示所述标识对应的应用程序的运行界面;或
    当检测到将第一显示区域以外的显示区域所显示的标识移动到第一显示区域的操作指令时,在所述第一显示区域显示所述标识;或
    当检测到将第一显示区域以外的显示区域所显示的标识移动到第一显示区域的操作指令时,将所述标识嵌入所述缩小的第一应用程序的运行界面;或
    当检测到将第一显示区域以外的显示区域所显示的内容移动到第一显示区域的操作指令时,将移动的内容嵌入所述缩小的第一应用程序的运行界面。
  21. 一种快速分屏的装置,其特征在于,所述装置包括:检测单元、分屏单元;
    所述检测单元用于检测作用于触敏表面的关节触摸手势;
    所述分屏单元用于响应所述关节触摸手势,将所述显示器的显示界面划分为至少两个显示区域。
  22. 一种便携式电子设备的显示界面,其特征在于,所述便携式电子设备包括显示器、存储器、加速度传感器以及用于执行存储在所述存储器中的指令的处理器,其中,所述显示器具有触敏表面:
    当检测到作用于所述触敏表面的关节触摸手势时,响应所述关节触摸手 势,将所述显示器的显示界面划分为至少两个显示区域,在所述便携式电子设备的显示界面显示所述至少两个显示区域。
  23. 一种存储一个或多个程序的非易失性计算机可读存储介质,其特征在于,所述一个或多个程序包括指令,所述指令当被包括具有触敏表面的显示器的便携式电子设备执行时使所述便携式电子设备执行以下事件:
    当检测到作用于所述触敏表面的关节触摸手势时,响应所述关节触摸手势,将所述显示器的显示界面划分为至少两个显示区域。
PCT/CN2015/095564 2015-11-25 2015-11-25 一种快速分屏的方法、装置、电子设备、显示界面以及存储介质 WO2017088131A1 (zh)

Priority Applications (10)

Application Number Priority Date Filing Date Title
PCT/CN2015/095564 WO2017088131A1 (zh) 2015-11-25 2015-11-25 一种快速分屏的方法、装置、电子设备、显示界面以及存储介质
AU2015415755A AU2015415755A1 (en) 2015-11-25 2015-11-25 Quick screen splitting method, apparatus, and electronic device, display UI, and storage medium
US15/779,039 US10642483B2 (en) 2015-11-25 2015-11-25 Quick screen splitting method, apparatus, and electronic device, display UI, and storage medium
EP15909043.0A EP3370139A4 (en) 2015-11-25 2015-11-25 Method and apparatus for rapidly dividing screen, electronic device, display interface and storage medium
KR1020187016604A KR102141099B1 (ko) 2015-11-25 2015-11-25 신속한 스크린 분할 방법 및 장치, 전자 디바이스, 디스플레이 인터페이스, 및 저장 매체
CN201580059602.7A CN107077295A (zh) 2015-11-25 2015-11-25 一种快速分屏的方法、装置、电子设备、显示界面以及存储介质
RU2018122637A RU2687037C1 (ru) 2015-11-25 2015-11-25 Способ, устройство быстрого разделения экрана, электронное устройство, ui отображения и носитель хранения
JP2018526897A JP6675769B2 (ja) 2015-11-25 2015-11-25 迅速な画面分割方法、装置、および電子デバイス、表示ui、および記憶媒体
PH12018501105A PH12018501105A1 (en) 2015-11-25 2018-05-24 Quick screen splitting method, apparatus, and electronic device, display ui, and storage medium
AU2020201096A AU2020201096B2 (en) 2015-11-25 2020-02-14 Quick screen splitting method, apparatus, and electronic device, display UI, and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2015/095564 WO2017088131A1 (zh) 2015-11-25 2015-11-25 一种快速分屏的方法、装置、电子设备、显示界面以及存储介质

Publications (1)

Publication Number Publication Date
WO2017088131A1 true WO2017088131A1 (zh) 2017-06-01

Family

ID=58762847

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2015/095564 WO2017088131A1 (zh) 2015-11-25 2015-11-25 一种快速分屏的方法、装置、电子设备、显示界面以及存储介质

Country Status (9)

Country Link
US (1) US10642483B2 (zh)
EP (1) EP3370139A4 (zh)
JP (1) JP6675769B2 (zh)
KR (1) KR102141099B1 (zh)
CN (1) CN107077295A (zh)
AU (2) AU2015415755A1 (zh)
PH (1) PH12018501105A1 (zh)
RU (1) RU2687037C1 (zh)
WO (1) WO2017088131A1 (zh)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107273036A (zh) * 2017-06-30 2017-10-20 广东欧珀移动通信有限公司 移动终端及其分屏控制方法、计算机可读存储介质
RU2764157C1 (ru) * 2018-06-29 2022-01-13 Бэйцзин Майкролайв Вижн Текнолоджи Ко., Лтд Способ и устройство для переключения глобальных специальных эффектов, оконечное устройство и носитель данных
JP2022505897A (ja) * 2018-11-22 2022-01-14 ホアウェイ・テクノロジーズ・カンパニー・リミテッド タッチ操作をロックする方法及び電子デバイス

Families Citing this family (34)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108605165A (zh) * 2016-10-31 2018-09-28 华为技术有限公司 在电子设备中生成视频缩略图的方法及电子设备
WO2019000438A1 (zh) * 2017-06-30 2019-01-03 华为技术有限公司 显示图形用户界面的方法及电子设备
WO2019061052A1 (zh) * 2017-09-27 2019-04-04 深圳传音通讯有限公司 一种用于智能终端的分屏显示控制方法
CN107643870A (zh) * 2017-09-27 2018-01-30 努比亚技术有限公司 分屏显示方法、移动终端及计算机可读存储介质
CN108255390A (zh) * 2018-01-16 2018-07-06 上海掌门科技有限公司 一种用于电子资料的对比显示方法、设备和计算机存储介质
CN108255405B (zh) * 2018-01-19 2019-09-10 Oppo广东移动通信有限公司 用户界面显示方法、装置及终端
CN108536509B (zh) * 2018-03-30 2020-07-28 维沃移动通信有限公司 一种应用分身方法及移动终端
CN108632462A (zh) * 2018-04-19 2018-10-09 Oppo广东移动通信有限公司 分屏显示的处理方法、装置、存储介质及电子设备
CN108932093A (zh) * 2018-07-03 2018-12-04 Oppo广东移动通信有限公司 分屏应用切换方法、装置、存储介质和电子设备
CN109408174A (zh) * 2018-09-27 2019-03-01 上海哔哩哔哩科技有限公司 用于平板端应用的分屏方法、装置和存储介质
CN109460176A (zh) * 2018-10-22 2019-03-12 四川虹美智能科技有限公司 一种快捷菜单展示方法和智能冰箱
KR102599383B1 (ko) * 2018-10-26 2023-11-08 삼성전자 주식회사 분할된 화면 상에서 실행 가능한 어플리케이션 리스트를 디스플레이하는 전자 장치 및 전자 장치의 동작 방법
CN112703476A (zh) * 2018-10-30 2021-04-23 深圳市柔宇科技股份有限公司 终端设备及其图形用户界面以及多任务交互控制方法
CN109766053B (zh) * 2019-01-15 2020-12-22 Oppo广东移动通信有限公司 用户界面显示方法、装置、终端及存储介质
CN109857241B (zh) * 2019-02-27 2021-04-23 维沃移动通信有限公司 一种显示控制方法、终端设备及计算机可读存储介质
CN110333818A (zh) * 2019-05-24 2019-10-15 华为技术有限公司 分屏显示的处理方法、装置、设备和存储介质
CN110531903B (zh) * 2019-07-31 2021-01-12 维沃移动通信有限公司 屏幕显示方法、终端设备和存储介质
CN112463084A (zh) * 2019-09-06 2021-03-09 北京小米移动软件有限公司 分屏显示方法、装置、终端设备及计算机可读存储介质
EP3792739A1 (en) * 2019-09-13 2021-03-17 MyScript Systems and methods for macro-mode document editing
CN111190517B (zh) * 2019-12-30 2022-03-04 维沃移动通信有限公司 分屏显示方法及电子设备
CN113497838A (zh) * 2020-04-01 2021-10-12 Oppo广东移动通信有限公司 电子装置及其显示控制方法、计算机存储介质
CN111638847B (zh) * 2020-05-27 2022-01-28 维沃移动通信有限公司 分屏显示方法、装置及电子设备
CN111880700B (zh) * 2020-06-09 2022-02-01 维沃移动通信有限公司 应用程序控制方法、装置及电子设备
CN111857505B (zh) * 2020-07-16 2022-07-05 Oppo广东移动通信有限公司 一种显示方法、装置及存储介质
CN113805487B (zh) * 2020-07-23 2022-09-23 荣耀终端有限公司 控制指令的生成方法、装置、终端设备及可读存储介质
CN112261220A (zh) * 2020-09-18 2021-01-22 湖北亿咖通科技有限公司 应用于终端的分屏显示方法
CN112433693B (zh) * 2020-12-11 2023-06-23 维沃移动通信(杭州)有限公司 分屏显示方法、装置及电子设备
CN112698896B (zh) * 2020-12-25 2024-05-31 维沃移动通信有限公司 文本显示方法和电子设备
CN112764640A (zh) * 2020-12-31 2021-05-07 北京谊安医疗系统股份有限公司 一种用于医疗设备的监测值切换显示方法及系统
CN115079971A (zh) * 2021-03-16 2022-09-20 Oppo广东移动通信有限公司 一种多屏设备的控制方法、电子设备和存储介质
KR20230009222A (ko) * 2021-07-08 2023-01-17 삼성전자주식회사 복수의 터치스크린 디스플레이를 포함하는 전자 장치 및 화면 분할 방법
CN115237324A (zh) * 2022-07-11 2022-10-25 Oppo广东移动通信有限公司 分屏显示方法、装置、电子设备及计算机可读介质
CN115268755B (zh) * 2022-08-11 2024-04-12 北京奕斯伟计算技术股份有限公司 一种手势判别方法及触控芯片
CN118312085A (zh) * 2024-05-30 2024-07-09 深圳新芯智能有限公司 分屏处理方法、装置、用户终端及存储介质

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103324435A (zh) * 2013-05-24 2013-09-25 华为技术有限公司 分屏显示的方法、装置及其电子设备
CN104331246A (zh) * 2014-11-19 2015-02-04 广州三星通信技术研究有限公司 在终端中进行分屏显示的设备和方法
US9128606B2 (en) * 2011-06-27 2015-09-08 Lg Electronics Inc. Mobile terminal and screen partitioning method thereof
CN104898952A (zh) * 2015-06-16 2015-09-09 魅族科技(中国)有限公司 一种终端分屏实现方法及终端

Family Cites Families (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8645853B2 (en) 2006-11-03 2014-02-04 Business Objects Software Ltd. Displaying visualizations linked to one or more data source queries
KR100831721B1 (ko) 2006-12-29 2008-05-22 엘지전자 주식회사 휴대단말기의 디스플레이 장치 및 방법
US7877707B2 (en) * 2007-01-06 2011-01-25 Apple Inc. Detecting and interpreting real-world and security gestures on touch and hover sensitive devices
KR101640460B1 (ko) 2009-03-25 2016-07-18 삼성전자 주식회사 휴대 단말기의 분할 화면 운용 방법 및 이를 지원하는 휴대 단말기
US20120235925A1 (en) * 2011-03-14 2012-09-20 Migos Charles J Device, Method, and Graphical User Interface for Establishing an Impromptu Network
KR101891803B1 (ko) 2011-05-23 2018-08-27 삼성전자주식회사 터치스크린을 구비한 휴대 단말기의 화면 편집 방법 및 장치
JP6021335B2 (ja) 2011-12-28 2016-11-09 任天堂株式会社 情報処理プログラム、情報処理装置、情報処理システム、および、情報処理方法
CN103067569B (zh) 2012-12-10 2015-01-14 广东欧珀移动通信有限公司 一种智能手机多窗口显示方法和装置
JP6215534B2 (ja) 2013-01-07 2017-10-18 サターン ライセンシング エルエルシーSaturn Licensing LLC 情報処理装置及び情報処理方法、並びにコンピューター・プログラム
CN107621922B (zh) 2013-03-07 2021-04-02 北京三星通信技术研究有限公司 分屏操作的方法及装置
KR20140113119A (ko) 2013-03-15 2014-09-24 엘지전자 주식회사 전자 기기 및 그 제어방법
US9612689B2 (en) * 2015-02-02 2017-04-04 Qeexo, Co. Method and apparatus for classifying a touch event on a touchscreen as related to one of multiple function generating interaction layers and activating a function in the selected interaction layer
US10599250B2 (en) 2013-05-06 2020-03-24 Qeexo, Co. Using finger touch types to interact with electronic devices
US20150035759A1 (en) * 2013-08-02 2015-02-05 Qeexo, Co. Capture of Vibro-Acoustic Data Used to Determine Touch Types
CN103425453B (zh) 2013-08-23 2016-12-28 广东欧珀移动通信有限公司 一种分屏显示方法和装置
CN103475784B (zh) 2013-09-18 2016-03-30 广东欧珀移动通信有限公司 一种手机应用程序窗口模式显示和操作方法
GB2523132A (en) 2014-02-13 2015-08-19 Nokia Technologies Oy An apparatus and associated methods for controlling content on a display user interface
CN104965702B (zh) 2015-06-15 2017-08-25 广东欧珀移动通信有限公司 一种智能终端的多窗口运行方法、装置及智能终端

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9128606B2 (en) * 2011-06-27 2015-09-08 Lg Electronics Inc. Mobile terminal and screen partitioning method thereof
CN103324435A (zh) * 2013-05-24 2013-09-25 华为技术有限公司 分屏显示的方法、装置及其电子设备
CN104331246A (zh) * 2014-11-19 2015-02-04 广州三星通信技术研究有限公司 在终端中进行分屏显示的设备和方法
CN104898952A (zh) * 2015-06-16 2015-09-09 魅族科技(中国)有限公司 一种终端分屏实现方法及终端

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP3370139A4 *

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107273036A (zh) * 2017-06-30 2017-10-20 广东欧珀移动通信有限公司 移动终端及其分屏控制方法、计算机可读存储介质
US11237724B2 (en) 2017-06-30 2022-02-01 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Mobile terminal and method for split screen control thereof, and computer readable storage medium
RU2764157C1 (ru) * 2018-06-29 2022-01-13 Бэйцзин Майкролайв Вижн Текнолоджи Ко., Лтд Способ и устройство для переключения глобальных специальных эффектов, оконечное устройство и носитель данных
US11249630B2 (en) 2018-06-29 2022-02-15 Beijing Microlive Vision Technology Co., Ltd Method, apparatus, terminal device, and storage medium for switching global special effects
JP2022505897A (ja) * 2018-11-22 2022-01-14 ホアウェイ・テクノロジーズ・カンパニー・リミテッド タッチ操作をロックする方法及び電子デバイス
JP7215813B2 (ja) 2018-11-22 2023-01-31 ホアウェイ・テクノロジーズ・カンパニー・リミテッド タッチ操作をロックする方法及び電子デバイス

Also Published As

Publication number Publication date
EP3370139A4 (en) 2018-11-07
EP3370139A1 (en) 2018-09-05
US20180356972A1 (en) 2018-12-13
CN107077295A (zh) 2017-08-18
US10642483B2 (en) 2020-05-05
JP6675769B2 (ja) 2020-04-01
JP2019500688A (ja) 2019-01-10
PH12018501105A1 (en) 2019-01-21
AU2015415755A1 (en) 2018-06-14
AU2020201096B2 (en) 2021-06-24
RU2687037C1 (ru) 2019-05-06
KR20180081133A (ko) 2018-07-13
AU2020201096A1 (en) 2020-03-05
KR102141099B1 (ko) 2020-08-04

Similar Documents

Publication Publication Date Title
WO2017088131A1 (zh) 一种快速分屏的方法、装置、电子设备、显示界面以及存储介质
KR102240088B1 (ko) 애플리케이션 스위칭 방법, 디바이스 및 그래픽 사용자 인터페이스
US10725646B2 (en) Method and apparatus for switching screen interface and terminal
US20200183574A1 (en) Multi-Task Operation Method and Electronic Device
CN109426410B (zh) 控制光标移动的方法、内容选择方法、控制页面滚动的方法及电子设备
WO2019015404A1 (zh) 在分屏模式下切换应用的方法、装置及其相关设备
CN105786878B (zh) 一种浏览对象的显示方法及装置
WO2020258929A1 (zh) 文件夹界面切换方法及终端设备
EP2613247A2 (en) Method and apparatus for displaying keypad in terminal having touch screen
CN110837318A (zh) 移动终端折叠屏的防误触方法、装置及存储介质
WO2018039914A1 (zh) 一种数据复制方法及用户终端
CN108833679B (zh) 一种对象显示方法及终端设备
CN107003759B (zh) 一种选择文本的方法
EP3674867B1 (en) Human-computer interaction method and electronic device
US20180253225A1 (en) Display Operation Method and Apparatus, User Interface, and Storage Medium
CN114296626A (zh) 一种输入界面的显示方法及终端
CN110874141A (zh) 图标移动的方法及终端设备
CN105700762B (zh) 一种显示选择项信息的方法和装置
WO2017166209A1 (zh) 设置勿触区域的方法、装置、电子设备、显示界面以及存储介质
CN107924261B (zh) 一种选择文本的方法

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 15909043

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 12018501105

Country of ref document: PH

Ref document number: 2018526897

Country of ref document: JP

NENP Non-entry into the national phase

Ref country code: DE

WWE Wipo information: entry into national phase

Ref document number: 2015909043

Country of ref document: EP

ENP Entry into the national phase

Ref document number: 20187016604

Country of ref document: KR

Kind code of ref document: A

ENP Entry into the national phase

Ref document number: 2015415755

Country of ref document: AU

Date of ref document: 20151125

Kind code of ref document: A