CN107132983A - Split screen window operation method and device - Google Patents
Split screen window operation method and device
- Publication number
- CN107132983A (application number CN201710245763.6A)
- Authority
- CN
- China
- Prior art keywords
- split screen
- input operation
- screen window
- control region
- collaborative control
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04886—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04M—TELEPHONIC COMMUNICATION
- H04M1/00—Substation equipment, e.g. for use by subscribers
- H04M1/72—Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
- H04M1/724—User interfaces specially adapted for cordless or mobile telephones
- H04M1/72403—User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Human Computer Interaction (AREA)
- Theoretical Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Computer Networks & Wireless Communication (AREA)
- Signal Processing (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
The present disclosure relates to a split-screen window operation method and device. The split-screen window operation method includes: in a split-screen state, monitoring an input operation in a collaborative control region on a touch screen; in response to detecting an input operation in the collaborative control region, determining an operation instruction corresponding to the input operation; distributing the operation instruction corresponding to the input operation to the application currently running on the current page in each split-screen window; and controlling the application currently running on the current page in each split-screen window to execute the operation instruction. With the above technical solution, an operation performed in the collaborative control region can simultaneously trigger responses from the applications in different split screens, reducing the user's operation steps, greatly improving the user's operation efficiency, and improving the user experience.
Description
Technical field
The present disclosure relates to the field of terminal devices, and in particular to a split-screen window operation method and device.
Background
Terminal devices such as mobile phones have become indispensable in people's daily lives. To make terminal devices more convenient to use, screen sizes keep growing and touch control keeps becoming more responsive. With the development of large-screen touch terminals that can support numerous applications, system-level split screen has gradually become popular, and some users like split screen even on small-screen phones. In split-screen mode, a user can open multiple apps at the same time and view them simultaneously.
Summary
Embodiments of the present disclosure provide a split-screen window operation method and device. The technical solution is as follows:
According to a first aspect of the embodiments of the present disclosure, there is provided a split-screen window operation method, including:
in a split-screen state, monitoring an input operation in a collaborative control region on a touch screen;
in response to detecting the input operation in the collaborative control region, determining an operation instruction corresponding to the input operation;
distributing the operation instruction corresponding to the input operation to the application currently running on the current page in each split-screen window; and
controlling the application currently running on the current page in each split-screen window to execute the operation instruction.
In one embodiment, the method may further include:
setting at least one preset area on the touch screen as a collaborative control region, the preset area including: the status bar region of a split-screen window, the bottom region of a split-screen window, the left region of a split-screen window, the right region of a split-screen window, the region where the dividing line between split-screen windows is located, and the bottom region of the display screen.
In one embodiment, the method may further include:
presetting an executable gesture operation corresponding to each collaborative control region, and storing each collaborative control region in correspondence with its executable gesture operation;
after monitoring the input operation in the collaborative control region on the touch screen, the method further includes:
determining whether the input operation is an executable gesture operation corresponding to the collaborative control region; and
when the input operation is an executable gesture operation corresponding to the collaborative control region, determining the operation instruction corresponding to the input operation.
In one embodiment, the method may further include:
forbidding responding to the input operation when the input operation is not an executable gesture operation corresponding to the collaborative control region.
In one embodiment, the method may further include:
pre-storing a correspondence between input operations and operation instructions;
wherein controlling the application currently running on the current page in each split-screen window to execute the operation instruction includes:
determining the operation instruction corresponding to the input operation from the correspondence, and controlling the application currently running on the current page in each split-screen window to execute the operation instruction.
According to a second aspect of the embodiments of the present disclosure, there is provided a split-screen window operation device, including:
a monitoring module, configured to monitor, in a split-screen state, an input operation in a collaborative control region on a touch screen;
a first determining module, configured to determine, in response to detecting the input operation in the collaborative control region, an operation instruction corresponding to the input operation;
a distributing module, configured to distribute the operation instruction corresponding to the input operation to the application currently running on the current page in each split-screen window; and
a control module, configured to control the application currently running on the current page in each split-screen window to execute the operation instruction.
In one embodiment, the device may further include:
a first setting module, configured to set at least one preset area on the touch screen as a collaborative control region, the preset area including: the status bar region of a split-screen window, the bottom region of a split-screen window, the left region of a split-screen window, the right region of a split-screen window, the region where the dividing line between split-screen windows is located, and the bottom region of the display screen.
In one embodiment, the device may further include:
a second setting module, configured to preset an executable gesture operation corresponding to each collaborative control region, and to store each collaborative control region in correspondence with its executable gesture operation;
a second determining module, configured to determine whether the input operation is an executable gesture operation corresponding to the collaborative control region; and
a third determining module, configured to determine the operation instruction corresponding to the input operation when the input operation is an executable gesture operation corresponding to the collaborative control region.
In one embodiment, the device may further include:
a disabling module, configured to forbid responding to the input operation when the input operation is not an executable gesture operation corresponding to the collaborative control region.
In one embodiment, the device may further include:
a storage module, configured to pre-store a correspondence between input operations and operation instructions;
wherein the control module includes:
a control submodule, configured to determine the operation instruction corresponding to the input operation from the correspondence, and to control the application currently running on the current page in each split-screen window to execute the operation instruction.
According to a third aspect of the embodiments of the present disclosure, there is provided a split-screen window operation device, including:
a processor; and
a memory for storing processor-executable instructions;
wherein the processor is configured to:
in a split-screen state, monitor an input operation in a collaborative control region on a touch screen;
in response to detecting the input operation in the collaborative control region, determine an operation instruction corresponding to the input operation;
distribute the operation instruction corresponding to the input operation to the application currently running on the current page in each split-screen window; and
control the application currently running on the current page in each split-screen window to execute the operation instruction.
The technical solutions provided by the embodiments of the present disclosure may include the following beneficial effects:
In the above technical solution, in a split-screen state, an input operation in a collaborative control region on a touch screen is monitored; in response to detecting an input operation in the collaborative control region, the operation instruction corresponding to the input operation is determined; that operation instruction is distributed to the application currently running on the current page in each split-screen window; and each of those applications is controlled to execute it. An operation performed in the collaborative control region can therefore simultaneously trigger responses from the applications in different split screens, reducing the user's operation steps, greatly improving the user's operation efficiency, and improving the user experience.
It should be understood that the above general description and the following detailed description are merely exemplary and explanatory, and do not limit the present disclosure.
Brief description of the drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and, together with the specification, serve to explain the principles of the present disclosure.
Fig. 1A is a schematic diagram of a split-screen interface display according to an exemplary embodiment.
Fig. 1B is a flowchart of a split-screen window operation method according to an exemplary embodiment.
Fig. 2 is a flowchart of another split-screen window operation method according to an exemplary embodiment.
Fig. 3 is a flowchart of another split-screen window operation method according to an exemplary embodiment.
Fig. 4 is a flowchart of another split-screen window operation method according to an exemplary embodiment.
Fig. 5 is a flowchart of a split-screen window operation method according to an exemplary embodiment.
Fig. 6 is a block diagram of a split-screen window operation device according to an exemplary embodiment.
Fig. 7 is a block diagram of another split-screen window operation device according to an exemplary embodiment.
Fig. 8 is a block diagram of another split-screen window operation device according to an exemplary embodiment.
Fig. 9 is a block diagram of another split-screen window operation device according to an exemplary embodiment.
Fig. 10 is a block diagram of a split-screen window operation device according to an exemplary embodiment.
Fig. 11 is a block diagram of a device suitable for split-screen window operation according to an exemplary embodiment.
Detailed description
Exemplary embodiments will now be described in detail, examples of which are illustrated in the accompanying drawings. Where the following description refers to the drawings, the same numerals in different drawings denote the same or similar elements unless otherwise indicated. The implementations described in the following exemplary embodiments do not represent all implementations consistent with the present disclosure; rather, they are merely examples of devices and methods consistent with some aspects of the present disclosure as recited in the appended claims.
A terminal detects, on the display screen, an operation instruction for triggering split screen, calls the corresponding system interface for split screen according to the detected operation instruction, and opens the split-screen interface. The operation instruction for triggering split screen may be, for example, a press of a preset split-screen operation button or a preset sliding track for split screen. The display screen can be divided into two or even more screen areas. When the user opens an app in one of the split screens, the content of that app is displayed only in that split screen. Where two split screens are used, the display size of each split screen can also be set, for example a 50/50 or a 30/70 split. As shown in Fig. 1A, which is a schematic diagram of a split-screen interface display, the terminal screen in Fig. 1A is divided into two. For example, if the first split screen shows a microblog feed and the second shows Facebook, how can the content of both split screens be scrolled at the same time? Or if the first split screen shows a novel and the second shows a friends feed, how can the novel turn its page while the friends feed is being scrolled? Scrolling and viewing two apps at the same time would increase the speed at which the user browses them. The problem the present disclosure addresses is exactly this cooperation between split screens, enabling the user to control the applications in different split screens simultaneously.
Fig. 1B is a flowchart of a split-screen window operation method according to an exemplary embodiment. As shown in Fig. 1B, the split-screen window operation method includes the following steps S101-S104.
In step S101, in a split-screen state, an input operation in a collaborative control region on a touch screen is monitored.
Specific, pre-defined locations on the screen can serve as collaborative control regions. When the user performs a specific operation in one of these regions, the content of multiple split screens can be controlled at the same time, for example scrolling a list, turning a page, or fast-forwarding a video.
In step S102, in response to detecting an input operation in the collaborative control region, the operation instruction corresponding to the input operation is determined.
In step S103, the operation instruction corresponding to the input operation is distributed to the application currently running on the current page in each split-screen window.
In step S104, the application currently running on the current page in each split-screen window is controlled to execute the operation instruction.
When the user operates in a collaborative control region, it is necessary to know what operation the application in each split screen should perform; for the same operation instruction, for example, different applications may respond with different operations such as scrolling or page turning. The operation of each application corresponding to each operation instruction therefore needs to be preset in advance.
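To make the flow of steps S101-S104 concrete, the following Kotlin sketch is added for illustration only; it is not part of the original disclosure, and names such as CoControlDispatcher, OperationInstruction and SplitScreenApp are hypothetical. It shows one way a dispatcher could check whether a gesture falls inside a collaborative control region, map it to an operation instruction, and forward that instruction to the app in every split-screen window.

```kotlin
// Illustrative sketch: region check (S101), instruction lookup (S102), fan-out to apps (S103–S104).
import android.graphics.Rect

data class OperationInstruction(val name: String, val distancePx: Int = 0)

// Each split-screen app exposes a callback that executes an instruction in its own way.
fun interface SplitScreenApp {
    fun execute(instruction: OperationInstruction)
}

class CoControlDispatcher(
    private val coControlRegions: List<Rect>,                                    // preset regions (see step S105)
    private val instructionFor: (x: Float, y: Float, gesture: String) -> OperationInstruction?
) {
    private val registeredApps = mutableMapOf<Int, SplitScreenApp>()             // split-screen ID -> app

    fun register(splitScreenId: Int, app: SplitScreenApp) {
        registeredApps[splitScreenId] = app
    }

    // Called for every recognized gesture on the touch screen.
    fun onGesture(x: Float, y: Float, gesture: String) {
        val inRegion = coControlRegions.any { it.contains(x.toInt(), y.toInt()) }
        if (!inRegion) return                                                    // not a collaborative operation
        val instruction = instructionFor(x, y, gesture) ?: return                // no matching instruction
        registeredApps.values.forEach { it.execute(instruction) }                // same instruction to every window
    }
}
```

In this sketch the gesture-to-instruction mapping is injected as a function, so the region check, the instruction lookup and the distribution to applications stay in one place.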
With the above method of the embodiments of the present disclosure, in a split-screen state an input operation in a collaborative control region on a touch screen is monitored; in response to detecting an input operation in the collaborative control region, the corresponding operation instruction is determined, distributed to the application currently running on the current page in each split-screen window, and each of those applications is controlled to execute it. An operation performed in the collaborative control region can therefore trigger responses from the applications in different split screens simultaneously, reducing the user's operation steps, greatly improving the user's operation efficiency, and improving the user experience.
In one embodiment, as shown in Fig. 2, before step S101 the split-screen window operation method may further include the following step S105.
In step S105, at least one preset area on the touch screen is set as a collaborative control region. The preset area includes: the status bar region of a split-screen window, the bottom region of a split-screen window, the left region of a split-screen window, the right region of a split-screen window, the region where the dividing line between split-screen windows is located, and the bottom region of the display screen.
Several specific regions on the device screen are set as collaborative control regions and recorded, for example rectangular areas close to the screen edge such as the status bar, the bottom of the screen, and the left and right sides. In addition, the boundary between adjacent split screens can serve as such a region; for example, when a phone screen is divided into upper and lower split screens, the area at the dividing line between the two split screens is used as a specific region. Because one or more preset areas are set as collaborative control regions in advance, mis-operations during use are effectively avoided.
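As an illustration only (not part of the original disclosure), the preset areas of step S105 could be represented as rectangles built from the screen dimensions. The band widths, the assumed status bar height and the top/bottom split layout below are assumptions made for the example.

```kotlin
// Illustrative sketch: building collaborative control regions as edge bands and a divider band.
import android.graphics.Rect

fun buildCoControlRegions(screenWidth: Int, screenHeight: Int, dividerY: Int): List<Rect> {
    val edge = 48        // width of the edge bands, in pixels (assumed)
    val statusBar = 72   // assumed status bar height, in pixels
    return listOf(
        Rect(0, 0, screenWidth, statusBar),                              // status bar region
        Rect(0, screenHeight - edge, screenWidth, screenHeight),         // bottom of the display
        Rect(0, 0, edge, screenHeight),                                  // left edge
        Rect(screenWidth - edge, 0, screenWidth, screenHeight),          // right edge
        Rect(0, dividerY - edge / 2, screenWidth, dividerY + edge / 2)   // band around the split-screen divider
    )
}
```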
In one embodiment, as shown in Fig. 3, before step S101 the split-screen window operation method may further include the following step S106.
In step S106, an executable gesture operation corresponding to each collaborative control region is preset, and each collaborative control region is stored in correspondence with its executable gesture operation.
After the input operation in the collaborative control region on the touch screen is detected, the split-screen window operation method further includes the following steps S107-S108.
In step S107, it is determined whether the input operation is an executable gesture operation corresponding to the collaborative control region.
In step S108, when the input operation is an executable gesture operation corresponding to the collaborative control region, the operation instruction corresponding to the input operation is determined.
Multiple regions on the display screen are used as collaborative control regions. When the user operates in such a region, the apps in multiple split screens can be controlled at the same time; different collaborative control regions can also be configured so that the user controls only certain split screens or certain apps.
The specific gestures that can be performed in each collaborative control region are set, and the executable gesture operations corresponding to each collaborative control region are recorded. For example, the status bar region may allow lateral sliding, tapping and pressing; the right-side region may allow sliding; and the boundary between adjacent split screens may allow operations such as sliding and pressing.
In this embodiment, each collaborative control region corresponds to a set of recognizable gesture operations, which makes it convenient for the user to operate and improves the user experience.
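A minimal sketch, assuming placeholder region and gesture names, of how the per-region executable gestures of step S106 might be stored and checked (steps S107-S108); the negative branch corresponds to step S109 described below. This is illustrative code, not part of the original disclosure.

```kotlin
// Illustrative sketch: per-region executable gestures and validation of an incoming gesture.
enum class Gesture { SWIPE_LEFT, SWIPE_RIGHT, SWIPE_UP, SWIPE_DOWN, TAP, PRESS }
enum class Region { STATUS_BAR, BOTTOM, LEFT_EDGE, RIGHT_EDGE, DIVIDER }

val executableGestures: Map<Region, Set<Gesture>> = mapOf(
    Region.STATUS_BAR to setOf(Gesture.SWIPE_LEFT, Gesture.SWIPE_RIGHT, Gesture.TAP, Gesture.PRESS),
    Region.RIGHT_EDGE to setOf(Gesture.SWIPE_UP, Gesture.SWIPE_DOWN),
    Region.DIVIDER to setOf(Gesture.SWIPE_UP, Gesture.SWIPE_DOWN, Gesture.PRESS)
)

// Returns true when the gesture may be handled in this region; otherwise the input is ignored.
fun isExecutable(region: Region, gesture: Gesture): Boolean =
    executableGestures[region]?.contains(gesture) ?: false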
In one embodiment, as shown in Fig. 4, after step S107 the split-screen window operation method may further include the following step S109.
In step S109, when the input operation is not an executable gesture operation corresponding to the collaborative control region, responding to the input operation is forbidden.
If an operation that is not configured for a given collaborative control region is performed in that region, the operation is invalid and will not be responded to by the system.
In one embodiment, as shown in Fig. 5, before step S101 the split-screen window operation method may further include the following step S110.
In step S110, a correspondence between input operations and operation instructions is pre-stored.
Step S104 may then be implemented as the following step S1041.
In step S1041, the operation instruction corresponding to the input operation is determined from the correspondence, and the application currently running on the current page in each split-screen window is controlled to execute the operation instruction.
The operation instructions corresponding to different user operations are preset, so that for the same input operation from the user, different applications each execute the operation instruction corresponding to that input operation. For example:
For a sliding operation, sliding in different directions within the region can be treated as different operations:
a lateral slide to the right generates a slide-down instruction, which also carries the sliding distance;
a lateral slide to the left generates a slide-up instruction, which also carries the sliding distance;
a longitudinal slide downward generates a slide-down instruction, which also carries the sliding distance;
a longitudinal slide upward generates a slide-up instruction, which also carries the sliding distance.
For a pressing operation, different pressing force levels can generate different instructions:
pressing force level 1 generates a slide-down instruction with a sliding distance of 30 pixels;
pressing force level 2 generates a slide-up instruction with a sliding distance of 25 pixels.
Other operations, including but not limited to tapping, double-tapping and fingerprint recognition, can likewise generate corresponding operation instructions.
It should be noted that the name of an operation instruction is merely a label; the responding application does not have to perform the operation the name suggests. For a slide-down instruction, for example, the responding application may perform a slide-down operation or may instead turn down a page.
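The correspondence described above could be encoded as follows. This is an illustrative sketch rather than the patented implementation; the Gesture enum is repeated from the earlier sketch so the snippet stands alone, and the 30-pixel and 25-pixel values mirror the example in the text while everything else is assumed.

```kotlin
// Illustrative sketch: mapping swipes and pressing-force levels to named instructions with a distance.
enum class Gesture { SWIPE_LEFT, SWIPE_RIGHT, SWIPE_UP, SWIPE_DOWN, TAP, PRESS }

data class OperationInstruction(val name: String, val distancePx: Int)

fun instructionForSwipe(gesture: Gesture, distancePx: Int): OperationInstruction? = when (gesture) {
    Gesture.SWIPE_RIGHT, Gesture.SWIPE_DOWN -> OperationInstruction("SLIDE_DOWN", distancePx)
    Gesture.SWIPE_LEFT, Gesture.SWIPE_UP    -> OperationInstruction("SLIDE_UP", distancePx)
    else -> null                             // taps, presses, etc. are handled elsewhere
}

fun instructionForPress(forceLevel: Int): OperationInstruction? = when (forceLevel) {
    1 -> OperationInstruction("SLIDE_DOWN", 30)  // force level 1: slide down 30 pixels
    2 -> OperationInstruction("SLIDE_UP", 25)    // force level 2: slide up 25 pixels
    else -> null
}
```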
The technical solution provided by the embodiments of the present disclosure is described below with a specific embodiment.
This embodiment uses the split-screen window operation method provided by the present disclosure, which includes the following steps A1-A8.
Step A1: set at least one preset area on the touch screen as a collaborative control region. The preset area includes: the status bar region of a split-screen window, the bottom region of a split-screen window, the left region of a split-screen window, the right region of a split-screen window, the region where the dividing line between split-screen windows is located, and the bottom region of the display screen.
Step A2: preset the executable gesture operation corresponding to each collaborative control region, and store each collaborative control region in correspondence with its executable gesture operation.
Step A3: preset the correspondence between input operations and operation instructions, and store the input operations in correspondence with their operation instructions.
Step A4: set the operation with which each application responds; for a given user input operation, the corresponding operation may be, for example, scrolling, page turning, jumping to a specified page, or opening a certain application.
Step A5: receive the user's split-screen instruction and split the system screen according to the instruction.
Step A6: when an application is loaded into a split screen, it sends a registration instruction indicating that the current application can respond to collaborative control instructions; the registration and the application's split-screen ID are recorded.
Step A7: in the split-screen state, monitor the input operation in the collaborative control region on the touch screen. Identify the collaborative control region in which the user operates and the user's input operation, and check whether they match a stored collaborative control region and input operation. If so, send the corresponding instruction to the registered split screens and applications.
Step A8: control the application currently running on the current page in each split-screen window to execute the operation instruction corresponding to the input operation.
In this embodiment, the user can perform an operation in one region and simultaneously trigger responses from the applications in multiple split screens, which greatly improves the user's operation efficiency.
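As a rough illustration of steps A6 and A7 (not part of the original disclosure; all names are placeholders), a registry could record, for each split-screen ID, that the loaded application can respond to collaborative control instructions, and later send a matched instruction only to registered entries:

```kotlin
// Illustrative sketch: registration record (step A6) and dispatch to registered split screens (step A7).
data class Registration(val splitScreenId: Int, val packageName: String, val handler: (String) -> Unit)

class CoControlRegistry {
    private val registrations = mutableListOf<Registration>()

    // Step A6: called when an app loaded into a split screen declares collaborative-control support.
    fun register(splitScreenId: Int, packageName: String, handler: (String) -> Unit) {
        registrations += Registration(splitScreenId, packageName, handler)
    }

    fun unregister(splitScreenId: Int) {
        registrations.removeAll { it.splitScreenId == splitScreenId }
    }

    // Step A7: after the region/gesture match succeeds, send the instruction to every registered
    // split screen, optionally restricted to specific split-screen IDs.
    fun dispatch(instructionName: String, targetIds: Set<Int>? = null) {
        registrations
            .filter { targetIds?.contains(it.splitScreenId) ?: true }
            .forEach { it.handler(instructionName) }
    }
}
```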
The following are device embodiments of the present disclosure, which can be used to carry out the method embodiments of the present disclosure. For details not disclosed in the device embodiments, reference is made to the method embodiments of the present disclosure.
Fig. 6 is a block diagram of a split-screen window operation device according to an exemplary embodiment. As shown in Fig. 6, the split-screen window operation device may include, but is not limited to, a monitoring module 61, a first determining module 62, a distributing module 63 and a control module 64.
The monitoring module 61 is configured to monitor, in a split-screen state, an input operation in a collaborative control region on a touch screen.
The first determining module 62 is configured to determine, in response to the monitoring module 61 detecting an input operation in the collaborative control region, the operation instruction corresponding to the input operation.
The distributing module 63 is configured to distribute the operation instruction determined by the first determining module 62 to the application currently running on the current page in each split-screen window.
The control module 64 is configured to control the application currently running on the current page in each split-screen window to execute the operation instruction distributed by the distributing module 63.
In one embodiment, as shown in Fig. 7, the split-screen window operation device may further include:
a first setting module 65, configured to set at least one preset area on the touch screen as a collaborative control region, the preset area including: the status bar region of a split-screen window, the bottom region of a split-screen window, the left region of a split-screen window, the right region of a split-screen window, the region where the dividing line between split-screen windows is located, and the bottom region of the display screen.
In one embodiment, as shown in Fig. 8, the split-screen window operation device may further include:
a second setting module 66, configured to preset the executable gesture operation corresponding to each collaborative control region, and to store each collaborative control region in correspondence with its executable gesture operation;
a second determining module 67, configured to determine whether the input operation is an executable gesture operation corresponding to a collaborative control region set by the second setting module 66; and
a third determining module 68, configured to determine the operation instruction corresponding to the input operation when the second determining module 67 determines that the input operation is an executable gesture operation corresponding to the collaborative control region.
In one embodiment, as shown in Fig. 9, the split-screen window operation device may further include:
a disabling module 69, configured to forbid responding to the input operation when the input operation is not an executable gesture operation corresponding to the collaborative control region.
In one embodiment, as shown in Fig. 10, the split-screen window operation device may further include:
a storage module 70, configured to pre-store the correspondence between input operations and operation instructions;
wherein the control module 64 includes:
a control submodule, configured to determine the operation instruction corresponding to the input operation from the correspondence stored by the storage module 70, and to control the application currently running on the current page in each split-screen window to execute the operation instruction.
With the above device of the embodiments of the present disclosure, in a split-screen state an input operation in a collaborative control region on a touch screen is monitored; in response to detecting an input operation in the collaborative control region, the corresponding operation instruction is determined, distributed to the application currently running on the current page in each split-screen window, and each of those applications is controlled to execute it. An operation performed in the collaborative control region can therefore trigger responses from the applications in different split screens simultaneously, reducing the user's operation steps, greatly improving the user's operation efficiency, and improving the user experience.
An embodiment of the present disclosure further provides a split-screen window operation device, including:
a processor; and
a memory for storing processor-executable instructions;
wherein the processor is configured to:
in a split-screen state, monitor an input operation in a collaborative control region on a touch screen;
in response to detecting the input operation in the collaborative control region, determine the operation instruction corresponding to the input operation;
distribute the operation instruction corresponding to the input operation to the application currently running on the current page in each split-screen window; and
control the application currently running on the current page in each split-screen window to execute the operation instruction.
The processor is further configured to:
set at least one preset area on the touch screen as a collaborative control region, the preset area including: the status bar region of a split-screen window, the bottom region of a split-screen window, the left region of a split-screen window, the right region of a split-screen window, the region where the dividing line between split-screen windows is located, and the bottom region of the display screen.
The processor is further configured to:
preset the executable gesture operation corresponding to each collaborative control region, and store each collaborative control region in correspondence with its executable gesture operation;
after monitoring the input operation in the collaborative control region on the touch screen, determine whether the input operation is an executable gesture operation corresponding to the collaborative control region; and
when the input operation is an executable gesture operation corresponding to the collaborative control region, determine the operation instruction corresponding to the input operation.
The processor is further configured to:
forbid responding to the input operation when the input operation is not an executable gesture operation corresponding to the collaborative control region.
The processor is further configured to:
pre-store a correspondence between input operations and operation instructions; and
when controlling the application currently running on the current page in each split-screen window to execute the operation instruction, determine the operation instruction corresponding to the input operation from the correspondence and control the application currently running on the current page in each split-screen window to execute it.
Fig. 11 is a block diagram of a device for split-screen window operation according to an exemplary embodiment; the device is suitable for a terminal device. For example, the device 1200 may be a mobile phone, a computer, a digital broadcast terminal, a messaging device, a game console, a tablet device, a medical device, exercise equipment, a personal digital assistant, or the like.
The device 1200 may include one or more of the following components: a processing component 1202, a memory 1204, a power component 1206, a multimedia component 1208, an audio component 1210, an input/output (I/O) interface 1212, a sensor component 1214, and a communication component 1216.
The processing component 1202 typically controls the overall operation of the device 1200, such as operations associated with display, telephone calls, data communication, camera operation and recording. The processing component 1202 may include one or more processors 1220 to execute instructions so as to complete all or part of the steps of the above method. In addition, the processing component 1202 may include one or more modules to facilitate interaction between the processing component 1202 and other components. For example, the processing component 1202 may include a multimedia module to facilitate interaction between the multimedia component 1208 and the processing component 1202.
The memory 1204 is configured to store various types of data to support operation of the device 1200. Examples of such data include instructions for any application or method operated on the device 1200, contact data, phone book data, messages, pictures, video, and so on. The memory 1204 may be implemented by any type of volatile or non-volatile storage device or a combination thereof, such as static random access memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic disk or optical disk.
The power component 1206 provides power to the various components of the device 1200. The power component 1206 may include a power management system, one or more power supplies, and other components associated with generating, managing and distributing power for the device 1200.
The multimedia component 1208 includes a screen providing an output interface between the device 1200 and the user. In some embodiments, the screen may include a liquid crystal display (LCD) and a touch panel (TP). If the screen includes a touch panel, the screen may be implemented as a touch screen to receive input signals from the user. The touch panel includes one or more touch sensors to sense touches, swipes and gestures on the touch panel. The touch sensors may not only sense the boundary of a touch or swipe action, but also detect the duration and pressure associated with the touch or swipe. In some embodiments, the multimedia component 1208 includes a front camera and/or a rear camera. When the device 1200 is in an operating mode, such as a shooting mode or a video mode, the front camera and/or the rear camera can receive external multimedia data. Each front camera and rear camera may be a fixed optical lens system or have focusing and optical zoom capability.
The audio component 1210 is configured to output and/or input audio signals. For example, the audio component 1210 includes a microphone (MIC) configured to receive external audio signals when the device 1200 is in an operating mode, such as a call mode, a recording mode or a speech recognition mode. The received audio signal may be further stored in the memory 1204 or transmitted via the communication component 1216. In some embodiments, the audio component 1210 also includes a loudspeaker for outputting audio signals.
The I/O interface 1212 provides an interface between the processing component 1202 and peripheral interface modules, which may be a keyboard, a click wheel, buttons, and the like. These buttons may include, but are not limited to, a home button, volume buttons, a start button and a lock button.
The sensor component 1214 includes one or more sensors for providing status assessments of various aspects of the device 1200. For example, the sensor component 1214 may detect the open/closed status of the device 1200 and the relative positioning of components, such as the display and keypad of the device 1200; the sensor component 1214 may also detect a change in position of the device 1200 or of a component of the device 1200, the presence or absence of user contact with the device 1200, the orientation or acceleration/deceleration of the device 1200, and a change in temperature of the device 1200. The sensor component 1214 may include a proximity sensor configured to detect the presence of nearby objects without any physical contact. The sensor component 1214 may also include a light sensor, such as a CMOS or CCD image sensor, for use in imaging applications. In some embodiments, the sensor component 1214 may also include an accelerometer, a gyroscope sensor, a magnetic sensor, a pressure sensor or a temperature sensor.
The communication component 1216 is configured to facilitate wired or wireless communication between the device 1200 and other devices. The device 1200 can access a wireless network based on a communication standard, such as WiFi, 2G or 3G, or a combination thereof. In one exemplary embodiment, the communication component 1216 receives a broadcast signal or broadcast-related information from an external broadcast management system via a broadcast channel. In one exemplary embodiment, the communication component 1216 also includes a near field communication (NFC) module to facilitate short-range communication. For example, the NFC module may be implemented based on radio frequency identification (RFID) technology, Infrared Data Association (IrDA) technology, ultra-wideband (UWB) technology, Bluetooth (BT) technology and other technologies.
In an exemplary embodiment, the device 1200 may be implemented by one or more application-specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field-programmable gate arrays (FPGAs), controllers, microcontrollers, microprocessors or other electronic components, for performing the above method.
In an exemplary embodiment, there is also provided a non-transitory computer-readable storage medium including instructions, such as the memory 1204 including instructions, which can be executed by the processor 1220 of the device 1200 to complete the above method. For example, the non-transitory computer-readable storage medium may be a ROM, a random access memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, or the like.
A non-transitory computer-readable storage medium is provided; when the instructions in the storage medium are executed by the processor of the device 1200, the device 1200 can perform the above split-screen window operation method, the method including:
in a split-screen state, monitoring an input operation in a collaborative control region on a touch screen;
in response to detecting the input operation in the collaborative control region, determining the operation instruction corresponding to the input operation;
distributing the operation instruction corresponding to the input operation to the application currently running on the current page in each split-screen window; and
controlling the application currently running on the current page in each split-screen window to execute the operation instruction.
In one embodiment, the method may further include:
setting at least one preset area on the touch screen as a collaborative control region, the preset area including: the status bar region of a split-screen window, the bottom region of a split-screen window, the left region of a split-screen window, the right region of a split-screen window, the region where the dividing line between split-screen windows is located, and the bottom region of the display screen.
In one embodiment, the method may further include:
presetting the executable gesture operation corresponding to each collaborative control region, and storing each collaborative control region in correspondence with its executable gesture operation;
after monitoring the input operation in the collaborative control region on the touch screen, the method further includes:
determining whether the input operation is an executable gesture operation corresponding to the collaborative control region; and
when the input operation is an executable gesture operation corresponding to the collaborative control region, determining the operation instruction corresponding to the input operation.
In one embodiment, the method may further include:
forbidding responding to the input operation when the input operation is not an executable gesture operation corresponding to the collaborative control region.
In one embodiment, the method may further include:
pre-storing a correspondence between input operations and operation instructions;
wherein controlling the application currently running on the current page in each split-screen window to execute the operation instruction includes:
determining the operation instruction corresponding to the input operation from the correspondence, and controlling the application currently running on the current page in each split-screen window to execute the operation instruction.
Other embodiments of the present disclosure will readily occur to those skilled in the art upon consideration of the specification and practice of the disclosure disclosed herein. This application is intended to cover any variations, uses or adaptations of the present disclosure that follow its general principles and include common knowledge or conventional technical means in the art not disclosed herein. The specification and embodiments are to be regarded as exemplary only, with the true scope and spirit of the disclosure being indicated by the following claims.
It should be understood that the present disclosure is not limited to the precise structures described above and shown in the accompanying drawings, and that various modifications and changes may be made without departing from its scope. The scope of the present disclosure is limited only by the appended claims.
Claims (11)
1. A split-screen window operation method, comprising:
in a split-screen state, monitoring an input operation in a collaborative control region on a touch screen;
in response to detecting the input operation in the collaborative control region, determining an operation instruction corresponding to the input operation;
distributing the operation instruction corresponding to the input operation to the application currently running on the current page in each split-screen window; and
controlling the application currently running on the current page in each split-screen window to execute the operation instruction.
2. The method according to claim 1, wherein the method further comprises:
setting at least one preset area on the touch screen as a collaborative control region, the preset area comprising: the status bar region of a split-screen window, the bottom region of a split-screen window, the left region of a split-screen window, the right region of a split-screen window, the region where the dividing line between split-screen windows is located, and the bottom region of the display screen.
3. The method according to claim 1 or 2, wherein the method further comprises:
presetting an executable gesture operation corresponding to each collaborative control region, and storing each collaborative control region in correspondence with its executable gesture operation;
after monitoring the input operation in the collaborative control region on the touch screen, the method further comprises:
determining whether the input operation is an executable gesture operation corresponding to the collaborative control region; and
when the input operation is an executable gesture operation corresponding to the collaborative control region, determining the operation instruction corresponding to the input operation.
4. The method according to claim 3, wherein the method further comprises:
forbidding responding to the input operation when the input operation is not an executable gesture operation corresponding to the collaborative control region.
5. The method according to claim 1, wherein the method further comprises:
pre-storing a correspondence between input operations and operation instructions;
wherein controlling the application currently running on the current page in each split-screen window to execute the operation instruction comprises:
determining the operation instruction corresponding to the input operation from the correspondence, and controlling the application currently running on the current page in each split-screen window to execute the operation instruction.
6. A split-screen window operation device, comprising:
a monitoring module, configured to monitor, in a split-screen state, an input operation in a collaborative control region on a touch screen;
a first determining module, configured to determine, in response to the monitoring module detecting the input operation in the collaborative control region, an operation instruction corresponding to the input operation;
a distributing module, configured to distribute the operation instruction determined by the first determining module to the application currently running on the current page in each split-screen window; and
a control module, configured to control the application currently running on the current page in each split-screen window to execute the operation instruction distributed by the distributing module.
7. The device according to claim 6, wherein the device further comprises:
a first setting module, configured to set at least one preset area on the touch screen as a collaborative control region, the preset area comprising: the status bar region of a split-screen window, the bottom region of a split-screen window, the left region of a split-screen window, the right region of a split-screen window, the region where the dividing line between split-screen windows is located, and the bottom region of the display screen.
8. The device according to claim 6 or 7, wherein the device further comprises:
a second setting module, configured to preset an executable gesture operation corresponding to each collaborative control region, and to store each collaborative control region in correspondence with its executable gesture operation;
a second determining module, configured to determine whether the input operation is an executable gesture operation corresponding to the collaborative control region; and
a third determining module, configured to determine the operation instruction corresponding to the input operation when the second determining module determines that the input operation is an executable gesture operation corresponding to the collaborative control region.
9. The device according to claim 8, wherein the device further comprises:
a disabling module, configured to forbid responding to the input operation when the input operation is not an executable gesture operation corresponding to the collaborative control region.
10. The device according to claim 6, wherein the device further comprises:
a storage module, configured to pre-store a correspondence between input operations and operation instructions;
wherein the control module comprises:
a control submodule, configured to determine the operation instruction corresponding to the input operation from the correspondence stored by the storage module, and to control the application currently running on the current page in each split-screen window to execute the operation instruction.
11. A split-screen window operation device, comprising:
a processor; and
a memory for storing processor-executable instructions;
wherein the processor is configured to:
in a split-screen state, monitor an input operation in a collaborative control region on a touch screen;
in response to detecting the input operation in the collaborative control region, determine an operation instruction corresponding to the input operation;
distribute the operation instruction corresponding to the input operation to the application currently running on the current page in each split-screen window; and
control the application currently running on the current page in each split-screen window to execute the operation instruction.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201710245763.6A CN107132983B (en) | 2017-04-14 | 2017-04-14 | Split-screen window operation method and device |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201710245763.6A CN107132983B (en) | 2017-04-14 | 2017-04-14 | Split-screen window operation method and device |
Publications (2)
Publication Number | Publication Date |
---|---|
CN107132983A true CN107132983A (en) | 2017-09-05 |
CN107132983B CN107132983B (en) | 2020-08-14 |
Family
ID=59716755
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201710245763.6A Active CN107132983B (en) | 2017-04-14 | 2017-04-14 | Split-screen window operation method and device |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN107132983B (en) |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109491738A (en) * | 2018-10-30 | 2019-03-19 | 维沃移动通信有限公司 | Terminal device control method and terminal device |
CN110225180A (en) * | 2019-04-23 | 2019-09-10 | 维沃软件技术有限公司 | Content input method and terminal device |
CN113094009A (en) * | 2021-03-08 | 2021-07-09 | 联想(北京)有限公司 | Display method and display device |
WO2022016987A1 (en) * | 2020-07-21 | 2022-01-27 | Oppo广东移动通信有限公司 | Response method and apparatus for application program, and electronic device, and readable storage medium |
2017-04-14: Application CN201710245763.6A filed in China (CN); granted as CN107132983B, status Active.
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP2369461A2 (en) * | 2010-03-25 | 2011-09-28 | NEC CASIO Mobile Communications, Ltd. | Terminal device and control program thereof |
CN102799364A (en) * | 2012-06-27 | 2012-11-28 | 宇龙计算机通信科技(深圳)有限公司 | Terminal and terminal controlling method |
CN105589655A (en) * | 2016-03-04 | 2016-05-18 | 孙腾 | Method and system for displaying terminal device |
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109491738A (en) * | 2018-10-30 | 2019-03-19 | 维沃移动通信有限公司 | Terminal device control method and terminal device |
CN109491738B (en) * | 2018-10-30 | 2022-03-01 | 维沃移动通信有限公司 | Terminal device control method and terminal device |
CN110225180A (en) * | 2019-04-23 | 2019-09-10 | 维沃软件技术有限公司 | Content input method and terminal device |
WO2022016987A1 (en) * | 2020-07-21 | 2022-01-27 | Oppo广东移动通信有限公司 | Response method and apparatus for application program, and electronic device, and readable storage medium |
CN113094009A (en) * | 2021-03-08 | 2021-07-09 | 联想(北京)有限公司 | Display method and display device |
CN113094009B (en) * | 2021-03-08 | 2024-03-01 | 联想(北京)有限公司 | Display method and display device |
Also Published As
Publication number | Publication date |
---|---|
CN107132983B (en) | 2020-08-14 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| PB01 | Publication | |
| SE01 | Entry into force of request for substantive examination | |
| GR01 | Patent grant | |