CN103116460B - Method of moving a window between screens of a multi-screen device, and dual-screen communication device - Google Patents


Info

Publication number: CN103116460B (application CN201210458810.2A)
Authority: CN (China)
Prior art keywords: touch, display, window, sensitive display, screen
Legal status: Active (as listed; not a legal conclusion)
Application number: CN201210458810.2A
Other languages: Chinese (zh)
Other versions: CN103116460A (en)
Inventors: S·瑟帕尔, A·德帕兹
Current assignee: Flex Electronics Id Co ltd
Original assignee: Flex Electronics Id Co ltd
Priority claimed from US 13/223,778 (published as US 2012/0081309 A1)
Application filed by Flex Electronics Id Co ltd
Priority to CN201810310376.0A (CN108228035B)
Publication of CN103116460A
Application granted
Publication of CN103116460B

Landscapes

  • User Interface Of Digital Computer (AREA)

Abstract

A dual-screen communication device includes: a gesture capture region to receive a gesture; a first touch-sensitive display to receive a gesture and display a display image (such as a desktop or a window of an application); and a second touch-sensitive display to receive a gesture and display a display image. Middleware receives a gesture indicating that the display image is to be moved from the first touch-sensitive display to the second touch-sensitive display, for example a window maximized to simultaneously cover a portion of both displays; in response to, and prior to, the movement of the display image to the second touch-sensitive display, moves a transition indicator from the first touch-sensitive display toward the second touch-sensitive display to the selected location that the display image will occupy; and thereafter moves the display image from the first touch-sensitive display toward the second touch-sensitive display to the selected location.

Description

Method of moving a window between screens of a multi-screen device, and dual-screen communication device
Cross reference to related applications
This application claims the benefit of and priority, under 35 U.S.C. § 119(e), to U.S. Provisional Application Serial Nos. 61/389,000, filed October 1, 2010, entitled "DUAL DISPLAY WINDOWING SYSTEM"; 61/389,117, filed October 1, 2010, entitled "MULTI-OPERATING SYSTEM PORTABLE DOCKETING DEVICE"; and 61/389,087, filed October 1, 2010, entitled "TABLET COMPUTING USER INTERFACE". Each of the aforementioned documents is incorporated herein by reference in its entirety for all that it teaches and for all purposes.
Technical field
The present disclosure relates to a display image transition indicator.
Background
A substantial number of handheld computing devices, such as cell phones, tablet computers, and e-readers, make use of a touch-screen display not only to deliver display information to the user but also to receive inputs from user interface commands. While touch-screen displays may increase the configurability of the handheld device and provide a wide variety of user interface options, this flexibility typically comes at a price. The dual use of the touch screen to provide content and receive user commands, while flexible for the user, may leave the display cluttered and create visual confusion, leading to user frustration and lost productivity.
The small form factor of handheld computing devices requires a careful balance between the displayed graphics and the area provided for receiving inputs. On the one hand, the small display constrains the display space, which may increase the difficulty of interpreting actions or results. On the other hand, a virtual keypad or other user interface scheme is superimposed on, or positioned adjacent to, an executing application, requiring the application to be squeezed into an even smaller portion of the display.
This balancing act is particularly difficult for single-display touch-screen devices. Single-display touch-screen devices are hampered by their limited screen space. When users enter information into the device through the single display, the ability to interpret information on the display may be severely hindered, particularly when a complex interaction between the display and the interface is required.
Summary of the invention
There is a need for a dual, multi-display handheld computing device that provides enhanced power and/or versatility compared to conventional single-display handheld computing devices. These and other needs are addressed by the various aspects, embodiments, and/or configurations of the present disclosure. Also, while the disclosure is presented in terms of exemplary embodiments, it should be appreciated that individual aspects of the disclosure can be separately claimed.
In one embodiment, a method provides the steps of:
(a) receiving, by a gesture capture region and/or a touch-sensitive display, a gesture indicating that a display image is to be moved from a first touch-sensitive display to a second touch-sensitive display;
(b) in response to, and prior to, the movement of the display image to the second touch-sensitive display, moving, by a microprocessor, a transition indicator from the first touch-sensitive display toward the second touch-sensitive display to a selected location to be occupied by the display image; and
(c) thereafter, moving, by the microprocessor, the display image from the first touch-sensitive display toward the second touch-sensitive display to the selected location.
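As a rough, non-authoritative illustration of steps (a)-(c), the following Kotlin sketch uses hypothetical Window, Position, and TransitionIndicator types (none of which come from this disclosure): the transition indicator is moved to the selected location first, and only thereafter is the window itself relocated.

```kotlin
data class Position(val x: Int, val y: Int)

class Window(var position: Position)

class TransitionIndicator(var position: Position)

class DualScreenController {
    // (a) a received gesture indicates the window should move from the
    //     first touch-sensitive display toward the second
    fun onMoveWindowGesture(window: Window, selectedLocation: Position) {
        // (b) in response to, and prior to, the move of the window itself,
        //     move a transition indicator to the location the window will occupy
        val indicator = TransitionIndicator(window.position)
        moveIndicator(indicator, selectedLocation)

        // (c) thereafter, move the window to the selected location
        window.position = selectedLocation
    }

    private fun moveIndicator(indicator: TransitionIndicator, target: Position) {
        // placeholder: a real UI would animate this along the window's travel path
        indicator.position = target
    }
}

fun main() {
    val window = Window(Position(x = 100, y = 200))   // starts on the first display
    DualScreenController().onMoveWindowGesture(window, Position(x = 900, y = 200))
    println(window.position)                          // Position(x=900, y=200)
}
```

In a real implementation the indicator movement would be animated along the window's travel path rather than set in a single assignment.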
In one embodiment, a dual-screen communication device includes:
(i) a gesture capture region to receive a gesture;
(ii) a first touch-sensitive display to receive a gesture and display a display image, wherein the display image is a desktop and/or a window of an application;
(iii) a second touch-sensitive display to receive a gesture and display a display image; and
(iv) middleware operable to perform one or more of the following operations:
(A) receiving a gesture indicating that a display image is to be expanded from the first touch-sensitive display to cover most, if not all, of the second touch-sensitive display;
(B) in response to, and prior to, the expansion of the display image to the second touch-sensitive display, expanding a transition indicator to cover most, if not all, of the second touch-sensitive display; and
(C) thereafter, expanding the display image to the second touch-sensitive display.
In one configuration, the display image is a desktop, and the gesture is received by the touch-sensitive display.
In one configuration, the display image is a window, the gesture is received by the gesture capture region, and the gesture capture region is incapable of displaying any display image.
In one configuration, the window is minimized on the first touch-sensitive display, and the gesture maximizes the window to cover most of the first and second touch-sensitive displays.
In one configuration, the transition indicator is moved to the selected location along the travel path of the display image to preview the movement of the display image, and the transition indicator has substantially the same size and shape as the display image.
In one configuration, the transition indicator can neither receive nor provide dynamic user input/output, and the transition indicator has an appearance different from that of the display image.
In one configuration, the display image and the transition indicator are simultaneously in active display positions of the first and second touch-sensitive displays, respectively, before the movement of the display image is initiated, and the transition indicator comprises a user-configured color, pattern, design, and/or photograph.
In one configuration, the transition indicator is a graphical affordance, the window is controlled by a multi-screen application, and the transition indicator is substantially non-responsive to user commands or requests during the movement from the first touch-sensitive display to the second touch-sensitive display.
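A small, independent Kotlin sketch of the indicator properties described in these configurations (all type and property names are hypothetical): the indicator mirrors the window's size and shape, carries a user-configured fill, and ignores user input during the move.

```kotlin
data class Size(val width: Int, val height: Int)

sealed class IndicatorFill {
    data class Color(val argb: Int) : IndicatorFill()
    data class Pattern(val name: String) : IndicatorFill()
    data class Photo(val uri: String) : IndicatorFill()
}

class TransitionIndicator(
    val size: Size,          // substantially the same size and shape as the moved window
    val fill: IndicatorFill  // user-configured color, pattern, design, or photograph
) {
    // the indicator neither receives nor provides dynamic user input/output,
    // so user commands arriving during the move are simply not consumed
    fun onUserCommand(): Boolean = false
}
```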
The present disclosure can provide a number of advantages depending on the particular aspect, embodiment, and/or configuration. The transition indicator can provide a more aesthetically pleasing and streamlined user interface. It can also be used during transition periods when, due to processing delays or other problems, an actual view cannot be presented. The well can thereby avoid a display that would otherwise be empty and aesthetically dull.
These and other advantages will be apparent from the disclosure.
The phrases "at least one", "one or more", and "and/or" are open-ended expressions that are both conjunctive and disjunctive in operation. For example, each of the expressions "at least one of A, B and C", "at least one of A, B, or C", "one or more of A, B, and C", "one or more of A, B, or C", and "A, B, and/or C" means A alone, B alone, C alone, A and B together, B and C together, or A, B and C together.
The term "a" or "an" entity refers to one or more of that entity. As such, the terms "a" (or "an"), "one or more", and "at least one" can be used interchangeably herein. It is also to be noted that the terms "comprising", "including", and "having" can be used interchangeably.
The term "automatic" and variations thereof, as used herein, refer to any process or operation done without material human input when the process or operation is performed. However, a process or operation can be automatic even though performance of the process or operation uses human input received before the process or operation is performed. Human input is deemed to be material if such input influences how the process or operation will be performed. Human input that consents to the performance of the process or operation is not deemed to be "material".
The term "computer-readable medium" as used herein refers to any tangible storage and/or transmission medium that participates in providing instructions to a processor for execution. Such a medium may take many forms, including but not limited to non-volatile media, volatile media, and transmission media. Non-volatile media includes, for example, NVRAM or magnetic or optical disks. Volatile media includes dynamic memory, such as main memory. Common forms of computer-readable media include, for example, a floppy disk, a flexible disk, a hard disk, magnetic tape or any other magnetic medium, a magneto-optical medium, a CD-ROM, any other optical medium, punch cards, paper tape, any other physical medium with patterns of holes, a RAM, a PROM, an EPROM, a FLASH-EPROM, a solid-state medium like a memory card, any other memory chip or cartridge, a carrier wave as described hereinafter, or any other medium from which a computer can read. A digital file attachment to an e-mail or other self-contained information archive or set of archives is considered a distribution medium equivalent to a tangible storage medium. When the computer-readable medium is configured as a database, it is to be understood that the database may be any type of database, such as relational, hierarchical, object-oriented, and/or the like. Accordingly, the disclosure is considered to include a tangible storage medium or distribution medium and prior-art-recognized equivalents and successor media, in which the software implementations of the disclosure are stored.
The term "desktop" refers to a metaphor used to portray systems. A desktop is generally considered a "surface" that typically includes pictures, called icons, widgets, folders, etc. that can activate and/or show applications, windows, cabinets, files, folders, documents, and other graphical items. The icons are generally selectable to initiate a task through user interface interaction, allowing a user to execute applications or conduct other operations.
The term "display" refers to a portion of a screen used to display the output of a computer to a user.
The term "display image" refers to an image produced on the display. A typical display image is a window or a desktop. A display image may occupy all or a portion of the display.
The term "display orientation" refers to the way in which a rectangular display is oriented by a user for viewing. The two most common types of display orientation are portrait and landscape. In landscape mode, the display is oriented such that the width of the display is greater than the height of the display (for example, a 4:3 ratio, which is 4 units wide and 3 units tall, or a 16:9 ratio, which is 16 units wide and 9 units tall). Stated differently, in landscape mode the longer dimension of the display is oriented substantially horizontally while the shorter dimension is oriented substantially vertically. In portrait mode, by contrast, the display is oriented such that the width of the display is less than the height of the display. Stated differently, in portrait mode the shorter dimension of the display is oriented substantially horizontally while the longer dimension is oriented substantially vertically. A multi-screen display can have one composite display that encompasses all the screens. The composite display can have different display characteristics based on the various orientations of the device.
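Purely as an illustrative aside (not part of the disclosure), the landscape/portrait rule above reduces to a width-versus-height comparison, sketched here in Kotlin:

```kotlin
enum class DisplayOrientation { PORTRAIT, LANDSCAPE }

fun orientationOf(widthPx: Int, heightPx: Int): DisplayOrientation =
    if (widthPx > heightPx) DisplayOrientation.LANDSCAPE   // e.g. 4:3 or 16:9
    else DisplayOrientation.PORTRAIT                        // width less than height

fun main() {
    println(orientationOf(1600, 900))  // LANDSCAPE (16:9)
    println(orientationOf(480, 800))   // PORTRAIT
}
```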
The term "gesture" refers to a user action that expresses an intended idea, action, meaning, result, and/or outcome. The user action can include manipulating a device (e.g., opening or closing a device, changing a device orientation, moving a trackball or wheel, etc.), movement of a body part in relation to the device, movement of an implement or tool in relation to the device, audio inputs, etc. A gesture may be made on a device (such as on the screen) or with the device to interact with the device.
The term "module" as used herein refers to any known or later-developed hardware, software, firmware, artificial intelligence, fuzzy logic, or combination of hardware and software that is capable of performing the functionality associated with that element.
The term "gesture capture" refers to a sense or otherwise a detection of an instance and/or type of user gesture. The gesture capture can occur in one or more areas of the screen. A gesture region can be on the display, where it may be referred to as a touch-sensitive display, or off the display, where it may be referred to as a gesture capture region.
A "multi-screen application" refers to an application that is capable of producing one or more windows that may simultaneously occupy multiple screens. A multi-screen application commonly can operate in single-screen mode, in which one or more windows of the application are displayed on only one screen, or in multi-screen mode, in which one or more windows are displayed simultaneously on multiple screens.
A "single-screen application" refers to an application that is capable of producing one or more windows that may occupy only a single screen at a time.
The terms "screen", "touch screen", or "touchscreen" refer to a physical structure that enables the user to interact with the computer by touching areas on the screen and provides information to the user through a display. The touch screen may sense user contact in a number of different ways, such as by a change in an electrical parameter (e.g., resistance or capacitance), acoustic wave variations, infrared radiation proximity detection, light variation detection, and the like. In a resistive touch screen, for example, normally separated conductive and resistive metallic layers in the screen pass an electrical current. When a user touches the screen, the two layers make contact at the contacted location, whereby a change in electrical field is noted and the coordinates of the contacted location are calculated. In a capacitive touch screen, a capacitive layer stores electrical charge, which is discharged to the user upon contact with the touch screen, causing a decrease in the charge of the capacitive layer. The decrease is measured, and the contacted location coordinates are determined. In a surface acoustic wave touch screen, an acoustic wave is transmitted through the screen, and the acoustic wave is disturbed by user contact. A receiving transducer detects the user contact instance and determines the contacted location coordinates.
The term "window" refers to a, typically rectangular, display image on at least part of a display that contains or provides content different from the rest of the screen. The window may obscure the desktop.
The terms "determine", "calculate", and "compute", and variations thereof, as used herein, are used interchangeably and include any type of methodology, process, mathematical operation, or technique.
It shall be understood that the term "means" as used herein shall be given its broadest possible interpretation in accordance with 35 U.S.C., Section 112, Paragraph 6. Accordingly, a claim incorporating the term "means" shall cover all structures, materials, or acts set forth herein, and all of the equivalents thereof. Further, the structures, materials, or acts and the equivalents thereof shall include all those described in the summary of the invention, brief description of the drawings, detailed description, abstract, and claims themselves.
The preceding is a simplified summary of the disclosure to provide an understanding of some aspects of the disclosure. This summary is neither an extensive nor an exhaustive overview of the disclosure and its various aspects, embodiments, and/or configurations. It is intended neither to identify key or critical elements of the disclosure nor to delineate the scope of the disclosure, but to present selected concepts of the disclosure in a simplified form as an introduction to the more detailed description presented below. As will be appreciated, other aspects, embodiments, and/or configurations of the disclosure are possible, utilizing, alone or in combination, one or more of the features set forth above or described in detail below.
Brief description of the drawings
Fig. 1A includes a first view of an embodiment of a multi-screen user device;
Fig. 1B includes a second view of an embodiment of a multi-screen user device;
Fig. 1C includes a third view of an embodiment of a multi-screen user device;
Fig. 1D includes a fourth view of an embodiment of a multi-screen user device;
Fig. 1E includes a fifth view of an embodiment of a multi-screen user device;
Fig. 1F includes a sixth view of an embodiment of a multi-screen user device;
Fig. 1G includes a seventh view of an embodiment of a multi-screen user device;
Fig. 1H includes an eighth view of an embodiment of a multi-screen user device;
Fig. 1I includes a ninth view of an embodiment of a multi-screen user device;
Fig. 1J includes a tenth view of an embodiment of a multi-screen user device;
Fig. 2 is a block diagram of an embodiment of the hardware of the device;
Fig. 3A is a block diagram of an embodiment of a state model for the device based on the device's orientation and/or configuration;
Fig. 3B is a table of an embodiment of a state model for the device based on the device's orientation and/or configuration;
Fig. 4A is a first representation of an embodiment of a user gesture received at a device;
Fig. 4B is a second representation of an embodiment of a user gesture received at a device;
Fig. 4C is a third representation of an embodiment of a user gesture received at a device;
Fig. 4D is a fourth representation of an embodiment of a user gesture received at a device;
Fig. 4E is a fifth representation of an embodiment of a user gesture received at a device;
Fig. 4F is a sixth representation of an embodiment of a user gesture received at a device;
Fig. 4G is a seventh representation of an embodiment of a user gesture received at a device;
Fig. 4H is an eighth representation of an embodiment of a user gesture received at a device;
Fig. 5A is a block diagram of an embodiment of the device software and/or firmware;
Fig. 5B is a second block diagram of an embodiment of the device software and/or firmware;
Fig. 6A is a first representation of an embodiment of a device configuration generated in response to the device state;
Fig. 6B is a second representation of an embodiment of a device configuration generated in response to the device state;
Fig. 6C is a third representation of an embodiment of a device configuration generated in response to the device state;
Fig. 6D is a fourth representation of an embodiment of a device configuration generated in response to the device state;
Fig. 6E is a fifth representation of an embodiment of a device configuration generated in response to the device state;
Fig. 6F is a sixth representation of an embodiment of a device configuration generated in response to the device state;
Fig. 6G is a seventh representation of an embodiment of a device configuration generated in response to the device state;
Fig. 6H is an eighth representation of an embodiment of a device configuration generated in response to the device state;
Fig. 6I is a ninth representation of an embodiment of a device configuration generated in response to the device state;
Fig. 6J is a tenth representation of an embodiment of a device configuration generated in response to the device state;
Figs. 7A-F are a series of screenshots in a portrait display orientation according to an embodiment;
Figs. 8A-E are a series of screenshots in a landscape display orientation according to an embodiment;
Fig. 9 is a flow chart representing an embodiment;
Fig. 10A is a representation of a logical window stack;
Fig. 10B is another representation of an embodiment of a logical window stack;
Fig. 10C is another representation of an embodiment of a logical window stack;
Fig. 10D is another representation of an embodiment of a logical window stack;
Fig. 10E is another representation of an embodiment of a logical window stack;
Fig. 11 is a block diagram of an embodiment of a logical data structure for a window stack;
Fig. 12 is a flow chart of an embodiment of a method for creating a window stack; and
Fig. 13 depicts a window stacking configuration according to an embodiment.
In the accompanying drawings, similar components and/or features may have the same reference numeral. Further, various components of the same type may be distinguished by a letter following the reference numeral that distinguishes among the similar components. If only the first reference numeral is used in the description, the description is applicable to any one of the similar components having the same first reference numeral irrespective of the second reference numeral.
Detailed description
Presented herein are embodiments of a device. The device can be a communication device, such as a cellular phone, or another smart device. The device can include two screens that are oriented to provide several unique display configurations. Further, the device can receive user input in unique ways. The overall design and functionality of the device provides for an enhanced user experience, making the device more useful and more efficient.
Mechanical features:
Figs. 1A-1J illustrate a device 100 in accordance with embodiments of the present disclosure. As described in greater detail below, the device 100 can be positioned in a number of different ways, each of which provides different functionality to a user. The device 100 is a multi-screen device that includes a primary screen 104 and a secondary screen 108, both of which are touch-sensitive. In embodiments, the entire front surface of screens 104 and 108 may be touch-sensitive and capable of receiving input by a user touching the front surface of screens 104 and 108. Primary screen 104 includes touch-sensitive display 110, which, in addition to being touch-sensitive, also displays information to a user. Secondary screen 108 includes touch-sensitive display 114, which also displays information to a user. In other embodiments, screens 104 and 108 may include more than one display area.
Primary screen 104 also includes a configurable area 112 that has been configured for specific inputs when the user touches portions of the configurable area 112. Secondary screen 108 also includes a configurable area 116 that has been configured for specific inputs. Areas 112a and 116a have been configured to receive a "back" input indicating that a user would like to view information previously displayed. Areas 112b and 116b have been configured to receive a "menu" input indicating that the user would like to view options from a menu. Areas 112c and 116c have been configured to receive a "home" input indicating that the user would like to view information associated with a "home" view. In other embodiments, areas 112a-c and 116a-c may be configured, in addition to the configurations described above, for other types of specific inputs, including controlling features of device 100; some non-limiting examples include adjusting overall system power, adjusting the volume, adjusting the brightness, adjusting the vibration, selecting displayed items (on either of screens 104 or 108), operating a camera, operating a microphone, and initiating/terminating telephone calls. Also, in some embodiments, areas 112a-c and 116a-c may be configured for specific inputs depending upon the application running on device 100 and/or the information displayed on touch-sensitive displays 110 and/or 114.
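As a hedged illustration only (the region and input names below are hypothetical and not taken from the disclosure), the fixed mapping just described can be captured in a few lines of Kotlin:

```kotlin
enum class ConfigurableArea { AREA_A, AREA_B, AREA_C }   // e.g. 112a/116a, 112b/116b, 112c/116c
enum class SystemInput { BACK, MENU, HOME }

fun inputFor(area: ConfigurableArea): SystemInput = when (area) {
    ConfigurableArea.AREA_A -> SystemInput.BACK   // view previously displayed information
    ConfigurableArea.AREA_B -> SystemInput.MENU   // view options from a menu
    ConfigurableArea.AREA_C -> SystemInput.HOME   // view the "home" view
}
```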
In addition to touch sensing, primary screen 104 and secondary screen 108 may also include areas that receive input from a user without requiring the user to touch the display area of the screen. For example, primary screen 104 includes gesture capture area 120, and secondary screen 108 includes gesture capture area 124. These areas are able to receive input by recognizing gestures made by a user, without the need for the user to actually touch the surface of the display area. In comparison to touch-sensitive displays 110 and 114, the gesture capture areas 120 and 124 are commonly not capable of rendering a display image.
The two screens 104 and 108 are connected together by a hinge 128, shown clearly in Fig. 1C (illustrating a back view of device 100). In the embodiment shown in Figs. 1A-1J, hinge 128 is a center hinge that connects screens 104 and 108 so that when the hinge is closed, screens 104 and 108 are juxtaposed (i.e., side-by-side), as shown in Fig. 1B (illustrating a front view of device 100). Hinge 128 can be opened to position the two screens 104 and 108 in different relative positions with respect to each other. As described in greater detail below, the device 100 may have different functionalities depending on the relative positions of screens 104 and 108.
Fig. 1D illustrates the right side of device 100. As shown in Fig. 1D, secondary screen 108 also includes a card slot 132 and a port 136 on its side. In embodiments, card slot 132 accommodates different types of cards, including a subscriber identity module (SIM). In embodiments, port 136 is an input/output port (I/O port) that allows the device 100 to be connected to other peripheral devices, such as a display, keyboard, or printing device. As can be appreciated, these are merely some examples, and in other embodiments device 100 may include other slots and ports, such as slots and ports for accommodating additional memory devices and/or for connecting other peripheral devices. Also shown in Fig. 1D is an audio jack 140 that accommodates, for example, a tip, ring, sleeve (TRS) connector to allow a user to utilize headphones or a headset.
Device 100 also includes a number of buttons 158. For example, Fig. 1E illustrates the left side of device 100. As shown in Fig. 1E, the side of primary screen 104 includes three buttons 144, 148, and 152, which can be configured for specific inputs. For example, buttons 144, 148, and 152 may be configured, in combination or alone, to control a number of aspects of device 100. Some non-limiting examples include overall system power, volume, brightness, vibration, selection of displayed items (on either of screens 104 or 108), a camera, a microphone, and initiation/termination of telephone calls. In some embodiments, instead of separate buttons, two buttons may be combined into a rocker button. This arrangement is useful in situations where the buttons are configured to control features such as volume or brightness. In addition to buttons 144, 148, and 152, device 100 also includes a button 156, shown in Fig. 1F, which illustrates the top of device 100. In one embodiment, button 156 is configured as an on/off button used to control overall system power to device 100. In other embodiments, button 156 is configured, in addition to or instead of controlling system power, to control other aspects of device 100. In some embodiments, one or more of the buttons 144, 148, 152, and 156 are capable of supporting different user commands. By way of example, a normal press has a duration commonly of less than about 1 second and resembles a quick tap. A medium press has a duration commonly of more than 1 second but less than about 12 seconds. A long press has a duration commonly of about 12 seconds or more. The function of the buttons is normally specific to the application currently in focus on the respective display 110 or 114. In a telephony application, for instance, and depending on the particular button, a normal, medium, or long press can mean end call, increase call volume, decrease call volume, or toggle microphone mute. In a camera or video application, for instance, and depending on the particular button, a normal, medium, or long press can mean increase zoom, decrease zoom, or take a photograph or record video.
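A minimal Kotlin sketch of the press-duration classification just described, using the approximate thresholds from the text (about 1 second and about 12 seconds); the function name and exact cutoff values are assumptions for illustration:

```kotlin
enum class PressType { NORMAL, MEDIUM, LONG }

fun classifyPress(durationMs: Long): PressType = when {
    durationMs < 1_000  -> PressType.NORMAL  // like a quick tap
    durationMs < 12_000 -> PressType.MEDIUM
    else                -> PressType.LONG
}

fun main() {
    println(classifyPress(300))     // NORMAL
    println(classifyPress(5_000))   // MEDIUM
    println(classifyPress(15_000))  // LONG
}
```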
There are also a number of hardware components within device 100. As illustrated in Fig. 1C, device 100 includes a speaker 160 and a microphone 164. Device 100 also includes a camera 168 (Fig. 1B). Additionally, device 100 includes two position sensors 172A and 172B, which are used to determine the relative positions of screens 104 and 108. In one embodiment, position sensors 172A and 172B are Hall effect sensors. However, in other embodiments, other sensors can be used in addition to or instead of the Hall effect sensors. An accelerometer 176 may also be included as part of device 100 to determine the orientation of the device 100 and/or the orientation of screens 104 and 108. Additional internal hardware components that may be included in device 100 are described below with respect to Fig. 2.
The overall design of device 100 allows it to provide additional functionality not available in other communication devices. Some of the functionality is based on the various positions and orientations that device 100 can have. As shown in Figs. 1B-1G, the device 100 can be operated in an "open" position where screens 104 and 108 are juxtaposed. This position allows a large display area for displaying information to a user. When position sensors 172A and 172B determine that device 100 is in the open position, they can generate a signal that can be used to trigger different events, such as displaying information on both screens 104 and 108. Additional events may be triggered if accelerometer 176 determines that device 100 is in a portrait position (Fig. 1B) as opposed to a landscape position (not shown).
In addition to the open position, device 100 may also have a "closed" position, illustrated in Fig. 1H. Again, position sensors 172A and 172B can generate a signal indicating that device 100 is in the "closed" position. This can trigger an event that results in a change of the information displayed on screens 104 and/or 108. For example, device 100 may be programmed to stop displaying information on one of the screens, e.g., screen 108, since a user can only view one screen at a time when device 100 is in the "closed" position. In other embodiments, the signal generated by position sensors 172A and 172B indicating that the device 100 is in the "closed" position can trigger device 100 to answer an incoming telephone call. The "closed" position can also be a preferred position for using the device 100 as a mobile phone.
Device 100 can also be used in an "easel" position, illustrated in Fig. 1I. In the "easel" position, screens 104 and 108 are angled with respect to each other and facing outward, with the edges of screens 104 and 108 substantially horizontal. In this position, device 100 can be configured to display information on both screens 104 and 108 to allow two users to simultaneously interact with device 100. When device 100 is in the "easel" position, sensors 172A and 172B generate a signal indicating that screens 104 and 108 are positioned at an angle to each other, and the accelerometer 176 can generate a signal indicating that device 100 has been placed so that the edges of screens 104 and 108 are substantially horizontal. The signals can then be used in combination to generate events that trigger changes in the display of information on screens 104 and 108.
Fig. 1J illustrates device 100 in a "modified easel" position. In the "modified easel" position, one of screens 104 or 108 is used as a stand and is faced down on the surface of an object such as a table. This position provides a convenient way to display information to a user in a landscape orientation. Similar to the easel position, when device 100 is in the "modified easel" position, position sensors 172A and 172B generate a signal indicating that screens 104 and 108 are positioned at an angle to each other. The accelerometer 176 would generate a signal indicating that device 100 has been positioned so that one of screens 104 and 108 is faced downwardly and is substantially horizontal. The signals can then be used to generate events that trigger changes in the display of information on screens 104 and 108. For example, information may no longer be displayed on the face-down screen, since a user cannot see that screen.
Transitional states are also possible. When the position sensors 172A and B and/or the accelerometer indicate that the screens are being closed or folded (from open), a closing transitional state is recognized. Conversely, when the position sensors 172A and B indicate that the screens are being opened or folded (from closed), an opening transitional state is recognized. The closing and opening transitional states are typically time-based, or have a maximum duration from a sensed starting point. Normally, no user input is possible while one of the closing or opening states is in effect. In this manner, incidental user contact with a screen during the closing or opening function is not misinterpreted as user input. In embodiments, another transitional state is possible when the device 100 is shut off. This additional transitional state allows the display to switch from one screen 104 to the second screen 108 when the device 100 is shut off based on some user input, e.g., a double tap on the screen 110, 114.
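The following Kotlin sketch, under the assumption of a simplified sensor snapshot (the types are hypothetical), shows how the position sensor 172A/172B and accelerometer 176 readings described above could be reduced to one of the open, closed, easel, or modified easel states:

```kotlin
enum class PhysicalState { CLOSED, OPEN, EASEL, MODIFIED_EASEL }

data class SensorSnapshot(
    val screensFolded: Boolean,      // position sensors 172A/172B: screens back-to-back
    val screensAtAngle: Boolean,     // position sensors: screens held at an angle
    val oneScreenFaceDown: Boolean   // accelerometer 176: one screen resting on a surface
)

fun stateOf(s: SensorSnapshot): PhysicalState = when {
    s.screensFolded                         -> PhysicalState.CLOSED
    s.screensAtAngle && s.oneScreenFaceDown -> PhysicalState.MODIFIED_EASEL
    s.screensAtAngle                        -> PhysicalState.EASEL
    else                                    -> PhysicalState.OPEN
}
```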
As can be appreciated, the description of device 100 is made for illustrative purposes only, and the embodiments are not limited to the specific mechanical features shown in Figs. 1A-1J and described above. In other embodiments, device 100 may include additional features, including one or more additional buttons, slots, display areas, hinges, and/or locking mechanisms. Additionally, in embodiments, the features described above may be located in different parts of device 100 and still provide similar functionality. Therefore, Figs. 1A-1J and the description provided above are non-limiting.
Hardware features:
Fig. 2 illustrates components of a device 100 in accordance with embodiments of the present disclosure. In general, the device 100 includes a primary screen 104 and a secondary screen 108. While the primary screen 104 and its components are normally enabled in both the opened and closed positions or states, the secondary screen 108 and its components are normally enabled in the opened state but disabled in the closed state. However, even when in the closed state, a user- or application-triggered interrupt (for example, in response to a phone application or camera application operation) can, by a suitable command, flip the active screen, or disable the primary screen 104 and enable the secondary screen 108. Each screen 104, 108 can be touch-sensitive and can include different operative areas. For example, a first operative area within each touch-sensitive screen 104 and 108 may comprise a touch-sensitive display 110, 114. In general, the touch-sensitive display 110, 114 may comprise a full-color touch-sensitive display. A second area within each touch-sensitive screen 104 and 108 may comprise a gesture capture region 120, 124. The gesture capture region 120, 124 may comprise an area or region that is outside the touch-sensitive display 110, 114 area and that is capable of receiving input, for example in the form of gestures provided by a user. However, the gesture capture region 120, 124 does not include pixels that can perform a display function or capability.
A third region of the touch-sensitive screens 104 and 108 may comprise a configurable area 112, 116. The configurable area 112, 116 is capable of receiving input and has display or limited display capabilities. In embodiments, the configurable area 112, 116 may present different input options to the user. For example, the configurable area 112, 116 may display buttons or other related items. Moreover, the identity of displayed buttons, or whether any buttons are displayed at all within the configurable area 112, 116 of a touch-sensitive screen 104 or 108, may be determined from the context in which the device 100 is used and/or operated. In an exemplary embodiment, the touch-sensitive screens 104 and 108 comprise liquid crystal display devices extending across at least those regions of the touch-sensitive screens 104 and 108 that are capable of providing visual output to a user, and a capacitive input matrix over those regions of the touch-sensitive screens 104 and 108 that are capable of receiving input from the user.
One or more display controllers 216a, 216b may be provided for controlling the operation of the touch-sensitive screens 104 and 108, including input (touch sensing) and output (display) functions. In the exemplary embodiment illustrated in Fig. 2, a separate touch screen controller 216a or 216b is provided for each touch screen 104 and 108. In accordance with alternative embodiments, a common or shared touch screen controller 216 may be used to control each of the included touch-sensitive screens 104 and 108. In accordance with still other embodiments, the functions of a touch screen controller 216 may be incorporated into other components, such as a processor 204.
The processor 204 may comprise a general-purpose programmable processor or controller for executing application programming or instructions. In accordance with at least some embodiments, the processor 204 may include multiple processor cores and/or implement multiple virtual processors. In accordance with still other embodiments, the processor 204 may include multiple physical processors. As a particular example, the processor 204 may comprise a specially configured application-specific integrated circuit (ASIC) or other integrated circuit, a digital signal processor (DSP) controller, a hardwired electronic or logic circuit, a programmable logic device or gate array, a special-purpose computer, or the like. The processor 204 generally functions to run programming code or instructions implementing various functions of the device 100.
The communication device 100 may also include memory 208 for use in connection with the execution of application programming or instructions by the processor 204, and for the temporary or long-term storage of program instructions and/or data. As examples, the memory 208 may comprise RAM, DRAM, SDRAM, or other solid-state memory. Alternatively or in addition, data storage 212 may be provided. Like the memory 208, the data storage 212 may comprise one or more solid-state memory devices. Alternatively or in addition, the data storage 212 may comprise a hard disk drive or other random access memory.
In support of communication functions or capabilities, the device 100 can include a cellular telephony module 228. As examples, the cellular telephony module 228 can comprise a GSM, CDMA, FDMA, and/or analog cellular telephony transceiver capable of supporting voice, multimedia, and/or data transfers over a cellular network. Alternatively or in addition, the device 100 can include an additional or other wireless communications module 232. As examples, the other wireless communications module 232 can comprise a Wi-Fi, BLUETOOTH™, WiMax, infrared, or other wireless communications link. The cellular telephony module 228 and the other wireless communications module 232 can each be associated with a shared or a dedicated antenna 224.
A port interface 252 may be included. The port interface 252 may include proprietary or universal ports to support the interconnection of the device 100 with other devices or components, such as a dock, which may or may not include additional or different capabilities from those integral to the device 100. In addition to supporting an exchange of communication signals between the device 100 and another device or component, the docking port 136 and/or port interface 252 can support the supply of power to or from the device 100. The port interface 252 also comprises an intelligent element that includes a docking module for controlling communications or other interactions between the device 100 and a connected device or component.
An input/output module 248 and associated ports may be included to support communications over wired networks or links, for example with other communication devices, server devices, and/or peripheral devices. Examples of the input/output module 248 include an Ethernet port, a Universal Serial Bus (USB) port, an Institute of Electrical and Electronics Engineers (IEEE) 1394 interface, or other interface.
An audio input/output interface/device 244 can be included to provide analog audio to an interconnected speaker or other device, and to receive analog audio input from a connected microphone or other device. As an example, the audio input/output interface/device 244 may comprise an associated amplifier and analog-to-digital converter. Alternatively or in addition, the device 100 can include an integrated audio input/output device 256 and/or an audio jack for interconnecting an external speaker or microphone. For example, an integrated speaker and an integrated microphone can be provided to support near-talk or speakerphone operation.
Hardware buttons 158 can be included, for example, for use in connection with certain control operations. Examples include a master power switch, volume control, etc., as described in conjunction with Figs. 1A through 1J. One or more image capture interfaces/devices 240, such as a camera, can be included for capturing still and/or video images. Alternatively or in addition, an image capture interface/device 240 can include a scanner or code reader. An image capture interface/device 240 can include or be associated with additional elements, such as a flash or other light source.
The device 100 can also include a global positioning system (GPS) receiver 236. In accordance with embodiments of the present invention, the GPS receiver 236 may further comprise a GPS module that is capable of providing absolute location information to other components of the device 100. An accelerometer 176 may also be included. For example, in connection with the display of information to a user and/or other functions, a signal from the accelerometer 176 can be used to determine an orientation and/or format in which to display that information to the user.
Embodiments of the present invention can also include one or more position sensors 172. The position sensor 172 can provide a signal indicating the position of the touch-sensitive screens 104 and 108 relative to one another. This information can be provided as an input, for example, to a user interface application, to determine an operating mode, characteristics of the touch-sensitive displays 110, 114, and/or other device 100 operations. As examples, a screen position sensor 172 can comprise a series of Hall effect sensors, a multiple-position switch, an optical switch, a Wheatstone bridge, a potentiometer, or other arrangement capable of providing a signal indicating multiple relative positions in which the touch screens are placed.
Communications between various components of the device 100 can be carried by one or more buses 222. In addition, power can be supplied to the components of the device 100 from a power source and/or power control module 260. The power control module 260 can, for example, include a battery, an AC-to-DC converter, power control logic, and/or ports for interconnecting the device 100 to an external source of power.
Device state:
Figs. 3A and 3B represent illustrative states of device 100. While a number of illustrative states are shown, along with transitions from a first state to a second state, it is to be appreciated that the illustrative state diagram may not encompass all possible states and/or all possible transitions from a first state to a second state. As illustrated in Fig. 3A, the various arrows between the states (illustrated by the state represented in the circle) represent a physical change that occurs to the device 100, that is detected by one or more of hardware and software, the detection triggering one or more hardware and/or software interrupts that are used to control and/or manage one or more functions of device 100.
As illustrated in Fig. 3A, there are twelve exemplary "physical" states: closed 304, transition 308 (or opening transitional state), easel 312, modified easel 316, open 320, inbound/outbound call or communication 324, image/video capture 328, transition 332 (or closing transitional state), landscape 340, docked 336, docked 344, and landscape 348. Next to each illustrative state is a representation of the physical state of the device 100, with the exception of states 324 and 328, where the state is generally symbolized by the international icon for a telephone and the icon for a camera, respectively.
In state 304, the device is in a closed state with the device 100 generally oriented in the portrait direction, with the primary screen 104 and the secondary screen 108 back-to-back in different planes (see Fig. 1H). From the closed state, the device 100 can enter, for example, the docked state 336, where the device 100 is coupled with a docking station or docking cable, or is generally docked or associated with one or more other devices or peripherals, or the landscape state 340, where the device 100 is generally oriented with the primary screen 104 facing the user and the primary screen 104 and the secondary screen 108 back-to-back.
In the closed state, the device can also move to a transitional state where the device remains closed but the display is moved from one screen 104 to another screen 108 based on a user input, e.g., a double tap on the screen 110, 114. Still another embodiment includes a bilateral state. In the bilateral state, the device remains closed, but a single application displays at least one window on both the first display 110 and the second display 114. The windows shown on the first and second display 110, 114 may be the same or different, based on the application and the state of that application. For example, while acquiring an image with a camera, the device may display the viewfinder on the first display 110 and a preview for the photo subjects (full screen and mirrored left-to-right) on the second display 114.
State 308 is a transitional state from the closed state 304 to the semi-open state or easel state 312, in which the device 100 is shown opening, with the primary screen 104 and the secondary screen 108 being rotated around an axis coincident with the hinge. Upon entering the easel state 312, the primary screen 104 and the secondary screen 108 are separated from one another such that, for example, the device 100 can sit in an easel-like structure on a surface.
In a state known as the modified easel position 316, the device 100 has similar relative relationships between the primary screen 104 and the secondary screen 108 as in the easel state 312, with the difference that one of the primary screen 104 or the secondary screen 108 is placed on a surface, as shown.
State 320 is the open state, in which the primary screen 104 and the secondary screen 108 are generally on the same plane. From the open state, the device 100 can transition to the docked state 344 or the open landscape state 348. In the open state 320, the primary screen 104 and the secondary screen 108 are generally in a similar portrait-like orientation, while in the landscape state 348 the primary screen 104 and the secondary screen 108 are generally in a similar landscape-like orientation.
State 324 is illustrative of a communication state, such as when an inbound or outbound call is being received or placed, respectively, by the device 100. While not illustrated for clarity, it should be appreciated that the device 100 can transition to the inbound/outbound call state 324 from any state illustrated in Fig. 3. In a similar manner, the image/video capture state 328 can be entered from any other state in Fig. 3, the image/video capture state 328 allowing the device 100 to take one or more images via a camera and/or videos with a video capture device 240.
Transition state 332 illustratively shows the primary screen 104 and the secondary screen 108 being closed upon one another for entry into, for example, the closed state 304.
With reference to the key, Fig. 3B illustrates the inputs that are received to detect a transition from a first state to a second state. In Fig. 3B, various combinations of states are shown with, in general, a portion of the columns directed toward a portrait state 352 and a landscape state 356, and a portion of the rows directed toward a portrait state 360 and a landscape state 364.
In Fig. 3B, the key indicates that "H" represents an input from one or more Hall effect sensors, "A" represents an input from one or more accelerometers, "T" represents an input from a timer, "P" represents a communications trigger input, and "I" represents an image and/or video capture request input. Thus, in the center portion 376 of the chart, an input, or combination of inputs, is shown that represents how the device 100 detects a transition from a first physical state to a second physical state.
As discussed, in the center portion of the chart 376, the inputs that are received enable the detection of a transition from, for example, a portrait open state to a landscape easel state, shown in bold as "HAT". For this exemplary transition from the portrait open state to the landscape easel state, a Hall effect sensor ("H"), an accelerometer ("A"), and a timer ("T") input may be needed. The timer input can be derived, for example, from a clock associated with the processor.
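As an illustrative sketch only (the structure and names are assumptions, not the state machine of Fig. 3B), the "HAT" combination can be read as a conjunction of the three inputs:

```kotlin
data class TransitionInputs(
    val hallEffect: Boolean,     // "H": relative screen position changed
    val accelerometer: Boolean,  // "A": device orientation changed
    val timer: Boolean           // "T": timer input (e.g. a debounce interval elapsed)
)

// Portrait-open to landscape-easel is recognized only when H, A and T are all present.
fun portraitOpenToLandscapeEasel(i: TransitionInputs): Boolean =
    i.hallEffect && i.accelerometer && i.timer
```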
In addition to the portrait and landscape states, a docked state 368 is also shown that is triggered based on the receipt of a docking signal 372. As discussed above and in relation to Fig. 3, the docking signal can be triggered by the association of the device 100 with one or more other devices 100, accessories, peripherals, smart docks, or the like.
User interaction:
Figs. 4A through 4H depict various graphical representations of gesture inputs that may be recognized by the screens 104, 108. The gestures may be performed not only by a user's body part, such as a finger, but also by other devices, such as a stylus, that may be sensed by the contact-sensing portion(s) of a screen 104, 108. In general, gestures are interpreted differently based on where the gestures are performed (either directly on the display 110, 114 or in the gesture capture region 120, 124). For example, gestures in the display 110, 114 may be directed to a desktop or application, and gestures in the gesture capture region 120, 124 may be interpreted as being for the system.
With reference to Figs. 4A-4H, a first type of gesture, a touch gesture 420, is substantially stationary on the screen 104, 108 for a selected length of time. A circle 428 represents a touch or other contact type received at a particular location of a contact-sensing portion of the screen. The circle 428 may include a border 432, the thickness of which indicates the length of time that the contact is held substantially stationary at the contact location. For instance, a tap 420 (or short press) has a thinner border 432a than the border 432b for a long press 424 (or normal press). The long press 424 may involve a contact that remains substantially stationary on the screen for a longer time period than that of a tap 420. As will be appreciated, differently defined gestures may be registered depending upon the length of time that the touch remains stationary prior to contact cessation or movement on the screen.
With reference to Fig. 4C, a drag gesture 400 on the screen 104, 108 is an initial contact (represented by circle 428) with contact movement 436 in a selected direction. The initial contact 428 may remain stationary on the screen 104, 108 for a certain amount of time, represented by the border 432. The drag gesture typically requires the user to contact an icon, window, or other display image at a first location, followed by movement of the contact in a drag direction to a new second location desired for the selected display image. The contact movement need not be in a straight line but may have any path of movement, so long as the contact is substantially continuous from the first to the second location.
With reference to Fig. 4D, a flick gesture 404 on the screen 104, 108 is an initial contact (represented by circle 428) with truncated contact movement 436 (relative to a drag gesture) in a selected direction. In embodiments, compared to a drag gesture, a flick has a higher exit velocity for the last movement in the gesture. The flick gesture can, for instance, be a finger snap following the initial contact. Compared to a drag gesture, a flick gesture generally does not require continuous contact with the screen 104, 108 from the first location of a display image to a predetermined second location. The contacted display image is moved by the flick gesture in the direction of the flick gesture to the predetermined second location. Although both gestures commonly can move a display image from a first location to a second location, the temporal duration and distance of travel of the contact on the screen are generally less for a flick than for a drag gesture.
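A hedged Kotlin sketch of the drag/flick distinction described above, assuming a hypothetical ContactTrack record and an arbitrary velocity threshold (the disclosure does not specify one):

```kotlin
enum class MoveGesture { DRAG, FLICK }

data class ContactTrack(
    val continuousToTarget: Boolean,   // contact held all the way to the desired location
    val exitVelocityPxPerSec: Float    // speed of the final movement
)

fun classifyMove(track: ContactTrack, flickThreshold: Float = 1_000f): MoveGesture =
    if (!track.continuousToTarget && track.exitVelocityPxPerSec >= flickThreshold)
        MoveGesture.FLICK   // truncated movement with a higher exit velocity
    else
        MoveGesture.DRAG    // substantially continuous contact from first to second location
```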
With reference to Fig. 4 E, contraction (pinch) gesture 408 on screen 104,108 is described.Shrinking gesture 408 can originate In 428a being contacted to the first of screen 104,108 by such as the first finger and by such as second finger to screen 104,108 Second contact 428b.Can be different by the shared contact sensing part of shared screen 104,108, shared screen 104 or 108 The first and second contact 428a, b are detected in part or the different touch-sensing parts of different screen.First contact 428a Kept for the time of the first quantity, as represented by the 432a of border, and the second contact 428b keeps second time, such as border Represented by 432b.In general, the time of the first and second quantity is actually identical, and in general, the first and second contacts 428a, b actually occur at the same time.First and second contact 428a, b usually respectively further comprise corresponding first and second contacts shifting Dynamic 436a, b.First and second contacts mobile 436a, b are generally in opposite direction.In other words, the mobile 436a of the first contact 436b is contacted towards second, and the mobile 436b of the second contact contacts 436a towards first.Say simpler again, can be by user Finger touch screen 104,108 in pinching activity and complete to shrink gesture 408.
With reference to Fig. 4F, a spread gesture 410 on the screen 104, 108 is depicted. The spread gesture 410 may be initiated by a first contact 428a to the screen 104, 108 by, for example, a first finger, and a second contact 428b to the screen 104, 108 by, for example, a second finger. The first and second contacts 428a,b may be detected by a common contact-sensing portion of a common screen 104, 108, by different contact-sensing portions of a common screen 104, 108, or by different contact-sensing portions of different screens. The first contact 428a is held for a first amount of time, as represented by the border 432a, and the second contact 428b is held for a second amount of time, as represented by the border 432b. The first and second amounts of time are generally substantially the same, and the first and second contacts 428a,b generally occur substantially simultaneously. The first and second contacts 428a,b generally also include corresponding first and second contact movements 436a,b, respectively. The first and second contact movements 436a,b are generally in a common direction. Stated another way, the first and second contact movements 436a,b are away from the first and second contacts 428a,b. More simply stated, the spread gesture 410 may be accomplished by a user's digits touching the screen 104, 108 in a spreading motion.
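For illustration only, the following sketch distinguishes the pinch 408 from the spread 410 by whether the distance between the two contacts shrinks or grows during the contact movements 436a,b. The class name and the dead-zone value are assumptions introduced for this sketch.

    // Illustrative two-finger classifier; a negative change in span means the
    // contacts converge (pinch), a positive change means they diverge (spread).
    public final class TwoFingerClassifier {

        public enum Kind { PINCH, SPREAD, NONE }

        private static final double DEAD_ZONE_PX = 20;  // hypothetical tolerance

        public Kind classify(float startX1, float startY1, float startX2, float startY2,
                             float endX1, float endY1, float endX2, float endY2) {
            double startSpan = Math.hypot(startX2 - startX1, startY2 - startY1);
            double endSpan = Math.hypot(endX2 - endX1, endY2 - endY1);
            double delta = endSpan - startSpan;
            if (Math.abs(delta) < DEAD_ZONE_PX) {
                return Kind.NONE;
            }
            return delta < 0 ? Kind.PINCH : Kind.SPREAD;
        }
    }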
The above gestures may be combined in any manner, such as those shown in Figs. 4G and 4H, to produce a determined functional result. For example, in Fig. 4G, a tap gesture 420 is combined with a drag or flick gesture 412 in a direction away from the tap gesture 420. In Fig. 4H, a tap gesture 420 is combined with a drag or flick gesture 412 in a direction towards the tap gesture 420.
The functional result of receiving a gesture can vary depending on a number of factors, including the state of the device 100, the display 110, 114, or the screen 104, 108, the context associated with the gesture, or the sensed location of the gesture. The state of the device commonly refers to one or more of the configuration of the device 100, the display orientation, and other inputs received by the device 100 from the user. Context commonly refers to one or more of the particular application(s) selected by the gesture and the portion(s) of the application currently executing, whether the application is a single-screen or multi-screen application, and whether the application is a multi-screen application displaying one or more windows in one or more screens or in one or more stacks. The sensed location of the gesture commonly refers to whether the sensed set(s) of gesture location coordinates are on a touch-sensitive display 110, 114 or a gesture capture region 120, 124, whether the sensed set(s) of gesture location coordinates are associated with a common or different display or screen 104, 108, and/or what portion of the gesture capture region contains the sensed set(s) of gesture location coordinates.
A tap, when received by a touch-sensitive display 110, 114, can be used, for instance, to select an icon to initiate or terminate execution of a corresponding application, to maximize or minimize a window, to reorder windows in a stack, and to provide user input such as by a keyboard display or other displayed image. A drag, when received by a touch-sensitive display 110, 114, can be used, for instance, to relocate an icon or window to a desired location within a display, to reorder a stack on a display, or to span both displays (such that the selected window occupies a portion of each display simultaneously). A flick, when received by a touch-sensitive display 110, 114 or a gesture capture region 120, 124, can be used to relocate a window from a first display to a second display or to span both displays (such that the selected window occupies a portion of each display simultaneously). Unlike the drag gesture, however, the flick gesture is generally not used to move the displayed image to a specific user-selected location but to a default location that is not configurable by the user.
The pinch gesture, when received by a touch-sensitive display 110, 114 or a gesture capture region 120, 124, can be used to maximize or otherwise increase the displayed area or size of a window (typically when received entirely by a common display), to switch the window displayed at the top of the stack on each display to the top of the stack of the other display (typically when received by different displays or screens), or to display an application manager (a "pop-up window" that displays the windows in the stacks). The spread gesture, when received by a touch-sensitive display 110, 114 or a gesture capture region 120, 124, can be used to minimize or otherwise decrease the displayed area or size of a window, to switch the window displayed at the top of the stack on each display to the top of the stack of the other display (typically when received by different displays or screens), or to display an application manager (typically when received by an off-screen gesture capture region on the same or different screens).
The combined gestures of Fig. 4G, when received by a common display capture region in a common display or screen 104, 108, can be used to hold a first window stack location in a first stack constant for the display receiving the gesture while reordering a second window stack location in a second window stack to include a window in the display receiving the gesture. The combined gestures of Fig. 4H, when received by different display capture regions in a common display or screen 104, 108 or in different displays or screens, can be used to hold a first window stack location in a first window stack constant for the display receiving the tap part of the gesture while reordering a second window stack location in a second window stack to include a window in the display receiving the flick or drag gesture. Although specific gestures and gesture capture regions in the preceding examples have been associated with corresponding sets of functional results, it is to be appreciated that these associations can be redefined in any manner to produce differing associations between gestures and/or gesture capture regions and/or functional results.
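Solely for illustration of the mapping just described, the following sketch dispatches a received gesture, together with where it was sensed, to one of the functional results listed above. The enum values, method names, and result strings are assumptions of this sketch, and the associations shown are only one of the many redefinable mappings noted above.

    // Illustrative dispatch table; gesture names mirror Figs. 4A-4H, and the
    // returned strings summarize the functional results described above.
    public final class GestureDispatcher {

        public enum Gesture { TAP, DRAG, FLICK, PINCH, SPREAD }
        public enum Source { TOUCH_SENSITIVE_DISPLAY, GESTURE_CAPTURE_REGION }

        public String dispatch(Gesture gesture, Source source) {
            switch (gesture) {
                case TAP:
                    return "select icon / maximize or minimize window / reorder stack";
                case DRAG:
                    return "reposition window within a display or span both displays";
                case FLICK:
                    return "move window to a default location on the other display";
                case PINCH:
                    return source == Source.GESTURE_CAPTURE_REGION
                            ? "swap stack tops or show application manager"
                            : "maximize or enlarge the displayed area of a window";
                case SPREAD:
                    return "minimize or shrink the displayed area of a window";
                default:
                    return "no-op";
            }
        }
    }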
Firmware and software:
The memory 508 may store, and the processor 504 may execute, one or more software components. These components can include at least one operating system (OS) 516a and/or 516b, a framework 520, and/or one or more applications 564a and/or 564b from an application store 560. The processor 504 may receive inputs from the drivers 512, as previously described in conjunction with Fig. 2. The OS 516 can be any software, consisting of programs and data, that manages computer hardware resources and provides common services for the execution of various applications 564. The OS 516 can be any operating system and, at least in some embodiments, is dedicated to mobile devices, including, but not limited to, Linux, ANDROID™, iPhone OS (IOS™), WINDOWS PHONE 7™, etc. The OS 516 is operable to provide functionality to the phone by executing one or more operations, as described herein.
The applications 564 can be any higher level software that performs particular functionality for the user. The applications 564 can include programs such as email client applications, web browsing applications, texting applications, games, media players, office suites, etc. The applications 564 can be stored in an application store 560, which may represent any memory or data storage, and the management software associated therewith, for storing the applications 564. Once executed, the applications 564 may run in a different area of the memory 508.
The framework 520 can be any software or data that allows the multiple tasks running on the device to interact. In embodiments, at least portions of the framework 520 and the discrete components described hereinafter may be considered part of the OS 516 or of an application 564. However, these portions will be described as part of the framework 520, although those components are not so limited. The framework 520 can include, but is not limited to, a multi-display management (MDM) module 524, a surface cache module 528, a window management module 532, an input management module 536, a task management module 540, a display controller 544, one or more frame buffers 548, a task stack 552, one or more window stacks 550 (which is a logical arrangement of windows and/or desktops in a display area), and/or an events buffer 556.
The MDM module 524 includes one or more modules operable to manage the display of applications or other data on the screens of the device. An embodiment of the MDM module 524 is described in conjunction with Fig. 5B. In embodiments, the MDM module 524 receives inputs from the OS 516, the drivers 512, and the applications 564. The inputs assist the MDM module 524 in determining how to configure and allocate the displays according to the applications' preferences and requirements and the user's actions. Once a determination of the display configurations is made, the MDM module 524 can bind the applications 564 to a display configuration. The configuration may then be provided to one or more other components to generate the display.
The surface cache module 528 includes any memory or storage, and the software associated therewith, to store or cache one or more images from the display screens. Each display screen may have associated with it a series of active and non-active windows (or other display objects, such as a desktop display). The active window (or other display object) is the window currently being displayed. A non-active window (or other display object) was opened and/or displayed at some time but is now located "behind" the active window (or other display object). To enhance the user experience, before a window (or other display object) is covered by another active window (or other display object), a "screen shot" of the last generated image of the window (or other display object) can be stored. The surface cache module 528 may be operable to store the last active image of a window (or other display object) not currently displayed. Thus, the surface cache module 528 stores the images of non-active windows (or other display objects) in a data store (not shown).
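As a non-limiting illustration of the caching behaviour attributed to the surface cache module 528, a minimal sketch follows. The class, map type, and byte-array representation of a screen shot are assumptions of this sketch.

    import java.util.HashMap;
    import java.util.Map;

    // Illustrative surface cache: keep the last rendered image of a window so
    // it can be presented while the window is non-active.
    public final class SurfaceCache {

        private final Map<Integer, byte[]> lastImageByWindowId = new HashMap<>();

        // Called just before a window transitions behind another active window.
        public void snapshot(int windowId, byte[] renderedPixels) {
            lastImageByWindowId.put(windowId, renderedPixels);
        }

        // Returns the cached "screen shot" of a non-active window, if any.
        public byte[] lastImageOf(int windowId) {
            return lastImageByWindowId.get(windowId);
        }
    }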
In embodiments, the window management module 532 is operable to manage the active and non-active windows (or other display objects) on each of the screens. Based on information from the MDM module 524, the OS 516, or other components, the window management module 532 determines when a window (or other display object) is active or non-active. The window management module 532 then places a non-visible window (or other display object) in a "non-active state" and, in conjunction with the task management module 540, suspends the application's operation. Further, the window management module 532 may assign a screen identifier to the window (or other display object) or manage one or more other items of data associated with the window (or other display object). The window management module 532 may also provide stored information to the applications 564, the task management module 540, or other components interacting with or associated with the window (or other display object).
The input management module 536 is operable to manage events that occur with the device. An event is any input into the window environment, for example, a user interface interaction with a user. The input management module 536 receives the events and logically stores the events in an events buffer 556. Events can include such user interface interactions as a "down event," which occurs when a screen 104, 108 receives a touch signal from a user, a "move event," which occurs when the screen 104, 108 determines that a user's finger is moving across the screen, an "up event," which occurs when the screen 104, 108 determines that the user has stopped touching the screen 104, 108, etc. These events are received, stored, and forwarded to other modules by the input management module 536.
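The following minimal sketch illustrates, under assumed names and types, how down, move, and up events might be stored in an events buffer and handed to other modules; it is illustrative only and requires Java 16 or later because of the record type.

    import java.util.ArrayDeque;
    import java.util.Queue;

    // Illustrative events buffer for the input management behaviour described
    // above: events are queued and later polled by other modules.
    public final class EventBuffer {

        public enum Type { DOWN, MOVE, UP }

        // Assumed event shape: which screen was touched, where, and when.
        public record InputEvent(Type type, int screenId, float x, float y, long timeMs) {}

        private final Queue<InputEvent> events = new ArrayDeque<>();

        public void push(InputEvent event) { events.add(event); }

        public InputEvent poll() { return events.poll(); }
    }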
A task can be an application component that provides a screen with which the user can interact in order to do something, such as placing a phone call, taking a photo, sending an email, or viewing a map. Each task may be given a window in which to draw a user interface. The window typically fills the display 110, 114 but may be smaller than the display 110, 114 and float on top of other windows. An application usually consists of multiple activities that are loosely bound to each other. Typically, one task in an application is specified as the "main" task, which is presented to the user when the application is launched for the first time. Each task can then start another task in order to perform different actions.
The task management module 540 is operable to manage the operation of the one or more applications 564 that may be executed by the device. Thus, the task management module 540 can receive signals to execute an application stored in the application store 560. The task management module 540 may then instantiate one or more tasks or components of the application 564 to begin operation of the application 564. Further, the task management module 540 may suspend the application 564 based on user interface changes. Suspending the application 564 may keep the application data in memory but may limit or stop the application 564 from accessing processor cycles. Once the application becomes active again, the task management module 540 can again provide access to the processor.
The display controller 544 is operable to render and output the display(s) for the multi-screen device. In embodiments, the display controller 544 creates and/or manages one or more frame buffers 548. A frame buffer 548 can be a display output that drives a display from a portion of memory containing a complete frame of display data. In embodiments, the display controller 544 manages one or more frame buffers. One frame buffer may be a composite frame buffer that can represent the entire display space of the two screens. This composite frame buffer can appear as a single frame to the OS 516. The display controller 544 can sub-divide this composite frame buffer as required for use by each of the displays 110, 114. Thus, by using the display controller 544, the device 100 can have multiple screen displays without changing the underlying software of the OS 516.
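As a non-limiting illustration of a composite frame buffer that is sub-divided per display, consider the sketch below. The pixel format (4 bytes per pixel), the equal left/right split, and all names are assumptions of this sketch rather than details of the disclosure.

    // Illustrative composite frame buffer spanning the combined display area
    // of both screens, sliced column-wise for each touch-sensitive display.
    public final class CompositeFrameBuffer {

        private final int width;      // combined width of both displays
        private final int height;
        private final byte[] pixels;  // assumed 4 bytes per pixel (RGBA)

        public CompositeFrameBuffer(int width, int height) {
            this.width = width;
            this.height = height;
            this.pixels = new byte[width * height * 4];
        }

        // Returns the portion of the composite buffer used by one display,
        // e.g. index 0 for display 110 and index 1 for display 114 when the
        // composite area is split into two equal columns.
        public byte[] sliceForDisplay(int displayIndex, int displayCount) {
            int rowBytes = width * 4;
            int sliceWidthBytes = rowBytes / displayCount;
            byte[] slice = new byte[sliceWidthBytes * height];
            for (int row = 0; row < height; row++) {
                System.arraycopy(pixels, row * rowBytes + displayIndex * sliceWidthBytes,
                                 slice, row * sliceWidthBytes, sliceWidthBytes);
            }
            return slice;
        }
    }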
The application manager 562 can be a service that provides the presentation layer for the window environment. Thus, the application manager 562 provides the graphical model for rendering by the window management module 556. Likewise, the desktop 566 provides the presentation layer for the application store 560. Thus, the desktop provides a graphical model of a surface having selectable application icons for the applications 564 in the application store 560 that can be provided to the window management module 556 for rendering.
Fig. 5B shows an embodiment of the MDM module 524. The MDM module 524 is operable to determine the state of the environment for the device, including, but not limited to, the orientation of the device, which applications 564 are executing, how the applications 564 are to be displayed, what actions the user is conducting, the tasks being displayed, etc. To configure the display, the MDM module 524 interprets these environmental factors and determines a display configuration, as described in conjunction with Figs. 6A-6J. Then, the MDM module 524 can bind the applications 564 or other device components to the displays. The configuration may then be sent to the display controller 544 and/or the OS 516 to generate the display. The MDM module 524 can include one or more of, but is not limited to, a display configuration module 568, a preferences module 572, a device state module 574, a gesture module 576, a requirements module 580, an event module 584, and/or a binding module 588.
The display configuration module 568 determines the layout for the display. In embodiments, the display configuration module 568 can determine the environmental factors. The environmental factors may be received from one or more other MDM module 524 modules or from other sources. The display configuration module 568 can then determine, from the list of factors, the best configuration for the display. Some embodiments of the possible configurations and the factors associated therewith are described in conjunction with Figs. 6A-6F.
The preferences module 572 is operable to determine display preferences for an application 564 or other component. For example, an application can have a preference for single display or dual display. The preferences module 572 can determine or receive the application preferences and store the preferences. As the configuration of the device changes, the preferences may be reviewed again to determine whether a better display configuration can be achieved for the application 564.
The device state module 574 is operable to determine or receive the state of the device. The state of the device can be as described in conjunction with Figs. 3A and 3B. The state of the device can be used by the display configuration module 568 to determine the configuration for the display. As such, the device state module 574 can receive inputs and interpret the state of the device. The state information is then provided to the display configuration module 568.
The gesture module 576 is operable to determine whether the user is conducting any actions on the user interface. Thus, the gesture module 576 can receive task information either from the task stack 552 or from the input management module 536. The gestures may be as defined in conjunction with Figs. 4A through 4H. For example, moving a window causes the display to render a series of display frames that illustrate the window moving. The gesture module 576 can receive and interpret the gesture associated with such a user interface interaction. The information about the user gesture is then sent to the task management module 540 to modify the display binding of the task.
Similar to the preferences module 572, the requirements module 580 is operable to determine display requirements for an application 564 or other component. An application can have a set display requirement that must be observed. Some applications require a particular display orientation; for example, the application "Angry Birds" can only be displayed in landscape orientation. This type of display requirement can be determined or received by the requirements module 580. As the orientation of the device changes, the requirements module 580 can reassert the display requirements for the application 564. The display configuration module 568 can generate a display configuration that is in accordance with the application display requirements, as provided by the requirements module 580.
Similar to the gesture module 576, the event module 584 is operable to determine one or more events occurring with an application or other component that can affect the user interface. Thus, the event module 584 can receive event information either from the events buffer 556 or from the task management module 540. These events can change how the tasks are bound to the displays. For example, an email application receiving an email can cause the display to render the new message on a secondary screen. The event module 584 can receive and interpret the events associated with such application execution. The information about the events can then be sent to the display configuration module 568 to modify the configuration of the display.
The binding module 588 is operable to bind the applications 564 or other components to the configuration determined by the display configuration module 568. A binding associates, in memory, the display configuration for each application with the display and mode of the application. Thus, the binding module 588 can associate an application with a display configuration for the application (e.g., landscape, portrait, multi-screen, etc.). The binding module 588 may then assign a display identifier to the display. The display identifier associates the application with a particular screen of the device. This binding is then stored and provided to the display controller 544, the OS 516, or other components to properly render the display. The binding is dynamic and can change or be updated based on configuration changes associated with events, gestures, state changes, application preferences or requirements, etc.
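For illustration only, the sketch below records the kind of binding just described: an application associated with a display configuration and a display identifier, stored so it can later be provided to the display controller or OS and updated when the configuration changes. The enum values, constants, and method names are assumptions of this sketch.

    import java.util.HashMap;
    import java.util.Map;

    // Illustrative binding of an application to a display configuration and a
    // display identifier, mirroring the binding behaviour described above.
    public final class DisplayBinding {

        public enum Configuration { PORTRAIT, LANDSCAPE, PORTRAIT_MAX, LANDSCAPE_MAX, DUAL }

        public static final int DISPLAY_110 = 0;
        public static final int DISPLAY_114 = 1;
        public static final int COMBINED = 2;

        private final Map<String, Configuration> configByApp = new HashMap<>();
        private final Map<String, Integer> displayByApp = new HashMap<>();

        // Store (or update, when events or gestures change the configuration)
        // the binding for an application.
        public void bind(String appId, Configuration config, int displayId) {
            configByApp.put(appId, config);
            displayByApp.put(appId, displayId);
        }

        public Configuration configurationOf(String appId) { return configByApp.get(appId); }

        public Integer displayOf(String appId) { return displayByApp.get(appId); }
    }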
User interface configurations:
With reference now to Figs. 6A-J, various types of output configurations made possible by the device 100 will be described hereinafter.
Figs. 6A and 6B depict two different output configurations of the device 100 in a first state. Specifically, Fig. 6A depicts the device 100 in a closed portrait state 304 in which data is displayed on the primary screen 104. In this example, the device 100 displays data via the touch-sensitive display 110 in a first portrait configuration 604. As can be appreciated, the first portrait configuration 604 may display only a desktop or operating system home screen. Alternatively, one or more windows may be presented in a portrait orientation while the device 100 is displaying data in the first portrait configuration 604.
Fig. 6B depicts the device 100 still in the closed portrait state 304, but instead, data is displayed on the secondary screen 108. In this example, the device 100 displays data via the touch-sensitive display 114 in a second portrait configuration 608.
It may be possible to display similar or different data in either the first or second portrait configuration 604, 608. It may also be possible to transition between the first portrait configuration 604 and the second portrait configuration 608 by providing the device 100 with a user gesture (e.g., a double tap gesture), a menu selection, or other means. Other suitable gestures may also be employed to transition between configurations. Furthermore, it may also be possible to transition the device 100 from the first or second portrait configuration 604, 608 to any other configuration described herein, depending on the state into which the device 100 is moved.
An alternative output configuration may be accommodated by the device 100 being in a second state. Specifically, Fig. 6C depicts a third portrait configuration in which data is displayed simultaneously on both the primary screen 104 and the secondary screen 108. The third portrait configuration may be referred to as a dual-portrait (PD) output configuration. In the PD output configuration, the touch-sensitive display 110 of the primary screen 104 depicts data in the first portrait configuration 604 while the touch-sensitive display 114 of the secondary screen 108 depicts data in the second portrait configuration 608. The simultaneous presentation of the first portrait configuration 604 and the second portrait configuration 608 may occur when the device 100 is in an open portrait state 320. In this configuration, the device 100 may display one application window on one display 110 or 114, two application windows (one on each display 110 and 114), one application window and one desktop, or one desktop. Other configurations may be possible. It should be appreciated that it may also be possible to transition the device 100 from the simultaneous display of configurations 604, 608 to any other configuration described herein, depending on the state into which the device 100 is moved. Furthermore, while in this state, an application's display preference may place the device into a bilateral mode, in which both displays are active to display different windows of the same application. For example, a camera application may display a viewfinder and controls on one side, while the other side displays a mirrored preview that can be seen by the photo subjects. Games involving simultaneous play by two players may also take advantage of the bilateral mode.
Figs. 6D and 6E depict two further output configurations of the device 100 in a third state. Specifically, Fig. 6D depicts the device 100 in a closed landscape state 340 in which data is displayed on the primary screen 104. In this example, the device 100 displays data via the touch-sensitive display 110 in a first landscape configuration 612. Much like the other configurations described herein, the first landscape configuration 612 may display a desktop, a home screen, one or more windows displaying application data, and the like.
Fig. 6E depicts the device 100 still in the closed landscape state 340, but instead, data is displayed on the secondary screen 108. In this example, the device 100 displays data via the touch-sensitive display 114 in a second landscape configuration 616. It may be possible to display similar or different data in either the first or second landscape configuration 612, 616. It may also be possible to transition between the first landscape configuration 612 and the second landscape configuration 616 by providing the device 100 with one or both of a twist-and-tap gesture or a flip-and-slide gesture. Other suitable gestures may also be employed to transition between configurations. Furthermore, it may also be possible to transition the device 100 from the first or second landscape configuration 612, 616 to any other configuration described herein, depending on the state into which the device 100 is moved.
Fig. 6F depicts a third landscape configuration in which data is displayed simultaneously on both the primary screen 104 and the secondary screen 108. The third landscape configuration may be referred to as a dual-landscape (LD) output configuration. In the LD output configuration, the touch-sensitive display 110 of the primary screen 104 depicts data in the first landscape configuration 612 while the touch-sensitive display 114 of the secondary screen 108 depicts data in the second landscape configuration 616. The simultaneous presentation of the first landscape configuration 612 and the second landscape configuration 616 may occur when the device 100 is in an open landscape state 340. It should be appreciated that it may also be possible to transition the device 100 from the simultaneous display of configurations 612, 616 to any other configuration described herein, depending on the state into which the device 100 is moved.
Figs. 6G and 6H depict two views of the device 100 in yet another state. Specifically, the device 100 is depicted as being in an easel state 312. Fig. 6G shows that a first easel output configuration 618 may be displayed on the touch-sensitive display 110. Fig. 6H shows that a second easel output configuration 620 may be displayed on the touch-sensitive display 114. The device 100 may be configured to depict either the first easel output configuration 618 or the second easel output configuration 620 individually. Alternatively, both easel output configurations 618, 620 may be presented simultaneously. In some embodiments, the easel output configurations 618, 620 may be similar or identical to the landscape output configurations 612, 616. The device 100 may also be configured to display one or both of the easel output configurations 618, 620 while in a modified easel state 316. It should be appreciated that simultaneous utilization of the easel output configurations 618, 620 may facilitate two-person games (e.g., Battleship, chess, checkers, etc.), multi-user conferences in which two or more users share the same device 100, and other applications. As can be appreciated, it may also be possible to transition the device 100 from the display of one or both configurations 618, 620 to any other configuration described herein, depending on the state into which the device 100 is moved.
Fig. 6I depicts yet another output configuration that may be accommodated while the device 100 is in an open portrait state 320. Specifically, the device 100 may be configured to present a single continuous image across both touch-sensitive displays 110, 114 in a portrait configuration referred to herein as a Portrait-Max (PMax) configuration 624. In this configuration, data (e.g., a single image, application, window, icon, video, etc.) may be split and displayed partially on one of the touch-sensitive displays while the other portion of the data is displayed on the other touch-sensitive display. The PMax configuration 624 may facilitate a larger display and/or better resolution for displaying a particular image on the device 100. Similar to the other output configurations, it may be possible to transition the device 100 from the PMax configuration 624 to any other output configuration described herein, depending on the state into which the device 100 is moved.
Fig. 6J depicts yet another output configuration that may be accommodated while the device 100 is in an open landscape state 348. Specifically, the device 100 may be configured to present a single continuous image across both touch-sensitive displays 110, 114 in a landscape configuration referred to herein as a Landscape-Max (LMax) configuration 628. In this configuration, data (e.g., a single image, application, window, icon, video, etc.) may be split and displayed partially on one of the touch-sensitive displays while the other portion of the data is displayed on the other touch-sensitive display. The LMax configuration 628 may facilitate a larger display and/or better resolution for displaying a particular image on the device 100. Similar to the other output configurations, it may be possible to transition the device 100 from the LMax configuration 628 to any other output configuration described herein, depending on the state into which the device 100 is moved.
The device 100 manages desktops and/or windows with at least one window stack 1700, 1728, as shown in Figs. 10A and 10B. A window stack 1700, 1728 is a logical arrangement of active and/or inactive windows for a multi-screen device. For example, the window stack 1700, 1728 may be logically similar to a deck of cards, in which one or more windows or desktops are arranged in order, as shown in Figs. 10A and 10B. An active window is a window being displayed on at least one of the touch-sensitive displays 110, 114. For example, windows 1704 and 1708 are active windows and are displayed on touch-sensitive displays 110 and 114. An inactive window is a window that was opened and displayed but is now located "behind" an active window and is not being displayed. In embodiments, an inactive window may be for an application that is suspended, and thus the window no longer displays active content. For example, windows 1712, 1716, 1720, and 1724 are inactive windows.
A window stack 1700, 1728 may have various arrangements or organizational structures. In the embodiment shown in Fig. 10A, the device 100 includes a first stack 1760 associated with the first touch-sensitive display 110 and a second stack associated with the second touch-sensitive display 114. Thus, each touch-sensitive display 110, 114 can have an associated window stack 1760, 1764. The two window stacks 1760, 1764 may have different numbers of windows arranged in the respective stacks 1760, 1764. Further, the two window stacks 1760, 1764 can also be identified and managed separately. Thus, the first window stack 1760 can be arranged in order from a first window 1704 to a next window 1720, to a last window 1724, and finally to a desktop 1722, which, in embodiments, is at the "bottom" of the window stack 1760. In embodiments, the desktop 1722 is not always at the "bottom," as application windows can be arranged in the window stack below the desktop 1722, and the desktop 1722 can be brought to the "top" of the stack over the other windows when the desktop is displayed. Likewise, the second stack 1764 can be arranged from a first window 1708, to a next window 1712, to a last window 1716, and finally to a desktop 1718, which, in embodiments, is a single desktop area that, together with desktop 1722, lies under all the windows in both window stack 1760 and window stack 1764. A logical data structure for managing the two window stacks 1760, 1764 can be as described in conjunction with Fig. 11.
Another arrangement for a window stack 1728 is shown in Fig. 10B. In this embodiment, there is a single window stack 1728 for both touch-sensitive displays 110, 114. Thus, the window stack 1728 is arranged from a desktop 1758, to a first window 1744, to a last window 1756. A window can be arranged in a position among all the windows without being associated with a specific touch-sensitive display 110, 114. In this embodiment, every window occupies a position in the order of windows, and at least one window is identified as active. For example, a single window may be rendered in two portions 1732 and 1736 that are displayed on the first touch-sensitive screen 110 and the second touch-sensitive screen 114, respectively. This single window occupies only a single position in the window stack 1728 although it is displayed on both displays 110, 114.
Figs. 10C through 10E show another arrangement of a window stack 1760. The window stack 1760 is shown in three "elevation" views. In Fig. 10C, the top of the window stack 1760 is shown. Two sides of the window stack 1760 are shown in Figs. 10D and 10E. In this embodiment, the window stack 1760 resembles a stack of bricks, with the windows layered on top of one another. Looking down on the window stack 1760 from the top in Fig. 10C, only the top-most windows in the window stack 1760 are seen in different portions of the composite display 1764. The composite display 1764 represents a logical model for the entire display area of the device 100, which can include touch-sensitive display 110 and touch-sensitive display 114. A desktop 1786 or a window can occupy part or all of the composite display 1764.
In the illustrated embodiment, the desktop 1786 is the lowest display or "brick" in the window stack 1760. Above it, window 1 1782, window 2 1782, window 3 1768, and window 4 1770 are layered. Window 1 1782, window 3 1768, window 2 1782, and window 4 1770 occupy only portions of the composite display 1764. Thus, another part of the stack 1760 includes window 8 1774 and windows 5 through 7 shown in section 1790. Only the top-most window in any portion of the composite display 1764 is actually rendered and displayed. Thus, as shown in the top view of Fig. 10C, window 4 1770, window 8 1774, and window 3 1768 are displayed as being at the top of the display in different portions of the window stack 1760. A window can be dimensioned to occupy only a portion of the composite display 1760 in order to "reveal" windows lower in the window stack 1760. For example, window 3 1768 is lower in the stack than both window 4 1770 and window 8 1774 but is still displayed. A logical data structure for managing the window stack can be as described in conjunction with Fig. 11.
When a new window is opened, the newly activated window is generally positioned at the top of the stack. However, where and how the window is positioned within the stack can depend on the orientation of the device 100, the context of which programs, functions, software, etc. are being executed on the device 100, how the stack is positioned when the new window is opened, etc. To insert the window in the stack, the position of the window within the stack is determined, and the touch-sensitive display 110, 114 with which the window is associated may also be determined. With this information, a logical data structure for the window can be created and stored. When the user interface or other events or tasks change the arrangement of windows, the window stack(s) can be changed to reflect the change in arrangement. It should be noted that these same concepts described above can be used to manage the one or more desktops for the device 100.
Fig. 11 shows a logical data structure 1800 for managing the arrangement of windows or desktops. The logical data structure 1800 can be any data structure used to store data, whether an object, record, file, etc. The logical data structure 1800 can be stored in any type of database or data storage system, regardless of protocol or standard. In embodiments, the logical data structure 1800 includes one or more portions, fields, attributes, etc., that store data in a logical arrangement that allows easy storage and retrieval of the information. Hereinafter, these one or more portions, fields, attributes, etc. shall simply be described as fields. The fields can store data for a window identifier 1804, dimensions 1808, a stack position identifier 1812, a display identifier 1816, and/or an active indicator 1820. Each window in a window stack can have an associated logical data structure 1800. While only a single logical data structure 1800 is shown in Fig. 11, there may be more or fewer logical data structures 1800 used with a window stack (based on the number of windows or desktops in the stack), as represented by ellipses 1824. Further, there may be more or fewer fields than those shown in Fig. 11, as represented by ellipses 1828.
The window identifier 1804 can include any identifier (ID) that uniquely identifies the associated window in relation to the other windows in the window stack. The window identifier 1804 can be a globally unique identifier (GUID), a numeric ID, an alphanumeric ID, or another type of identifier. In embodiments, the window identifier 1804 can be one, two, or any number of digits, based on the number of windows that can be opened. In alternative embodiments, the size of the window identifier 1804 may change based on the number of windows opened. While the window is open, the window identifier 1804 may be static and remain unchanged.
Dimensions 1808 can include dimensions for a window in the composite display 1760. For example, the dimensions 1808 can include coordinates for two or more corners of the window, or may include one coordinate together with dimensions for the width and height of the window. These dimensions 1808 can delineate what portion of the composite display 1760 the window may occupy, which may be the entire composite display 1760 or only a portion of the composite display 1760. For example, window 4 1770 may have dimensions 1808 indicating that the window 1770 will occupy only a portion of the display area of the composite display 1760, as shown in Figs. 10C through 10E. As windows are moved or inserted in the window stack, the dimensions 1808 may change.
A stack position identifier 1812 can be any identifier that can identify the position of the window in the stack, or it may be inferred from the window's control record within a data structure, such as a list or a stack. The stack position identifier 1812 can be a GUID, a numeric ID, an alphanumeric ID, or another type of identifier. Each window or desktop can include a stack position identifier 1812. For example, as shown in Fig. 10A, window 1 1704 in stack 1 1760 can have a stack position identifier 1812 of 1, identifying window 1704 as the first window in the stack 1760 and as the active window. Similarly, window 6 1724 can have a stack position identifier 1812 of 3, representing that window 1724 is the third window in the stack 1760. Window 2 1708 can also have a stack position identifier 1812 of 1, representing that window 1708 is the first window in the second stack 1764. As shown in Fig. 10B, window 1 1744 can have a stack position identifier 1812 of 1, window 3, rendered in portions 1732 and 1736, can have a stack position identifier 1812 of 3, and window 6 1756 can have a stack position identifier 1812 of 6. Thus, depending on the type of stack, the stack position identifier 1812 can represent the window's location in the stack.
A display identifier 1816 can identify that the window or desktop is associated with a particular display, such as the first display 110, the second display 114, or the composite display 1760 composed of the two displays. While this display identifier 1816 may not be needed for a multi-stack system, as shown in Fig. 10A, the display identifier 1816 can indicate whether a window in the serial stack of Fig. 10B is displayed on a particular display. Thus, window 3 in Fig. 10B may have two portions 1732 and 1736. The first portion 1732 may have a display identifier 1816 for the first display, while the second portion 1736 may have a display identifier 1816 for the second display 114. However, in alternative embodiments, the window may have two display identifiers 1816 that represent that the window is displayed on both of the displays 110, 114, or a display identifier 1816 identifying the composite display. In another alternative embodiment, the window may have a single display identifier 1816 to represent that the window is displayed on both of the displays 110, 114.
Similar to the display identifier 1816, an active indicator 1820 may not be needed with the dual-stack system of Fig. 10A, as the window in stack position 1 is active and displayed. In the system of Fig. 10B, the active indicator 1820 can indicate which window(s) in the stack is (are) being displayed. Thus, window 3 in Fig. 10B may have two portions 1732 and 1736. The first portion 1732 may have an active indicator 1820, while the second portion 1736 may also have an active indicator 1820. However, in alternative embodiments, window 3 may have a single active indicator 1820. The active indicator 1820 can be a simple flag or bit that represents that the window is active or displayed.
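As a non-limiting illustration, the fields of the logical data structure 1800 described above might be rendered in Java as the sketch below. The field names follow the description (window identifier 1804, dimensions 1808, stack position identifier 1812, display identifier 1816, active indicator 1820), while the types, class name, and methods are assumptions of this sketch.

    // Illustrative per-window record mirroring logical data structure 1800.
    public final class WindowRecord {

        private final String windowId;    // unique within the window stack (1804)
        private int x, y, width, height;  // dimensions within the composite display (1808)
        private int stackPosition;        // position in the stack, 1 = top/active (1812)
        private int displayId;            // display 110, display 114, or composite (1816)
        private boolean active;           // whether the window is being displayed (1820)

        public WindowRecord(String windowId) { this.windowId = windowId; }

        public void place(int x, int y, int width, int height) {
            this.x = x; this.y = y; this.width = width; this.height = height;
        }

        public void setStackPosition(int position) { this.stackPosition = position; }
        public void setDisplayId(int displayId)    { this.displayId = displayId; }
        public void setActive(boolean active)      { this.active = active; }

        public String windowId()   { return windowId; }
        public int stackPosition() { return stackPosition; }
        public boolean isActive()  { return active; }
    }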
Fig. 12 shows an embodiment of a method 1900 for creating a window stack, together with a general order for the steps of the method 1900. Generally, the method 1900 starts with a start operation 1904 and ends with an end operation 1928. The method 1900 can include more or fewer steps, or the order of the steps can be arranged differently than the order shown in Fig. 12. The method 1900 can be executed as a set of computer-executable instructions executed by a computer system and encoded or stored on a computer-readable medium. Hereinafter, the method 1900 shall be explained with reference to the systems, components, modules, software, data structures, user interfaces, etc. described in conjunction with Figs. 1-11.
The multi-screen device 100 can receive an activation of a window, in step 1908. In embodiments, the multi-screen device 100 can receive the activation of a window by receiving an input from the touch-sensitive display 110 or 114, the configurable area 112 or 116, a gesture capture region 120 or 124, or some other hardware sensor operable to receive user interface inputs. The processor may execute the task management module 540, which may receive the input. The task management module 540 can interpret the input as requesting an application task to be executed that will open a window in the window stack.
In embodiments, the task management module 540 places the user interface interaction in the task stack 552 to be acted upon by the display configuration module 568 of the multi-display management module 524. Further, the task management module 540 waits for information from the multi-display management module 524 before sending instructions to the window management module 532 to create the window in the window stack.
Upon receiving the instruction from the task management module 540, the multi-display management module 524 determines to which touch portion of the composite display 1760 the newly activated window should be associated, in step 1912. For example, window 4 1770 is associated with a portion of the composite display 1764. In embodiments, the device state module 574 of the multi-display management module 524 may determine how the device is oriented or in what state the device is, e.g., open, closed, portrait, etc. Further, the preferences module 572 and/or the requirements module 580 may determine how the window is to be displayed. The gesture module 576 may determine the user's intentions about how the window is to be opened, based on the type of gesture and the location where the gesture is made.
The display configuration module 568 may use the input from these modules and evaluate the current window stack 1760 to determine the best place and the best dimensions, based on a visibility algorithm, to open the window. Thus, the display configuration module 568 determines the best place to put the window at the top of the window stack 1760, in step 1916. In embodiments, the visibility algorithm determines, for all portions of the composite display, which windows are at the top of the stack. For example, the visibility algorithm determines that window 3 1768, window 4 1770, and window 8 1774 are at the top of the stack 1760, as viewed in Figs. 10C through 10E. Upon determining where to open the window, the display configuration module 568 can assign a display identifier 816 and possibly dimensions 808 to the window. The display identifier 816 and dimensions 808 can then be sent back to the task management module 540. The task management module 540 may then assign a stack position identifier 812 to the window, indicating the window's position at the top of the window stack.
In embodiments, the task management module 540 sends the window stack information and instructions to render the window to the window management module 532. The window management module 532 and the task management module 540 can create the logical data structure 800, in step 1924. Both the task management module 540 and the window management module 532 may create and manage copies of the window stack. These copies of the window stack can be synchronized or kept similar through communications between the window management module 532 and the task management module 540. Thus, based on the information determined by the multi-display management module 524, the window management module 532 and the task management module 540 can assign dimensions 808, a stack position identifier 812 (e.g., window 1 1782, window 4 1770, etc.), a display identifier 816 (e.g., touch-sensitive display 1 110, touch-sensitive display 2 114, a composite display identifier, etc.), and an active indicator 820, which is generally always set when the window is at the "top" of the stack. Both the window management module 532 and the task management module 540 can then store the logical data structure 800. Further, the window management module 532 and the task management module 540 can thereafter manage the window stack and the logical data structure(s) 800.
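For illustration only, the sketch below walks through the flow of method 1900 described above: a window activation is received, the window is placed at the top of the stack, and its record is stored. It reuses the WindowRecord sketch shown earlier; the class and method names, and the use of a Deque for the stack, are assumptions of this sketch.

    // Illustrative window-open flow corresponding to steps 1908-1924.
    public final class WindowOpenFlow {

        public WindowRecord openWindow(String windowId,
                                       int targetDisplayId,
                                       java.util.Deque<WindowRecord> windowStack) {
            // Step 1916: the newly activated window goes to the top of the stack.
            WindowRecord record = new WindowRecord(windowId);
            record.setDisplayId(targetDisplayId);
            record.setActive(true);

            // Existing windows each move one position down in the stack.
            int position = 2;
            for (WindowRecord existing : windowStack) {
                existing.setStackPosition(position++);
            }
            record.setStackPosition(1);

            // Step 1924: store the logical data structure for the new window.
            windowStack.addFirst(record);
            return record;
        }
    }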
Fig. 13 depicts a further window stack configuration. A number of windows 1, 2, 3, 4, 5, 6, 7, and 8 are depicted, whether from the same or different multi-screen or single-screen applications. The touch-sensitive display 110 currently has window 4 in the active display position, while the touch-sensitive display 114 currently has window 5 in the active display position. The stack for the touch-sensitive display 110 has, from top to bottom, window 4 in the active display position, with windows 3, 2, and 1 positioned behind it. The stack for the touch-sensitive display 114 has, from top to bottom, window 5 in the active display position, with windows 6, 7, and 8 positioned behind it.
Desktops D1, D2, D3, D4, D5, and D6 are positioned behind the window stacks. The desktops can be viewed as a desktop stack that is distinct from the window stacks. Viewed in this manner, the touch-sensitive display 110 has a corresponding desktop stack containing desktops D3, D2, and D1, with desktop D1 in the bottom stack position 2300 and desktop D3 in a top stack position whose visibility depends on the position and size of window 4 (whether maximized or minimized), and the touch-sensitive display 114 has a corresponding desktop stack containing desktops D4, D5, and D6, with desktop D6 in the bottom stack position 2304 and desktop D4 in a top stack position whose visibility depends on the position and size of window 5 (whether maximized or minimized). Conceptually, the desktops in this example can be viewed as a canvas divided into six sections, two of which can be displayed at any one time on the touch-sensitive displays 110, 114. When the device 100 is closed, this conceptual model persists in one configuration. In that configuration, only one window and desktop stack (corresponding to the primary screen) can be seen, while the other window and desktop stack is virtual; that is, it is maintained in memory but cannot be seen because the secondary screen is not enabled.
A displayed image transition indicator, which may also be thought of as a well, is displayed before a displayed image (such as a window or desktop) is displayed, because the displayed image is to be moved from a source to a target touch-sensitive display 110, 114. In response to a user gesture, the displayed image transition indicator previews the user's movement of the displayed image. For example, when a window-movement gesture is received from the user, the transition indicator unfolds or slides out from beneath the window (which is to be moved) and moves along the projected travel path of the window, or towards the target touch-sensitive display, to the final window destination. The portion of the target touch-sensitive display to be occupied by the window after the move is first occupied by the transition indicator. After the transition indicator completes its movement, or at some other point in the transition indicator's movement, the window is moved to occupy the portion of the target touch-sensitive display that was occupied by the transition indicator. In one configuration, the displayed image transition indicator moves at substantially the same speed as the tracked user gesture that caused the displayed object to move (or at a speed that is a linear or other function thereof). In one configuration, the displayed image transition indicator is used for multi-screen applications that are to be expanded to both screens or touch-sensitive displays. In that configuration, the displayed image transition indicator is not used for moving a displayed image associated with a single-screen application; in that case, the actual application displayed image and output are moved without a transition indicator.
Typically, the transition indicator is approximately the same size and shape as the displayed image to be moved, and it cannot itself receive or provide dynamic user input or output. Typically, though not necessarily, it is a substantially monochromatic displayed image having an appearance different from that of the displayed image to be moved. The transition indicator can display a logo or other trademarked image associated with the manufacturer, wholesaler, or retailer of the device 100. In other configurations, the transition indicator is user configurable. The user can select a displayed color or set of colors, a pattern, a design, an icon, a photograph, or another image as the transition indicator. Thus, users can personalize the transition indicator to suit their preferences, whereby different users have different transition indicators on their respective devices 100. In addition, the user can select one or more audible sounds to be played at one or more selected points along the transition indicator's travel path to the target touch-sensitive display. For example, a user-selected sound can be played to announce the start of the transition indicator's movement, an intermediate point along the transition indicator's travel path, and/or the arrival of the transition indicator at its destination in the target touch-sensitive display. The user can further choose to resize the transition indicator so that it is smaller or larger than the displayed image being repositioned, and/or to retime the movement of the transition indicator so that it moves faster or slower than the default setting.
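As a non-limiting illustration of the user-configurable aspects just described (appearance, sounds at selected points along the travel path, size, and speed), a sketch of a simple configuration holder follows. All field and method names, and the default placeholder values, are assumptions of this sketch.

    // Illustrative container for user-configurable transition indicator options.
    public final class TransitionIndicatorConfig {

        private String imageUri = "default_brand_logo";  // color, pattern, icon, or photo
        private String startSoundUri;                    // played when movement begins
        private String midpointSoundUri;                 // played along the travel path
        private String arrivalSoundUri;                  // played at the target display
        private float sizeScale = 1.0f;                  // < 1 keeps it smaller than the window
        private float speedFactor = 1.0f;                // > 1 moves faster than the default

        public void setImageUri(String uri)      { this.imageUri = uri; }
        public void setStartSound(String uri)    { this.startSoundUri = uri; }
        public void setMidpointSound(String uri) { this.midpointSoundUri = uri; }
        public void setArrivalSound(String uri)  { this.arrivalSoundUri = uri; }
        public void setSizeScale(float scale)    { this.sizeScale = scale; }
        public void setSpeedFactor(float factor) { this.speedFactor = factor; }
    }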
In one configuration, the well is typically available for multi-screen applications rather than for single-screen applications. In one configuration, the transition indicator is invoked only when the displayed image is moved in response to a gesture received by a gesture capture region 120, 124. In one configuration, the transition indicator is invoked only when the displayed image is moved in response to a gesture received by a touch-sensitive display 110, 114. In other configurations, the transition indicator is associated only with certain displayed image movements or transitions.
Various examples will now be discussed with reference to Figs. 7-8.
In Fig. 7A, the touch-sensitive displays 110, 114 are in a portrait display orientation and display window 1 and the second desktop D2, respectively. A gesture 700 is received by a gesture capture region 120, 124. Alternatively, the gesture 700 is received by a touch-sensitive display 110, 114. The gesture can be any suitable gesture, including, without limitation, those discussed above. By the gesture 700, the user indicates his or her command to move window 1 from the (source) touch-sensitive display 110 to the (target) touch-sensitive display 114.
Referring to Fig. 7B, and in response to receipt of the gesture, the transition indicator 704 begins moving left-to-right or right-to-left (depending on the orientation of the device 100 and the touch-sensitive displays), typically from a point that appears to be behind window 1, and typically starting to move at substantially the same speed as the tracked user gesture type. Typically, the transition indicator 704 is not in the previously displayed image stack associated with the (source) touch-sensitive display 110. In other words, when the gesture is received, the transition indicator is not in an active or inactive display position of touch-sensitive display 110 or 114. As and because the transition indicator 704 unfolds or moves, the seam 708 between the first and second touch-sensitive displays 110, 114 and their respective displayed images goes dark to reveal the transition indicator background. In other words, the transition indicator 704 moves outwardly from the seam 708 in one direction, eventually covering, typically substantially completely covering, the touch-sensitive display 114 and the second desktop D2. As shown in Figs. 7B and 7C, the transition indicator begins to cover the second desktop D2.
In Fig. 7D, the transition indicator 704 has completely covered the second desktop D2 on the touch-sensitive display 114, while window 1 remains unchanged on the touch-sensitive display 110. In other words, window 1 is in the active display position of touch-sensitive display 110, while the transition indicator 704 is in the active display position of touch-sensitive display 114. The first and second desktops D1 and D2 are in inactive display positions of touch-sensitive displays 110 and 114, respectively. In other configurations, the transition indicator covers only a portion of the target touch-sensitive display 114 before the window movement begins or is initiated.
When the transition indicator 704 has moved to occupy some or all of the target touch-sensitive display (touch-sensitive display 114 in this example), so that the previously displayed image (the second desktop D2 in this example) is partly or completely obscured or covered by the transition indicator 704, the first window 1 follows to occupy the target touch-sensitive display 114, as shown progressively in Figs. 7E and 7F. Until a set positioning of the transition indicator triggers the movement, the first window 1 continues to be the displayed image of the source touch-sensitive display (touch-sensitive display 110 in this example). In other words, window 1 remains in the active display position of touch-sensitive display 110 until the transition indicator completes its movement to the target touch-sensitive display 114. At that point, window 1 slides gradually across to cover the transition indicator 704 and occupy the active display position of touch-sensitive display 114, revealing the first desktop D1. When window 1 completes its movement to the target touch-sensitive display 114, the first desktop D1 is in the active display position of touch-sensitive display 110, and the second desktop D2 is in an inactive display position of touch-sensitive display 114.
Figs. 8A-E depict the above steps for the device 100 in a landscape display orientation, in which the first window is being maximized to cover at least a portion of both the first and second touch-sensitive displays 110 and 114. In Fig. 8A, the touch-sensitive displays 110, 114 display window 1 and the second desktop D2, respectively. A gesture 700 is received by a gesture capture region 120, 124. Alternatively, the gesture 700 is received by a touch-sensitive display 110, 114. The gesture can be any suitable gesture, including, without limitation, those discussed above. By the gesture 700, the user indicates his or her command to move window 1 from the (source) touch-sensitive display 110 to the (target) touch-sensitive display 114.
Referring to FIG. 8B, in response to receipt of the gesture, the conversion indicator 704 begins to move either top-to-bottom or bottom-to-top (depending on the orientation of the device 100 and the touch-sensitive displays), from a point that appears to be behind window 1. As the conversion indicator 704 unfolds or moves, the seam 708 between the first and second touch-sensitive displays 110, 114 and their respective displayed images goes completely dark, revealing the conversion indicator background. As shown in FIG. 8B, the conversion indicator begins to cover the viewable region of the second desktop D2.
In FIG. 8C, the conversion indicator 704 has partly or completely covered the second desktop D2 on touch-sensitive display 114, while window 1 remains unchanged on touch-sensitive display 110. In other words, window 1 is in an active display position of touch-sensitive display 110, while the conversion indicator 704 is in an active display position of touch-sensitive display 114. The first and second desktops D1 and D2 are in inactive display positions of touch-sensitive displays 110 and 114, respectively.
When the conversion indicator 704 has moved to occupy some or all of the target touch-sensitive display (touch-sensitive display 114 in this example), so that the prior displayed image (here, the second desktop D2) is completely darkened or covered by the conversion indicator 704, the first window 1 moves to occupy both the target touch-sensitive display 114 and the originating touch-sensitive display 110, as shown progressively in FIGS. 8D and 8E. Until then, the first window 1 remains the displayed image of the source touch-sensitive display (touch-sensitive display 110 in this example). In other words, window 1 stays in the active display position of touch-sensitive display 110 until the conversion indicator completes its movement to the target touch-sensitive display 114. At that point, window 1 slides to cover the conversion indicator 704, thereby occupying the active display positions of both the originating and target touch-sensitive displays 110 and 114.
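Read together, FIGS. 7 and 8 differ only in the axis of travel: with the displays side by side the indicator sweeps horizontally, and with the displays stacked it sweeps vertically. A small hypothetical helper makes that dependency explicit; the enum and method names below are assumptions, not patent terminology.

    // Hypothetical mapping from display arrangement and drag target to the
    // indicator's travel direction (horizontal for side-by-side displays as in
    // FIGS. 7A-7F, vertical for stacked displays as in FIGS. 8A-8E).
    enum TravelDirection { LEFT_TO_RIGHT, RIGHT_TO_LEFT, TOP_TO_BOTTOM, BOTTOM_TO_TOP }

    final class IndicatorDirection {
        static TravelDirection of(boolean displaysStacked, boolean targetIsSecondDisplay) {
            if (displaysStacked) {
                // Landscape orientation: displays stacked vertically.
                return targetIsSecondDisplay ? TravelDirection.TOP_TO_BOTTOM
                                             : TravelDirection.BOTTOM_TO_TOP;
            }
            // Displays side by side.
            return targetIsSecondDisplay ? TravelDirection.LEFT_TO_RIGHT
                                         : TravelDirection.RIGHT_TO_LEFT;
        }
    }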
In various examples, the middleware 520, and in particular one or more of the multi-display management (MDM) class 524, the desktop cache class 528, the window management class 532, the activity management class 536, and the application management class 540, individually or collectively detects receipt of the user gesture (step 900 of FIG. 9) and determines that the received gesture commands a displayed image, such as a window or desktop, to be moved to the target touch-sensitive display. In response, the middleware 520 causes the conversion indicator 704 to move from the originating touch-sensitive display toward the target touch-sensitive display (step 1904). When the conversion indicator 704 covers a selected extent of the target touch-sensitive display, the middleware 520 moves the displayed image to cover the target touch-sensitive display (and to cover the conversion indicator). The logic ends at step 1912.
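As a rough illustration of that flow, the sketch below mirrors only the ordering of steps 900, 1904, and 1912; the names TransitionUi and DisplayImageMoveController are invented and do not correspond to the middleware classes 524-540.

    // Loose paraphrase of the middleware flow: detect the move gesture (step 900),
    // move the conversion indicator (step 1904), and move the displayed image only
    // once the indicator covers the selected extent of the target (ending at 1912).
    interface TransitionUi {
        void moveIndicatorToTarget(Runnable onCoverageReached);
        void moveImageToTarget(); // covers the target display and the indicator
    }

    final class DisplayImageMoveController {
        private final TransitionUi ui;

        DisplayImageMoveController(TransitionUi ui) {
            this.ui = ui;
        }

        // Called after a received gesture has been classified as a move command.
        void onMoveGesture() {
            ui.moveIndicatorToTarget(ui::moveImageToTarget);
        }
    }

The callback ordering enforces the constraint stated above: the displayed image cannot begin to move before the conversion indicator has covered the selected extent of the target touch-sensitive display.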
The exemplary systems and methods of this disclosure have been described in relation to communication devices. However, to avoid unnecessarily obscuring the present disclosure, the preceding description omits a number of known structures and devices. This omission is not to be construed as a limitation of the scope of the claimed subject matter. Specific details are set forth to provide an understanding of the present disclosure. It should be appreciated, however, that the present disclosure may be practiced in a variety of ways beyond the specific details set forth herein.
Furthermore, while the exemplary aspects, embodiments, and/or configurations illustrated herein show various components of the system collocated, certain components of the system can be located remotely, at distant portions of a distributed network, such as a LAN and/or the Internet, or within a dedicated system. Thus, it should be appreciated that the components of the system can be combined into one or more devices, such as a communication device, or collocated on a particular node of a distributed network, such as an analog and/or digital telecommunications network, a packet-switched network, or a circuit-switched network. It will be appreciated from the preceding description, and for reasons of computational efficiency, that the components of the system can be arranged at any location within a distributed network of components without affecting the operation of the system. For example, the various components can be located in a switch such as a PBX and media server, a gateway, in one or more communication devices, at one or more users' premises, or some combination thereof. Similarly, one or more functional portions of the system could be distributed between a telecommunications device and an associated computing device.
Furthermore, it should be appreciated that the various links connecting the elements can be wired or wireless links, or any combination thereof, or any other known or later-developed element(s) capable of supplying and/or communicating data to and from the connected elements. These wired or wireless links can also be secure links and may be capable of communicating encrypted information. Transmission media used as links, for example, can be any suitable carrier for electrical signals, including coaxial cables, copper wire and fiber optics, and may take the form of acoustic or light waves, such as those generated during radio-wave and infrared data communications.
Also, while the flowcharts have been discussed and illustrated in relation to a particular sequence of events, it should be appreciated that changes, additions, and omissions to this sequence can occur without materially affecting the operation of the disclosed embodiments, configurations, and aspects.
A number of variations and modifications of the disclosure can be used. It would be possible to provide for some features of the disclosure without providing others.
For example, in one alternative embodiment, the preceding movement of the conversion indicator is applied to the movement of displayed images other than windows and desktops.
In other embodiments, the conversion indicator previews the maximization of a window from occupying only one touch-sensitive display to covering both touch-sensitive displays.
In another alternative embodiment, during the transition the conversion indicator covers the entire touch-sensitive display when the device 100 is closed and only the primary screen is active, and/or covers both touch-sensitive displays when the device 100 is open. This may occur, for example, when a window is maximized or opened so as to cover both touch-sensitive displays, or when the transition affects both the primary and secondary screens. When a window is maximized on the single touch-sensitive display of the closed device or on both touch-sensitive displays of the open device, the conversion indicator can move onto the touch-sensitive display(s) from an edge.
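One way to picture this embodiment is that the indicator's coverage is simply a function of the device state. The sketch below is a hypothetical reading with invented names, not a disclosed implementation.

    // Hypothetical: the indicator spans only the active primary screen while the
    // device is closed, and both touch-sensitive displays while it is open.
    final class IndicatorCoverage {
        static int coverageWidthPx(boolean deviceOpen, int primaryWidthPx, int secondaryWidthPx) {
            return deviceOpen ? primaryWidthPx + secondaryWidthPx
                              : primaryWidthPx;
        }
    }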
In other embodiments, the disclosure applies to displayed image transitions other than window movement. In such transitions, a touch-sensitive display changes its displayed image at least in part. The change of displayed image is indicated by the conversion indicator occupying at least a portion of the touch-sensitive display after the prior displayed image is removed and before the new displayed image is rendered.
In yet another embodiment, the systems and methods of this disclosure can be implemented in conjunction with a special-purpose computer, a programmed microprocessor or microcontroller and peripheral integrated circuit element(s), an ASIC or other integrated circuit, a digital signal processor, hard-wired electronic or logic circuitry such as a discrete-element circuit, a programmable logic device or gate array such as a PLD, PLA, FPGA, or PAL, any comparable means, or the like. In general, any device or means capable of implementing the methodology illustrated herein can be used to implement the various aspects of this disclosure. Exemplary hardware that can be used for the disclosed embodiments, configurations, and aspects includes computers, handheld devices, telephones (e.g., cellular, Internet-enabled, digital, analog, hybrid, and others), and other hardware known in the art. Some of these devices include processors (e.g., a single or multiple microprocessors), memory, non-volatile storage, input devices, and output devices. Furthermore, alternative software implementations including, but not limited to, distributed processing or component/object distributed processing, parallel processing, or virtual machine processing can also be constructed to implement the methods described herein.
In yet another embodiment, the disclosed methods may be readily implemented in conjunction with software using object or object-oriented software development environments that provide portable source code usable on a variety of computer or workstation platforms. Alternatively, the disclosed system may be implemented partially or fully in hardware using standard logic circuits or VLSI design. Whether software or hardware is used to implement the systems in accordance with this disclosure depends on the speed and/or efficiency requirements of the system, the particular function, and the particular software or hardware systems or microprocessor or microcomputer systems being utilized.
In yet another embodiment, the disclosed methods may be partially implemented in software that is stored on a storage medium and executed on a programmed general-purpose computer with the cooperation of a controller and memory, a special-purpose computer, a microprocessor, or the like. In these instances, the systems and methods of this disclosure can be implemented as a program embedded on a personal computer, such as an applet, a JAVA or CGI script, as a resource residing on a server or computer workstation, as a routine embedded in a dedicated measurement system or system component, or the like. The system can also be implemented by physically incorporating the system and/or method into a software and/or hardware system.
Although the present disclosure describes components and functions implemented in the aspects, embodiments, and/or configurations with reference to particular standards and protocols, the aspects, embodiments, and/or configurations are not limited to such standards and protocols. Other similar standards and protocols not mentioned herein are in existence and are considered to be included in the present disclosure. Moreover, the standards and protocols mentioned herein, and other similar standards and protocols not mentioned herein, are periodically superseded by faster or more effective equivalents having essentially the same functions. Such replacement standards and protocols having the same functions are considered equivalents included in the present disclosure.
The present disclosure, in various aspects, embodiments, and/or configurations, includes components, methods, processes, systems, and/or apparatus substantially as depicted and described herein, including various aspects, embodiments, configurations, sub-combinations, and/or subsets thereof. Those of skill in the art will understand how to make and use the disclosed aspects, embodiments, and/or configurations after understanding the present disclosure. The present disclosure, in various aspects, embodiments, and/or configurations, also includes providing devices and processes in the absence of items not depicted and/or described herein, or in various aspects, embodiments, and/or configurations hereof, including in the absence of items that may have been used in previous devices or processes, for example, for improving performance, achieving ease of implementation, and/or reducing cost of implementation.
The foregoing discussion has been presented for purposes of illustration and description. It is not intended to limit the disclosure to the form or forms disclosed herein. In the foregoing detailed description, for example, various features of the disclosure are grouped together in one or more aspects, embodiments, and/or configurations for the purpose of streamlining the disclosure. Features of the aspects, embodiments, and/or configurations of the disclosure may be combined in alternate aspects, embodiments, and/or configurations other than those discussed above. This method of disclosure is not to be interpreted as reflecting an intention that the claims require more features than are expressly recited in each claim. Rather, as the following claims reflect, inventive aspects lie in less than all features of a single foregoing disclosed aspect, embodiment, and/or configuration. Thus, the following claims are hereby incorporated into this detailed description, with each claim standing on its own as a separate preferred embodiment of the disclosure.
Moreover, though this description has included one or more aspects, embodiments, and/or configurations and certain variations and modifications, other variations, combinations, and modifications are within the scope of the disclosure, for example, as may be within the skill and knowledge of those in the art after understanding the present disclosure. It is intended to obtain rights which include alternative aspects, embodiments, and/or configurations to the extent permitted, including alternate, interchangeable, and/or equivalent structures, functions, ranges, or steps to those claimed, whether or not such alternate, interchangeable, and/or equivalent structures, functions, ranges, or steps are disclosed herein, and without intending to publicly dedicate any patentable subject matter.

Claims (15)

1. A method for moving a window between screens of a multi-screen device, the method comprising:
providing a multi-screen device, the multi-screen device including:
a first screen, the first screen including:
a first touch-sensitive display; and
a first gesture capture region, wherein the first gesture capture region is physically separate from the first touch-sensitive display on the first screen, and wherein the first gesture capture region does not provide a display; and
a second screen hinged to the first screen, the second screen including:
a second touch-sensitive display; and
a second gesture capture region, wherein the second gesture capture region is physically separate from the second touch-sensitive display on the second screen, and wherein the second gesture capture region does not provide a display;
receiving, by the first gesture capture region, a drag gesture indicating that a window of an application displaying active content is to be moved from the first touch-sensitive display to the second touch-sensitive display; and
in response to receiving the drag gesture and prior to the movement of the window to the second touch-sensitive display:
displaying, by a microprocessor, movement of at least a portion of a conversion indicator from the first touch-sensitive display to the second touch-sensitive display, wherein the conversion indicator is displayed as moving at substantially the same speed as the movement of the drag gesture, wherein the conversion indicator is an image that cannot receive user input or provide dynamic output, and wherein the conversion indicator is the same size and the same shape as the window;
while displaying the movement of at least a portion of the conversion indicator from the first touch-sensitive display to the second touch-sensitive display, continuing to display the active content of the window on the first touch-sensitive display; and
when the conversion indicator is displayed up to a predetermined point on the second touch-sensitive display, displaying, by the microprocessor and in response to the movement of the drag gesture, movement of the window from the first touch-sensitive display to the second touch-sensitive display.
2. The method of claim 1, wherein the conversion indicator moves to a selected position along the travel path of the window, to preview the movement of the window.
3. The method of claim 1, wherein the conversion indicator can neither receive dynamic user input nor provide dynamic user output, wherein the conversion indicator has an appearance different from the window, and wherein, before the window is moved so as to completely cover the second touch-sensitive display, the conversion indicator covers only a portion of the second touch-sensitive display.
4. The method of claim 1, wherein, when the gesture is received, the conversion indicator is not in a display image stack associated with the first touch-sensitive display and the second touch-sensitive display, wherein, before the window movement begins, the window and the conversion indicator are simultaneously in active display positions of the first touch-sensitive display and the second touch-sensitive display, respectively, and wherein the conversion indicator includes a user-configured color, pattern, design, and/or photograph.
5. The method of claim 1, wherein the conversion indicator has a graphical appearance, wherein the window is controlled by a multi-screen application, and wherein, during the movement from the first touch-sensitive display to the second touch-sensitive display, the conversion indicator does not respond to user commands or requests.
6. A non-transitory computer-readable storage medium storing computer-readable program code that, when executed by a processor, is configured to perform at least the following:
receiving, by a first gesture capture region of a dual-screen communication device, a drag gesture indicating a window movement, wherein the drag gesture has a direction of movement and a speed of movement, wherein the dual-screen communication device includes at least a first screen and a second screen, wherein the first screen includes the first gesture capture region and a first touch-sensitive display, wherein the first gesture capture region is physically separate from the first touch-sensitive display on the first screen, wherein the second screen includes a second gesture capture region and a second touch-sensitive display, wherein the second gesture capture region is physically separate from the second touch-sensitive display on the second screen, wherein the drag gesture indicates that the window is to be moved from the first touch-sensitive display to the second touch-sensitive display, wherein the window is a window of an application displaying active content, and wherein neither the first gesture capture region nor the second gesture capture region provides a display;
in response to receiving the drag gesture and prior to the movement of the window to the second touch-sensitive display:
displaying movement of at least a portion of a conversion indicator from the first touch-sensitive display to the second touch-sensitive display, wherein the conversion indicator is displayed as moving at substantially the same speed as the movement of the drag gesture, wherein the conversion indicator is an image that cannot receive user input or provide dynamic output, and wherein the conversion indicator is the same size and the same shape as the window;
providing a sound associated with the display of the movement of the conversion indicator;
while displaying the movement of at least a portion of the conversion indicator from the first touch-sensitive display to the second touch-sensitive display, continuing to display the active content of the window on the first touch-sensitive display; and
when the conversion indicator is displayed up to a predetermined point on the second touch-sensitive display, displaying, in response to the movement of the drag gesture, movement of the window from the first touch-sensitive display to the second touch-sensitive display.
7. The medium of claim 6, wherein the conversion indicator can neither receive dynamic user input nor provide dynamic user output, wherein the conversion indicator has an appearance different from the window, and wherein, before the window is moved so as to completely cover the second touch-sensitive display, the conversion indicator covers only a portion of the second touch-sensitive display.
8. The medium of claim 6, wherein, when the gesture is received, the conversion indicator is not in a display image stack associated with the first touch-sensitive display and the second touch-sensitive display, wherein, before the window movement begins, the window and the conversion indicator are simultaneously in active display positions of the first touch-sensitive display and the second touch-sensitive display, respectively, and wherein the conversion indicator includes a user-configured color, pattern, design, and/or photograph.
9. The medium of claim 6, wherein the conversion indicator has a graphical appearance, wherein the window is controlled by a multi-screen application, and wherein, during the movement from the first touch-sensitive display to the second touch-sensitive display, the conversion indicator does not respond to user commands or requests.
10. A dual-screen communication device, comprising:
a first screen, including:
a first touch-sensitive display operable to display a displayed image, wherein the displayed image is a window of an application; and
a first gesture capture region operable to receive a drag gesture, wherein the first gesture capture region is physically separate from the first touch-sensitive display on the first screen, and wherein the first gesture capture region does not provide a display; and
a second screen hinged to the first screen, the second screen including:
a second touch-sensitive display operable to display a displayed image; and
a second gesture capture region operable to receive the drag gesture, wherein the second gesture capture region is physically separate from the second touch-sensitive display on the second screen, and wherein the second gesture capture region does not provide a display; and
middleware operable to perform at least the following operations:
receiving, by the first gesture capture region, the drag gesture indicating that the window of the application displaying active content is to be moved from the first touch-sensitive display to the second touch-sensitive display; and
in response to receiving the drag gesture and prior to the movement of the window to the second touch-sensitive display:
displaying movement of at least a portion of a conversion indicator from the first touch-sensitive display to the second touch-sensitive display, wherein the conversion indicator is displayed as moving at substantially the same speed as the movement of the drag gesture, wherein the conversion indicator is an image that cannot receive user input or provide dynamic output, and wherein the conversion indicator is the same size and the same shape as the window;
while displaying the movement of at least a portion of the conversion indicator from the first touch-sensitive display to the second touch-sensitive display, continuing to display the active content of the window on the first touch-sensitive display; and
when the conversion indicator is displayed up to a predetermined point on the second touch-sensitive display, displaying, in response to the movement of the drag gesture, movement of the window from the first touch-sensitive display to the second touch-sensitive display.
11. The device of claim 10, wherein the displayed image is a minimized window, and wherein the window is maximized to cover at least a portion of the first touch-sensitive display and the second touch-sensitive display.
12. The device of claim 10, wherein the conversion indicator moves at a speed and in a direction associated with the drag gesture, wherein the conversion indicator moves to a selected position along the travel path of the displayed image, to preview the expansion of the displayed image, and wherein the conversion indicator is substantially the same size and the same shape as the displayed image.
13. The device of claim 10, wherein the conversion indicator can neither receive dynamic user input nor provide dynamic user output, and wherein the conversion indicator has an appearance different from the displayed image.
14. The device of claim 10, wherein, when the gesture is received, the conversion indicator is not in a display image stack associated with the first touch-sensitive display and the second touch-sensitive display, wherein, before the expansion of the displayed image begins, the displayed image and the conversion indicator are simultaneously in active display positions of the first touch-sensitive display and the second touch-sensitive display, respectively, and wherein the conversion indicator includes a user-configured color, pattern, design, and/or photograph.
15. The device of claim 10, wherein the conversion indicator has a graphical appearance, wherein the window is controlled by a multi-screen application, and wherein, during the expansion from the first touch-sensitive display to the second touch-sensitive display, the conversion indicator does not respond to user commands or requests.
CN201210458810.2A 2011-09-01 2012-09-03 The method of moving window and dual screen communicator between multi-screen device Active CN103116460B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810310376.0A CN108228035B (en) 2011-09-01 2012-09-03 Method for moving window between multi-screen devices and dual-display communication device

Applications Claiming Priority (5)

Application Number Priority Date Filing Date Title
US13/223,778 US20120081309A1 (en) 2010-10-01 2011-09-01 Displayed image transition indicator
US13/223,778 2011-09-01
US38911710A 2011-10-01 2011-10-01
US38908710A 2011-10-01 2011-10-01
US38900010A 2011-10-01 2011-10-01

Related Child Applications (1)

Application Number Title Priority Date Filing Date
CN201810310376.0A Division CN108228035B (en) 2011-09-01 2012-09-03 Method for moving window between multi-screen devices and dual-display communication device

Publications (2)

Publication Number Publication Date
CN103116460A CN103116460A (en) 2013-05-22
CN103116460B true CN103116460B (en) 2018-05-04

Family

ID=48428813

Family Applications (2)

Application Number Title Priority Date Filing Date
CN201810310376.0A Active CN108228035B (en) 2011-09-01 2012-09-03 Method for moving window between multi-screen devices and dual-display communication device
CN201210458810.2A Active CN103116460B (en) 2011-09-01 2012-09-03 The method of moving window and dual screen communicator between multi-screen device

Family Applications Before (1)

Application Number Title Priority Date Filing Date
CN201810310376.0A Active CN108228035B (en) 2011-09-01 2012-09-03 Method for moving window between multi-screen devices and dual-display communication device

Country Status (1)

Country Link
CN (2) CN108228035B (en)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10664162B2 (en) 2013-11-18 2020-05-26 Red Hat, Inc. Multiple display management
CN109375890B (en) * 2018-09-17 2022-12-09 维沃移动通信有限公司 Screen display method and multi-screen electronic equipment
US11157047B2 (en) * 2018-11-15 2021-10-26 Dell Products, L.P. Multi-form factor information handling system (IHS) with touch continuity across displays
CN114168047B (en) * 2019-08-22 2022-08-26 华为技术有限公司 Application window processing method and device
US20210216102A1 (en) * 2020-01-10 2021-07-15 Microsoft Technology Licensing, Llc Conditional windowing model for foldable computing devices
CN115617295B (en) * 2022-10-21 2024-10-18 武汉海微科技股份有限公司 Multi-screen interaction method, device, equipment and storage medium

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7739604B1 (en) * 2002-09-25 2010-06-15 Apple Inc. Method and apparatus for managing windows
CN101827503A (en) * 2009-03-03 2010-09-08 Lg电子株式会社 Portable terminal

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5434965A (en) * 1992-12-23 1995-07-18 Taligent, Inc. Balloon help system
US7176943B2 (en) * 2002-10-08 2007-02-13 Microsoft Corporation Intelligent windows bumping method and system
US20090278806A1 (en) * 2008-05-06 2009-11-12 Matias Gonzalo Duarte Extended touch-sensitive control area for electronic device
US8860632B2 (en) * 2008-09-08 2014-10-14 Qualcomm Incorporated Multi-panel device with configurable interface
JP2011022842A (en) * 2009-07-16 2011-02-03 Sony Corp Display apparatus, display method, and program
CN101847075A (en) * 2010-01-08 2010-09-29 宏碁股份有限公司 Multi-screen electronic device and image display method thereof

Also Published As

Publication number Publication date
CN108228035B (en) 2021-05-04
CN103116460A (en) 2013-05-22
CN108228035A (en) 2018-06-29

Similar Documents

Publication Publication Date Title
US11537259B2 (en) Displayed image transition indicator
CN103430137B (en) Double screen user equipment and the method for controlling the size of double screen application program thereon
JP6073792B2 (en) Method and system for viewing stacked screen displays using gestures
CN103262010B (en) Appeared by the desktop utilizing gesture to move logic display storehouse
CN102999309B (en) Multihead display controls
CN103116460B (en) The method of moving window and dual screen communicator between multi-screen device
CN103282955B (en) Show desktop when the device is opened
JP6073793B2 (en) Desktop display simultaneously with device release

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant