CN103116460A - Conversion indicator of display image - Google Patents


Info

Publication number
CN103116460A
Authority
CN
China
Prior art keywords
touch
sensitive display
window
gesture
display
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN2012104588102A
Other languages
Chinese (zh)
Other versions
CN103116460B (en)
Inventor
S. Sirpal
A. de Paz
Current Assignee
Flex Electronics Id Co ltd
Original Assignee
Flex Electronics Id Co ltd
Priority date
Filing date
Publication date
Priority claimed from US 13/223,778 (published as US 2012/0081309 A1)
Application filed by Flex Electronics Id Co ltd
Priority to CN201810310376.0A (published as CN108228035B)
Publication of CN103116460A
Application granted
Publication of CN103116460B
Legal status: Active
Anticipated expiration

Landscapes

  • User Interface Of Digital Computer (AREA)

Abstract

The invention discloses a dual-display communication device comprising a gesture capture region for receiving gestures; a first touch-sensitive display for receiving gestures and displaying display images, e.g., desktops or windows of applications; a second touch-sensitive display for receiving gestures and displaying display images; and middleware. The middleware receives a gesture instructing that a display image be moved from the first touch-sensitive display to the second touch-sensitive display, e.g., that a window be maximized so as to simultaneously cover at least part of both displays; in response to, and prior to, the movement of the display image to the second touch-sensitive display, moves a transition indicator from the first touch-sensitive display toward the second touch-sensitive display to a selected position to be occupied by the display image; and thereafter moves the display image from the first touch-sensitive display toward the second touch-sensitive display to the selected position.

Description

Displaying an image transition indicator
Cross-reference to related applications
This application claims the benefit of and priority to, under 35 U.S.C. § 119(e), U.S. Provisional Application Serial Nos. 61/389,000, filed October 1, 2010, entitled "DUAL DISPLAY WINDOWING SYSTEM"; 61/389,117, filed October 1, 2010, entitled "MULTI-OPERATING SYSTEM PORTABLE DOCKETING DEVICE"; and 61/389,087, filed October 1, 2010, entitled "TABLET COMPUTING USER INTERFACE". Each of the aforementioned documents is incorporated herein by reference in its entirety for all purposes.
Technical field
The present invention relates to displaying an image transition indicator.
Background
Most handheld computing devices, such as cellular phones, tablets, and e-readers, use a touch-screen display not only to deliver display information to the user but also to receive inputs from user interface commands. While touch-screen displays may increase the configurability of a handheld device and provide a wide variety of user interface options, this flexibility typically comes at a price. The dual use of a touch screen to provide content and receive user commands, while flexible for the user, can leave the display cluttered and cause visual confusion, leading to user frustration and lost productivity.
The small form factor of handheld computing devices requires a careful balance between the displayed graphics and the area provided for receiving inputs. On the one hand, a small display constrains the display space, which may increase the difficulty of interpreting actions or results. On the other hand, a virtual keypad or other user interface mechanism is superimposed on, or positioned adjacent to, an executing application, requiring the application to be squeezed into an ever-smaller portion of the display.
This balancing act is particularly difficult for single-display touch-screen devices. Single-display touch-screen devices are hampered by their limited screen space. When users enter information into the device through the single display, the ability to interpret information on the display may be severely hindered, particularly when complex interaction between the display and the interface is required.
Summary of the invention
There is a need for a dual multi-display handheld computing device that provides enhanced power and/or versatility compared to conventional single-display handheld computing devices. These and other needs are addressed by the various aspects, embodiments, and/or configurations of the present disclosure. Also, while the disclosure is presented in terms of exemplary embodiments, it should be appreciated that individual aspects of the disclosure can be separately claimed.
In one embodiment, a method provides the steps of:
(a) receiving, by a gesture capture region and/or a touch-sensitive display, a gesture indicating that a display image is to be moved from a first touch-sensitive display to a second touch-sensitive display;
(b) in response to, and prior to, the movement of the display image to the second touch-sensitive display, moving, by a microprocessor, a transition indicator from the first touch-sensitive display toward the second touch-sensitive display to a selected position to be occupied by the display image; and
(c) thereafter moving, by the microprocessor, the display image from the first touch-sensitive display toward the second touch-sensitive display to the selected position.
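For illustration only, the ordering of steps (a)-(c) can be sketched as a small event sequence: the transition indicator reaches the selected position on the second display first, and the display image follows afterward. This is a minimal Python sketch under assumed names (`Display`, `move_with_indicator`, the event tuples); none of these identifiers come from the patent itself.

```python
# Minimal sketch of steps (a)-(c): a transition indicator is moved to the
# target position first, and the display image follows afterward.
# All names and data shapes here are hypothetical.

class Display:
    def __init__(self, name):
        self.name = name
        self.contents = []          # display images currently shown

def move_with_indicator(image, src, dst, target_pos):
    """Return the ordered events produced while moving `image`."""
    events = []
    # (b) the indicator previews the move, ahead of the image itself
    events.append(("indicator_at", dst.name, target_pos))
    # (c) thereafter, the display image is moved to the selected position
    src.contents.remove(image)
    dst.contents.append(image)
    events.append(("image_at", dst.name, target_pos))
    return events

first = Display("first")
second = Display("second")
first.contents.append("window-1")

events = move_with_indicator("window-1", first, second, (0, 0))
assert events[0][0] == "indicator_at"   # indicator arrives before...
assert events[1][0] == "image_at"       # ...the image itself
assert "window-1" in second.contents and "window-1" not in first.contents
```

The only point the sketch makes is the temporal ordering: the indicator event strictly precedes the image-move event, mirroring the "in response to, and prior to" language of step (b).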
In one embodiment, a dual-screen communication device includes:
(i) a gesture capture region to receive a gesture;
(ii) a first touch-sensitive display to receive a gesture and display a display image, wherein the display image is a desktop and/or a window of an application;
(iii) a second touch-sensitive display to receive a gesture and display a display image; and
(iv) middleware operable to perform one or more of the following operations:
(A) receiving a gesture indicating that a display image is to be expanded from the first touch-sensitive display to cover a majority, if not all, of the second touch-sensitive display;
(B) in response to, and prior to, the expansion of the display image to the second touch-sensitive display, expanding a transition indicator to cover a majority, if not all, of the second touch-sensitive display; and
(C) thereafter expanding the display image to the second touch-sensitive display.
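Operations (A)-(C) turn on whether the expanded image covers "a majority, if not all" of the second display. That coverage condition can be checked with toy one-dimensional geometry; the following sketch uses hypothetical names and pixel values chosen purely for illustration.

```python
# Toy 1-D geometry for the "majority, if not all" coverage condition in
# operations (A)-(C). Displays and windows are (left_edge, width) pairs;
# all names and values are hypothetical.

def overlap_fraction(win, disp):
    """Fraction of display `disp` horizontally covered by window `win`."""
    left = max(win[0], disp[0])
    right = min(win[0] + win[1], disp[0] + disp[1])
    return max(0, right - left) / disp[1]

# Two side-by-side 480-px-wide displays; a window expanded to span 900 px.
first, second = (0, 480), (480, 480)
window = (0, 900)

assert overlap_fraction(window, first) == 1.0         # fully covers the first
assert overlap_fraction(window, second) == 420 / 480  # most of the second
assert overlap_fraction(window, second) > 0.5         # "a majority, if not all"
```

In a real implementation the same test would be run twice, once for the transition indicator in operation (B) and once for the display image itself in operation (C).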
In one configuration, the display image is a desktop, and the gesture is received by the touch-sensitive display.
In one configuration, the display image is a window, the gesture is received by the gesture capture region, and the gesture capture region is incapable of displaying any display image.
In one configuration, the window is minimized on the first touch-sensitive display, and the gesture maximizes the window to cover a majority of the first and second touch-sensitive displays.
In one configuration, the transition indicator moves to the selected position along the travel path of the display image to preview the movement of the display image, and the size and shape of the transition indicator are substantially the same as those of the display image.
In one configuration, the transition indicator can neither receive nor provide dynamic user input/output, and the transition indicator has an appearance different from that of the display image.
In one configuration, the display image and the transition indicator are simultaneously in active display positions of the first and second touch-sensitive displays, respectively, before the movement of the display image begins, and the transition indicator comprises a user-configured color, pattern, design, and/or photograph.
In one configuration, the transition indicator is a graphical affordance, the window is controlled by a multi-screen application, and during the movement from the first touch-sensitive display to the second touch-sensitive display the transition indicator is substantially non-responsive to user commands or requests.
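The indicator properties recited in the configurations above (same size and shape as the display image, visually distinct, user-configurable appearance, non-responsive to input) can be summarized in a small data model. This is an illustrative sketch; the class and field names are assumptions, not part of the disclosure.

```python
# Sketch of the transition indicator's recited properties: substantially the
# same size/shape as the display image, a distinct user-configured appearance,
# and no dynamic user I/O. Class and field names are hypothetical.

from dataclasses import dataclass

@dataclass
class DisplayImage:
    width: int
    height: int
    interactive: bool = True     # a window accepts user input/output

@dataclass
class TransitionIndicator:
    width: int
    height: int
    appearance: str = "user-configured color/pattern/design/photo"
    interactive: bool = False    # neither receives nor provides dynamic I/O

def make_indicator(image: DisplayImage) -> TransitionIndicator:
    # substantially the same size and shape as the display image
    return TransitionIndicator(image.width, image.height)

win = DisplayImage(480, 800)
ind = make_indicator(win)
assert (ind.width, ind.height) == (win.width, win.height)
assert not ind.interactive and win.interactive
```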
The present disclosure can provide a number of advantages depending on the particular aspect, embodiment, and/or configuration. The transition indicator can provide a more aesthetically pleasing and simplified user interface. It can also be used where the actual view cannot be presented during the transition due to processing delays or other problems. This can avoid an otherwise empty and visually dull display.
These and other advantages will be apparent from the disclosure.
The phrases "at least one", "one or more", and "and/or" are open-ended expressions that are both conjunctive and disjunctive in operation. For example, each of the expressions "at least one of A, B and C", "at least one of A, B, or C", "one or more of A, B, and C", "one or more of A, B, or C", and "A, B, and/or C" means A alone, B alone, C alone, A and B together, A and C together, B and C together, or A, B, and C together.
The term "a" or "an" entity refers to one or more of that entity. As such, the terms "a" (or "an"), "one or more", and "at least one" can be used interchangeably herein. It is also to be noted that the terms "comprising", "including", and "having" can be used interchangeably.
The term "automatic" and variations thereof, as used herein, refer to any process or operation done without material human input when the process or operation is performed. However, a process or operation can be automatic, even though performance of the process or operation uses material or immaterial human input, if the input is received before performance of the process or operation. Human input is deemed to be material if such input influences how the process or operation will be performed. Human input that consents to the performance of the process or operation is not deemed to be "material".
The term "computer-readable medium" as used herein refers to any tangible storage and/or transmission medium that participates in providing instructions to a processor for execution. Such a medium may take many forms, including but not limited to non-volatile media, volatile media, and transmission media. Non-volatile media includes, for example, NVRAM or magnetic or optical disks. Volatile media includes dynamic memory, such as main memory. Common forms of computer-readable media include, for example, a floppy disk, a flexible disk, a hard disk, magnetic tape or any other magnetic medium, a magneto-optical medium, a CD-ROM, any other optical medium, punch cards, paper tape, any other physical medium with patterns of holes, a RAM, a PROM, an EPROM, a FLASH-EPROM, a solid-state medium such as a memory card, any other memory chip or cartridge, a carrier wave as described hereinafter, or any other medium from which a computer can read. A digital file attachment to an e-mail or other self-contained information archive or set of archives is considered a distribution medium equivalent to a tangible storage medium. When the computer-readable medium is configured as a database, it is to be understood that the database may be any type of database, such as relational, hierarchical, object-oriented, and/or the like. Accordingly, the disclosure is considered to include a tangible storage medium or distribution medium and prior-art-recognized equivalents and successor media, in which the software implementations of the present disclosure are stored.
The term "desktop" refers to a metaphor used to portray systems. A desktop is generally considered a "surface" that typically includes pictures, called icons, widgets, folders, etc. that can activate and show applications, windows, cabinets, files, folders, documents, and other graphical items. The icons are generally selectable to initiate a task through user interface interaction, allowing a user to execute applications or conduct other operations.
The term "display" refers to a portion of a screen used to display the output of a computer to a user.
The term "display image" refers to an image produced on the display. A typical display image is a window or a desktop. The display image may occupy all or a portion of the display.
The term "display orientation" refers to the way in which a rectangular display is oriented by a user for viewing. The two most common types of display orientation are portrait and landscape. In landscape mode, the display is oriented such that the width of the display is greater than the height of the display (for example, a 4:3 ratio, which is four units wide and three units tall, or a 16:9 ratio, which is sixteen units wide and nine units tall). Stated differently, in landscape mode the longer dimension of the display is oriented substantially horizontally, while the shorter dimension is oriented substantially vertically. In portrait mode, by contrast, the display is oriented such that the width of the display is less than the height of the display. Stated differently, in portrait mode the shorter dimension of the display is oriented substantially horizontally, while the longer dimension is oriented substantially vertically. A multi-screen device can have one composite display that encompasses all of its screens. The composite display can have different display characteristics based on the various orientations of the device.
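The portrait/landscape rule just stated reduces to a single comparison of width against height. A minimal helper, with hypothetical names (the square/equal case is not addressed by the definition; this sketch arbitrarily treats it as portrait):

```python
# The portrait/landscape rule above as a one-line helper. The function name
# is hypothetical; a width equal to the height is treated as portrait here,
# an assumption the definition itself does not settle.

def orientation(width: int, height: int) -> str:
    """Landscape when the display is wider than tall; portrait otherwise."""
    return "landscape" if width > height else "portrait"

assert orientation(4, 3) == "landscape"    # 4:3 — four units wide, three tall
assert orientation(16, 9) == "landscape"   # 16:9
assert orientation(3, 4) == "portrait"     # width less than height
```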
The term "gesture" refers to a user action that expresses an intended idea, action, meaning, result, and/or outcome. The user action can include manipulating a device (e.g., opening or closing the device, changing the device's orientation, moving a trackball or wheel, etc.), movement of a body part relative to the device, movement of an implement or tool relative to the device, audio input, etc. A gesture may be made on a device (such as on a screen) or with the device to interact with the device.
The term "module" as used herein refers to any known or later-developed hardware, software, firmware, artificial intelligence, fuzzy logic, or combination of hardware and software that is capable of performing the functionality associated with that element.
The term "gesture capture" refers to the sensing or otherwise detecting of an instance and/or type of user gesture. Gesture capture can occur in one or more areas of a screen. A gesture region can be on the display, where it may be referred to as a touch-sensitive display, or off the display, where it may be referred to as a gesture capture region.
" multi-screen application " refers to represent the application that can occupy simultaneously one or more windows of a plurality of screens.Usually, multi-screen is used and can be operated in the single screen mode, wherein, only shows one or more windows of this application on a screen, perhaps operates in the multi-screen mode, wherein, shows simultaneously one or more windows on a plurality of screens.
" single screen application " refers to represent at every turn and only can occupy the application of one or more windows of single screen.
The terms "screen", "touch screen", or "touchscreen" refer to a physical structure that enables the user to interact with the computer by touching areas on the screen and that provides information to the user through a display. The touch screen may sense user contact in a number of different ways, such as by a change in an electrical parameter (e.g., resistance or capacitance), acoustic wave variations, infrared radiation proximity detection, light variation detection, and the like. In a resistive touch screen, for example, normally separated conductive and resistive metallic layers in the screen pass an electrical current. When a user touches the screen, the two layers make contact at the touched location, whereby a change in electrical field is noted and the coordinates of the touched location are calculated. In a capacitive touch screen, a capacitive layer stores an electrical charge, which is discharged to the user upon contact with the touch screen, causing a decrease in the charge of the capacitive layer. The decrease is measured, and the coordinates of the touched location are determined. In a surface acoustic wave touch screen, an acoustic wave is transmitted through the screen, and the user contact disturbs the acoustic wave. A receiving transducer detects the user contact and determines the coordinates of the touched location.
The term "window" refers to a, typically rectangular, display image on at least part of a display that contains or provides content different from the rest of the screen. A window may obscure the desktop.
The terms "determine", "calculate", and "compute", and variations thereof, as used herein, are used interchangeably and include any type of methodology, process, mathematical operation, or technique.
It shall be understood that the term "means" as used herein shall be given its broadest possible interpretation in accordance with 35 U.S.C., Section 112, Paragraph 6. Accordingly, a claim incorporating the term "means" shall cover all structures, materials, or acts set forth herein, and all of the equivalents thereof. Further, the structures, materials, or acts and the equivalents thereof shall include all those described in the summary of the invention, brief description of the drawings, detailed description, abstract, and the claims themselves.
The preceding is a simplified summary of the disclosure to provide an understanding of some aspects of the disclosure. This summary is neither an extensive nor an exhaustive overview of the disclosure and its various aspects, embodiments, and/or configurations. It is intended neither to identify key or critical elements of the disclosure nor to delineate the scope of the disclosure, but to present selected concepts of the disclosure in a simplified form as an introduction to the more detailed description presented below. As will be appreciated, other aspects, embodiments, and/or configurations of the disclosure are possible, utilizing, alone or in combination, one or more of the features set forth above or described in detail below.
Description of drawings
Fig. 1A includes a first view of an embodiment of a multi-screen user device;
Fig. 1B includes a second view of an embodiment of a multi-screen user device;
Fig. 1C includes a third view of an embodiment of a multi-screen user device;
Fig. 1D includes a fourth view of an embodiment of a multi-screen user device;
Fig. 1E includes a fifth view of an embodiment of a multi-screen user device;
Fig. 1F includes a sixth view of an embodiment of a multi-screen user device;
Fig. 1G includes a seventh view of an embodiment of a multi-screen user device;
Fig. 1H includes an eighth view of an embodiment of a multi-screen user device;
Fig. 1I includes a ninth view of an embodiment of a multi-screen user device;
Fig. 1J includes a tenth view of an embodiment of a multi-screen user device;
Fig. 2 is a block diagram of an embodiment of the hardware of the device;
Fig. 3A is a block diagram of an embodiment of a state model for the device based on the device's orientation and/or configuration;
Fig. 3B is a table of an embodiment of a state model for the device based on the device's orientation and/or configuration;
Fig. 4A is a first representation of an embodiment of a user gesture received at a device;
Fig. 4B is a second representation of an embodiment of a user gesture received at a device;
Fig. 4C is a third representation of an embodiment of a user gesture received at a device;
Fig. 4D is a fourth representation of an embodiment of a user gesture received at a device;
Fig. 4E is a fifth representation of an embodiment of a user gesture received at a device;
Fig. 4F is a sixth representation of an embodiment of a user gesture received at a device;
Fig. 4G is a seventh representation of an embodiment of a user gesture received at a device;
Fig. 4H is an eighth representation of an embodiment of a user gesture received at a device;
Fig. 5A is a block diagram of an embodiment of the device software and/or firmware;
Fig. 5B is a second block diagram of an embodiment of the device software and/or firmware;
Fig. 6A is a first representation of an embodiment of a device configuration generated in response to a device state;
Fig. 6B is a second representation of an embodiment of a device configuration generated in response to a device state;
Fig. 6C is a third representation of an embodiment of a device configuration generated in response to a device state;
Fig. 6D is a fourth representation of an embodiment of a device configuration generated in response to a device state;
Fig. 6E is a fifth representation of an embodiment of a device configuration generated in response to a device state;
Fig. 6F is a sixth representation of an embodiment of a device configuration generated in response to a device state;
Fig. 6G is a seventh representation of an embodiment of a device configuration generated in response to a device state;
Fig. 6H is an eighth representation of an embodiment of a device configuration generated in response to a device state;
Fig. 6I is a ninth representation of an embodiment of a device configuration generated in response to a device state;
Fig. 6J is a tenth representation of an embodiment of a device configuration generated in response to a device state;
Figs. 7A-F represent a series of screenshots in a portrait display orientation according to an embodiment;
Figs. 8A-E represent a series of screenshots in a landscape display orientation according to an embodiment;
Fig. 9 is a flow chart of an embodiment;
Fig. 10A is a representation of a logical window stack;
Fig. 10B is another representation of an embodiment of a logical window stack;
Fig. 10C is another representation of an embodiment of a logical window stack;
Fig. 10D is another representation of an embodiment of a logical window stack;
Fig. 10E is another representation of an embodiment of a logical window stack;
Fig. 11 is a block diagram of an embodiment of a logical data structure for a window stack;
Fig. 12 is a flow chart of an embodiment of a method for creating a window stack; and
Fig. 13 depicts a window stacking configuration according to an embodiment.
In the appended figures, similar components and/or features may have the same reference label. Further, various components of the same type may be distinguished by following the reference label with a letter that distinguishes among the similar components. If only the first reference label is used in the specification, the description is applicable to any one of the similar components having the same first reference label irrespective of the second reference label.
Embodiment
Presented herein are embodiments of a device. The device can be a communications device, such as a cellular phone, or another smart device. The device can include two screens that are oriented to provide several unique display configurations. Further, the device can receive user input in unique ways. The overall design and functionality of the device provide an enhanced user experience, making the device more useful and more efficient.
Mechanical features:
Figs. 1A-1J illustrate a device 100 in accordance with embodiments of the present disclosure. As described in greater detail below, the device 100 can be positioned in a number of different ways, each of which provides different functionality to a user. The device 100 is a multi-screen device that includes a primary screen 104 and a secondary screen 108, both of which are touch-sensitive. In embodiments, the entire front surface of screens 104 and 108 may be touch-sensitive and capable of receiving input by a user touching the front surface of the screens 104 and 108. The primary screen 104 includes a touch-sensitive display 110, which, in addition to being touch-sensitive, also displays information to a user. The secondary screen 108 includes a touch-sensitive display 114, which also displays information to a user. In other embodiments, screens 104 and 108 may include more than one display area.
The primary screen 104 also includes a configurable area 112 that has been configured for specific inputs when the user touches portions of the configurable area 112. The secondary screen 108 also includes a configurable area 116 that has been configured for specific inputs. Areas 112a and 116a have been configured to receive a "back" input indicating that a user would like to view information previously displayed. Areas 112b and 116b have been configured to receive a "menu" input indicating that the user would like to view options from a menu. Areas 112c and 116c have been configured to receive a "home" input indicating that the user would like to view information associated with a "home" view. In other embodiments, areas 112a-c and 116a-c may be configured, in addition to the configurations described above, for other types of specific inputs, including controlling features of the device 100; some non-limiting examples include adjusting overall system power, adjusting the volume, adjusting the brightness, adjusting the vibration, selecting displayed items (on either of screens 104 or 108), operating a camera, operating a microphone, and initiating/terminating telephone calls. Also, in some embodiments, areas 112a-c and 116a-c may be configured for specific inputs depending upon the application running on device 100 and/or information displayed on touch-sensitive displays 110 and/or 114.
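The mapping from configurable areas to their configured inputs, including the application-specific reconfiguration mentioned above, can be pictured as a simple lookup table. This is a hypothetical sketch; the dictionary shape and function name are assumptions for illustration only.

```python
# Sketch of areas 112a-c / 116a-c as a lookup table from area to its
# configured input, with per-application reconfiguration. All names and
# the table structure are hypothetical.

DEFAULT_AREA_ACTIONS = {
    "112a": "back", "116a": "back",      # view previously displayed info
    "112b": "menu", "116b": "menu",      # view options from a menu
    "112c": "home", "116c": "home",      # view info associated with "home"
}

def handle_area_touch(area, actions=DEFAULT_AREA_ACTIONS):
    return actions.get(area, "unconfigured")

assert handle_area_touch("112a") == "back"
assert handle_area_touch("116c") == "home"
# an application may reconfigure an area for a specific input
assert handle_area_touch("112b", {"112b": "toggle_camera"}) == "toggle_camera"
```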
In addition to touch sensing, the primary screen 104 and the secondary screen 108 may also include areas that receive input from a user without requiring the user to touch the display area of the screen. For example, the primary screen 104 includes a gesture capture region 120, and the secondary screen 108 includes a gesture capture region 124. These areas are able to receive input by recognizing gestures made by a user without the need for the user to actually touch the surface of the display area. In comparison to the touch-sensitive displays 110 and 114, the gesture capture regions 120 and 124 are commonly not capable of rendering a display image.
The two screens 104 and 108 are connected together by a hinge 128, shown clearly in Fig. 1C (illustrating a back view of device 100). In the embodiment shown in Figs. 1A-1J, hinge 128 is a center hinge that connects screens 104 and 108 so that when the hinge is closed, screens 104 and 108 are juxtaposed (i.e., side by side), as shown in Fig. 1B (illustrating a front view of device 100). Hinge 128 can be opened to position the two screens 104 and 108 in different relative positions to each other. As described in greater detail below, the device 100 may have different functionality depending on the relative positions of screens 104 and 108.
Fig. 1D illustrates the right side of device 100. As shown in Fig. 1D, secondary screen 108 also includes a card slot 132 and a port 136 on its side. Card slot 132, in embodiments, accommodates different types of cards, including a subscriber identity module (SIM). Port 136, in embodiments, is an input/output port (I/O port) that allows device 100 to be connected to other peripheral devices, such as a display, keyboard, or printing device. As can be appreciated, these are merely some examples, and in other embodiments device 100 may include other slots and ports, such as slots and ports for accommodating additional memory devices and/or for connecting other peripheral devices. Also shown in Fig. 1D is an audio jack 140 that accommodates, for example, a tip, ring, sleeve (TRS) connector, allowing a user to utilize headphones or a headset.
Device 100 also includes a number of buttons 158. For example, Fig. 1E illustrates the left side of device 100. As shown in Fig. 1E, the side of primary screen 104 includes three buttons 144, 148, and 152, which can be configured for specific inputs. For example, buttons 144, 148, and 152 may be configured, in combination or alone, to control a number of aspects of device 100. Some non-limiting examples include overall system power, volume, brightness, vibration, selection of displayed items (on either of screens 104 or 108), a camera, a microphone, and initiating/terminating telephone calls. In some embodiments, instead of separate buttons, two buttons may be combined into a rocker button. This arrangement is useful in situations where the buttons are configured to control features such as volume or brightness. In addition to buttons 144, 148, and 152, device 100 also includes a button 156, shown in Fig. 1F, which illustrates the top of device 100. In one embodiment, button 156 is configured as an on/off button used to control overall system power of device 100. In other embodiments, button 156 is configured, in addition to or in lieu of controlling system power, to control other aspects of device 100. In some embodiments, one or more of the buttons 144, 148, 152, and 156 are capable of supporting different user commands. By way of example, a normal press generally has a duration of less than about 1 second and resembles a quick tap. A medium press generally has a duration of 1 second or more but less than about 12 seconds. A long press generally has a duration of about 12 seconds or more. The function of the buttons is normally specific to the application that is currently in focus on the respective displays 110 and 114. In a telephone application, for instance, and depending on the particular button, a normal, medium, or long press can mean ending a call, increasing the call volume, decreasing the call volume, and toggling microphone muting. In a camera or video application, for instance, and depending on the particular button, a normal, medium, or long press can mean increasing zoom, decreasing zoom, and taking a photograph or recording video.
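The three press types described above can be classified by duration alone. The thresholds (about 1 second and about 12 seconds) come from the text; the function name and the example action table are hypothetical.

```python
# The normal/medium/long press classification described above. Thresholds
# (1 s and 12 s) are taken from the text; names and the example action
# mapping are hypothetical.

def classify_press(duration_s: float) -> str:
    if duration_s < 1.0:
        return "normal"       # akin to a quick tap
    if duration_s < 12.0:
        return "medium"
    return "long"

# e.g., in a telephone application one button might map press types to:
PHONE_ACTIONS = {"normal": "end_call", "medium": "raise_volume", "long": "mute_mic"}

assert classify_press(0.3) == "normal"
assert classify_press(5.0) == "medium"
assert classify_press(15.0) == "long"
assert PHONE_ACTIONS[classify_press(0.3)] == "end_call"
```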
Device 100 also includes a number of hardware components within its interior. As illustrated in Fig. 1C, device 100 includes a speaker 160 and a microphone 164. Device 100 also includes a camera 168 (Fig. 1B). Additionally, device 100 includes two position sensors 172A and 172B, which are used to determine the relative positions of screens 104 and 108. In one embodiment, position sensors 172A and 172B are Hall effect sensors. However, in other embodiments other sensors can be used in addition to or in lieu of the Hall effect sensors. An accelerometer 176 may also be included as part of device 100 to determine the orientation of the device 100 and/or the orientation of screens 104 and 108. Additional internal hardware components that may be included in device 100 are described below with respect to Fig. 2.
The overall design of device 100 allows it to provide additional functionality not available in other communication devices. Some of this functionality is based on the various positions and orientations that device 100 can have. As shown in Figs. 1B-1G, device 100 can be operated in an "open" position where screens 104 and 108 are juxtaposed. This position provides a large display area for showing information to the user. When position sensors 172A and 172B determine that device 100 is in the open position, they can generate signals that can be used to trigger different events, for example displaying information on both screens 104 and 108. Additional events can be triggered if accelerometer 176 determines that device 100 is in a portrait position (Fig. 1B) as opposed to a landscape position (not shown).
In addition to the open position, device 100 can also have a "closed" position, illustrated in Fig. 1H. Here again, position sensors 172A and 172B can generate a signal indicating that device 100 is in the closed position. This can trigger an event that results in a change of displayed information on screens 104 and 108. For example, device 100 may be programmed to stop displaying information on one of the screens, e.g., screen 108, since a user can view only one screen at a time when device 100 is in the closed position. In other embodiments, the signal generated by position sensors 172A and 172B, indicating that device 100 is in the closed position, can trigger device 100 to answer an incoming telephone call. The closed position can also be a preferred position for using device 100 as a mobile phone.
Device 100 can also be used in a "support" position, illustrated in Fig. 1I. In the support position, screens 104 and 108 are angled with respect to each other and facing outward, with the edges of screens 104 and 108 substantially horizontal. In this position, device 100 can be configured to display information on both screens 104 and 108, allowing two users to simultaneously interact with device 100. When device 100 is in the support position, sensors 172A and 172B generate a signal indicating that screens 104 and 108 are positioned at an angle to each other, and accelerometer 176 can generate a signal indicating that device 100 has been placed so that the edges of screens 104 and 108 are substantially horizontal. These signals can then be used in combination to generate events that trigger changes in the display of information on screens 104 and 108.
Fig. 1J illustrates device 100 in a "modified support" position. In the modified support position, one of screens 104 or 108 is used as a stand and is faced down on the surface of an object such as a table. This position provides a convenient way to display information to a user in landscape orientation. Similar to the support position, when device 100 is in the modified support position, position sensors 172A and 172B generate a signal indicating that screens 104 and 108 are positioned at an angle to each other. Accelerometer 176 generates a signal indicating that device 100 has been positioned so that one of screens 104 and 108 is faced downwardly and is substantially horizontal. The signals can then be used to generate events that trigger changes in the display of information on screens 104 and 108. For example, information may no longer be displayed on the face-down screen, since the user cannot see it.
Transitional states are also possible. When position sensors 172A and 172B and/or the accelerometer indicate that the screens are being closed or folded (from the open position), a closing transitional state is recognized. Conversely, when position sensors 172A and 172B indicate that the screens are being opened or unfolded (from the closed position), an opening transitional state is recognized. The closing and opening transitional states are typically time-based, i.e., they have a maximum duration measured from the start of sensing. Normally, no user input is possible while one of the closing or opening states is in effect. In this manner, incidental user contact with a screen during the closing or opening operation is not misinterpreted as user input. In embodiments, another transitional state is possible when device 100 is closed. This additional transitional state allows the display to switch from one screen 104 to the second screen 108 when device 100 is closed, based on some user input, for example a double tap on screen 110, 114.
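The opening/closing transition logic above can be sketched as a small state machine that ignores touch input while a transition is in progress and times out after a maximum duration. The state names, method names, and the 1-second timeout are illustrative assumptions; the text only says the transitional states are time-based with a maximum duration.

```python
import time

TRANSITION_TIMEOUT_S = 1.0  # assumed maximum transition duration

class HingeStateMachine:
    def __init__(self):
        self.state = "closed"
        self._transition_start = None

    def on_sensor(self, folding_angle_increasing: bool) -> None:
        """Position sensors report that the fold angle is changing."""
        if self.state in ("open", "closed"):
            self.state = ("opening_transition" if folding_angle_increasing
                          else "closing_transition")
            self._transition_start = time.monotonic()

    def on_settled(self, now_open: bool) -> None:
        """Sensors report that the device reached a stable position."""
        self.state = "open" if now_open else "closed"
        self._transition_start = None

    def timed_out(self) -> bool:
        """Transitional states have a maximum duration (see text)."""
        return (self._transition_start is not None
                and time.monotonic() - self._transition_start > TRANSITION_TIMEOUT_S)

    def accepts_user_input(self) -> bool:
        # Incidental touches while opening/closing are not user input.
        return self.state in ("open", "closed")
```

This mirrors the described behavior: touch events are only interpreted as input when the device is settled in the open or closed state.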
As can be appreciated, the description of device 100 is made for illustrative purposes only, and the embodiments are not limited to the specific mechanical features shown in Figs. 1A-1J and described above. In other embodiments, device 100 may include additional features, including one or more additional buttons, slots, display areas, hinges, and/or locking mechanisms. Additionally, in embodiments, the features described above may be located in different parts of device 100 and still provide similar functionality. Therefore, Figs. 1A-1J and the description provided above are non-limiting.
Hardware features:
Fig. 2 illustrates components of device 100 in accordance with embodiments of the present disclosure. In general, device 100 includes a primary screen 104 and a secondary screen 108. While the primary screen 104 and its components are normally enabled in both the open and closed positions or states, the secondary screen 108 and its components are normally enabled in the open state but disabled in the closed state. However, even in the closed state, a user- or application-triggered interrupt (e.g., in response to an operation of a phone or camera application) can, by suitable command, flip which screen is active, or disable the primary screen 104 and enable the secondary screen 108. Each screen 104, 108 can be touch-sensitive and can include different operative areas. For example, a first operative area within each touch-sensitive screen 104 and 108 may comprise a touch-sensitive display 110, 114. In general, the touch-sensitive display 110, 114 may comprise a full-color touch-sensitive display. A second area within each touch-sensitive screen 104 and 108 may comprise a gesture capture region 120, 124. The gesture capture region 120, 124 may comprise an area or region that is outside of the touch-sensitive display 110, 114 and that is capable of receiving input, for example in the form of gestures provided by a user. However, the gesture capture region 120, 124 does not include pixels that can perform a display function or capability.
A third area of touch-sensitive screens 104 and 108 may comprise a configurable area 112, 116. The configurable area 112, 116 is capable of receiving input and has display or limited display capabilities. In embodiments, the configurable area 112, 116 may present different input options to the user. For example, the configurable area 112, 116 may display buttons or other related items. Moreover, the identity of the displayed buttons, or whether any buttons are displayed at all within the configurable area 112, 116 of touch-sensitive screens 104 or 108, may be determined from the context in which device 100 is used and/or operated. In an exemplary embodiment, touch-sensitive screens 104 and 108 comprise liquid crystal display devices extending across at least those areas of touch-sensitive screens 104 and 108 that are capable of providing visual output to a user, and capacitive input matrices over those areas of touch-sensitive screens 104 and 108 that are capable of receiving input from the user.
One or more display controllers 216a, 216b may be provided for controlling the operation of touch-sensitive screens 104 and 108, including input (touch-sensing) and output (display) functions. In the exemplary embodiment illustrated in Fig. 2, a separate touch screen controller 216a or 216b is provided for each touch screen 104 and 108. In accordance with alternative embodiments, a common or shared touch screen controller 216 may be used to control each of the included touch-sensitive screens 104 and 108. In accordance with still other embodiments, the functions of a touch screen controller 216 may be incorporated into other components, such as processor 204.
Processor 204 may comprise a general-purpose programmable processor or controller for executing application programming or instructions. In accordance with at least some embodiments, processor 204 may include multiple processor cores and/or implement multiple virtual processors. In accordance with still other embodiments, processor 204 may include multiple physical processors. As particular examples, processor 204 may comprise a specially configured application-specific integrated circuit (ASIC) or other integrated circuit, a digital signal processing (DSP) controller, hardwired electronic or logic circuitry, a programmable logic device or gate array, a special-purpose computer, or the like. In general, processor 204 functions to run program code or instructions implementing the various functions of device 100.
Communication device 100 may also include memory 208 for use in connection with the execution of application programming or instructions by processor 204, and for the temporary or long-term storage of program instructions and/or data. As examples, memory 208 may comprise RAM, DRAM, SDRAM, or other solid-state memory. Alternatively or in addition, data storage 212 may be provided. Like memory 208, data storage 212 may comprise one or more solid-state memory devices. Alternatively or in addition, data storage 212 may comprise a hard disk drive or other random-access memory.
In support of communication functions or capabilities, device 100 can include a cellular telephony module 228. As examples, cellular telephony module 228 can comprise a GSM, CDMA, FDMA and/or analog cellular telephone transceiver capable of supporting voice, multimedia, and/or data transfers over a cellular network. Alternatively or in addition, device 100 can include an additional or other wireless communication module 232. As examples, the other wireless communication module 232 can comprise a Wi-Fi, BLUETOOTH™, WiMax, infrared, or other wireless communication link. Each of the cellular telephony module 228 and the other wireless communication module 232 can be associated with a shared or dedicated antenna 224.
A port interface 252 may be included. Port interface 252 may include proprietary or universal ports to support the interconnection of device 100 to other devices or components, such as a dock, which may or may not include additional or different capabilities from those integral to device 100. In addition to supporting the exchange of communication signals between device 100 and another device or component, the docking port 136 and/or port interface 252 can support the supply of power to or from device 100. Port interface 252 also comprises an intelligent element that includes a docking module for controlling communications or other interactions between device 100 and a connected device or component.
An input/output module 248 and associated ports may be included to support communications over wired networks or links, for example with other communication devices, server devices, and/or peripheral devices. Examples of the input/output module 248 include an Ethernet port, a Universal Serial Bus (USB) port, an Institute of Electrical and Electronics Engineers (IEEE) 1394 interface, or other interface.
An audio input/output interface/device 244 can be included to provide analog audio to an interconnected speaker or other device, and to receive analog audio input from a connected microphone or other device. As an example, the audio input/output interface/device 244 may comprise an associated amplifier and analog-to-digital converter. Alternatively or in addition, device 100 can include an integrated audio input/output device 256 and/or an audio jack for interconnecting an external speaker or microphone. For example, an integrated speaker and integrated microphone can be provided to support near-talk or speakerphone operation.
Hardware buttons 158 can be included, for example for use in connection with certain control operations. Examples include a master power switch, volume control, etc., as described in conjunction with Figs. 1A through 1J. One or more image capture interfaces/devices 240, such as a camera, can be included for capturing still and/or video images. Alternatively or in addition, an image capture interface/device 240 can include a scanner or code reader. An image capture interface/device 240 can include or be associated with additional elements, such as a flash or other light source.
Device 100 can also include a global positioning system (GPS) receiver 236. In accordance with embodiments of the present invention, GPS receiver 236 may further comprise a GPS module capable of providing absolute location information to other components of device 100. An accelerometer 176 can also be included. For example, in connection with the display of information to a user and/or other functions, a signal from accelerometer 176 can be used to determine the orientation and/or format in which information is to be displayed to the user.
Embodiments of the present invention can also include one or more position sensors 172. A position sensor 172 can provide a signal indicating the position of touch-sensitive screens 104 and 108 relative to one another. This information can be provided as an input, for example to a user-interface application, to determine an operating mode, characteristics of the touch-sensitive displays 110, 114, and/or other device 100 operations. As examples, a screen position sensor 172 can comprise a series of Hall effect sensors, a plurality of position switches, light switches, a Wheatstone bridge, a potentiometer, or other arrangement capable of providing a signal indicating any of the multiple relative positions in which the touch screens can reside.
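Combining the relative screen position from the position sensors with the accelerometer reading yields the device postures described earlier (open, closed, support, modified support). A minimal sketch under stated assumptions: the fold-angle representation, the threshold values, and the function name are all illustrative, not taken from the patent.

```python
def detect_posture(fold_angle_deg: float, face_down_screen: bool) -> str:
    """Derive a device posture from sensor-level observations.

    fold_angle_deg: angle between the two screens as reported by the
        position sensors (0 = folded shut, 180 = fully open).
    face_down_screen: accelerometer report that one screen lies flat
        and face down on a surface.
    """
    if fold_angle_deg < 15:
        return "closed"
    if fold_angle_deg > 165:
        return "open"
    # Screens at an intermediate angle: one of the support-style positions.
    return "modified_support" if face_down_screen else "support"
```

A user-interface application could subscribe to changes of this value to trigger the display-change events the text describes.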
Communications between the various components of device 100 can be carried by one or more buses 222. In addition, power can be supplied to the components of device 100 from a power source and/or power control module 260. Power control module 260 can, for example, include a battery, an AC-to-DC converter, power control logic, and/or ports for interconnecting device 100 to an external source of power.
Device state:
Figs. 3A and 3B represent illustrative states of device 100. While a number of illustrative states are shown, along with transitions from a first state to a second state, it is to be appreciated that the illustrative state diagram may not encompass all possible states and/or all possible transitions from a first state to a second state. As illustrated in Fig. 3, the various arrows between the states (illustrated by the states being circled) represent physical changes that occur to device 100. These changes are detected by one or more hardware and software components, and the detection triggers one or more hardware and/or software interrupts that are used to control and/or manage one or more functions of device 100.
As illustrated in Fig. 3A, there are twelve exemplary "physical" states: closed 304, transition 308 (or opening transitional state), support 312, modified support 316, open 320, inbound/outbound call or communication 324, image/video capture 328, transition 332 (or closing transitional state), landscape 340, docked 336, docked 344, and landscape 348. Next to each illustrative state is a representation of the physical state of device 100, with the exception of states 324 and 328, where the state is generally symbolized by the international icons for a telephone and a camera, respectively.
In state 304, the device is in the closed state, with device 100 generally oriented in the portrait direction and the primary screen 104 and the secondary screen 108 back-to-back in different planes (see Fig. 1H). From the closed state, device 100 can enter, for example, docked state 336, where device 100 is coupled with a docking station, a docking cable, or generally docked or associated with one or more other devices or peripherals, or landscape state 340, where device 100 is generally oriented with the primary screen 104 facing the user and the primary screen 104 and the secondary screen 108 being back-to-back.
In the closed state, the device can also move to a transitional state in which, while the device remains closed, the display is moved from one screen 104 to another screen 108 based on a user input, for example a double tap on screen 110, 114. Still another embodiment includes a bilateral state. In the bilateral state, the device remains closed, but a single application displays at least one window on both the first display 110 and the second display 114. The windows shown on the first and second displays 110, 114 may be the same or different, based on the application and the state of that application. For example, while acquiring an image with a camera, the device may display the viewfinder on the first display 110 and a preview of the photo subject (full screen and mirrored left-to-right) on the second display 114.
In state 308, a transitional state from the closed state 304 to the semi-open or support state 312, device 100 is shown opening, with the primary screen 104 and the secondary screen 108 each rotating about a point on an axis coincident with the hinge. Upon entering the support state 312, the primary screen 104 and the secondary screen 108 are separated from one another such that, for example, device 100 can sit on a surface in a support-like structure.
In state 316, known as the modified support position, device 100 has the primary screen 104 and the secondary screen 108 in a relative relationship to one another similar to the support state 312, with the difference being that one of the primary screen 104 or the secondary screen 108 is placed on a surface, as shown.
State 320 is the open state, in which the primary screen 104 and the secondary screen 108 are generally on the same plane. From the open state, device 100 can transition to the docked state 344 or the open landscape state 348. In the open state 320, the primary screen 104 and the secondary screen 108 are generally in a similar portrait orientation, while in landscape state 348 the primary screen 104 and the secondary screen 108 are generally in a similar landscape orientation.
State 324 is illustrative of a communication state, such as when device 100 is receiving or placing an inbound or outbound call, respectively. While not illustrated for clarity, it should be appreciated that device 100 can transition to the inbound/outbound call state 324 from any state illustrated in Fig. 3. In a similar manner, the image/video capture state 328 can be entered from any other state in Fig. 3, the image/video capture state 328 allowing device 100 to take one or more images via a camera and/or capture video with video capture device 240.
Transition state 332 illustratively shows the primary screen 104 and the secondary screen 108 being closed upon one another for entry into, for example, the closed state 304.
Fig. 3B illustrates, with reference to the key, the inputs that are received to detect a transition from a first state to a second state. In Fig. 3B, various combinations of states are shown, with, in general, a portion of the columns being directed toward a portrait state 352 and a landscape state 356, and a portion of the rows being directed toward a portrait state 360 and a landscape state 364.
In Fig. 3B, the key indicates that "H" represents input from one or more Hall effect sensors, "A" represents input from one or more accelerometers, "T" represents input from a timer, "P" represents a communications-triggered input, and "I" represents an image and/or video capture request input. Thus, in the center portion 376 of the chart, an input or combination of inputs is shown that indicates how device 100 detects a transition from a first physical state to a second physical state.
As discussed, in the center portion of the chart 376, the inputs that are received enable detection of a transition from, for example, a portrait open state to a landscape support state (shown in bold as "HAT"). For this exemplary transition from the portrait open state to the landscape support state, a Hall effect sensor ("H"), an accelerometer ("A"), and a timer ("T") input may be needed. The timer input can be derived, for example, from a clock associated with the processor.
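The chart of Fig. 3B can be read as a lookup table: each state transition is detected by a required combination of inputs ("H" Hall effect, "A" accelerometer, "T" timer, "P" communication, "I" image/video capture). The sketch below covers only the bold "HAT" example from the text; the state names and the table structure are illustrative assumptions, and the remaining entries would be filled in from the chart.

```python
# Required input sets per (current state, target state) pair.
REQUIRED_INPUTS = {
    ("portrait_open", "landscape_support"): {"H", "A", "T"},
}

def transition_detected(current: str, target: str, inputs: set) -> bool:
    """True when every input required for the transition has been seen."""
    required = REQUIRED_INPUTS.get((current, target))
    return required is not None and required <= inputs
```

Each detected transition would then raise the hardware/software interrupts used to control device 100, as described for Fig. 3A.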
In addition to the portrait and landscape states, a docked state 368 is also shown, which is triggered based on the receipt of a docking signal 372. As discussed above in relation to Fig. 3, the docking signal can be triggered by the association of device 100 with one or more other devices 100, accessories, peripherals, smart docks, or the like.
User interactions:
Figs. 4A through 4H depict various graphical representations of gesture inputs that may be recognized by screens 104, 108. The gestures may be performed not only by a user's body part, such as a finger, but also by other devices, such as a stylus, that may be sensed by the contact-sensing portion of a screen 104, 108. In general, gestures are interpreted differently based on where the gestures are performed (either directly on display 110, 114 or in gesture capture region 120, 124). For example, gestures in the display 110, 114 may be directed to a desktop or application, while gestures in the gesture capture region 120, 124 may be interpreted as being for the system.
With reference to Figs. 4A-4H, a first type of gesture, a touch gesture 420, is substantially stationary on screen 104, 108 for a selected length of time. A circle 428 represents a touch or other contact type received at a particular location of the contact-sensing portion of the screen. The circle 428 may include a border 432, the thickness of which indicates the length of time that the contact is held substantially stationary at the contact location. For instance, a tap 420 (or short press) has a thinner border 432a than the border 432b for a long press 424 (or normal press). The long press 424 may involve a contact that remains substantially stationary on the screen for a longer period of time than the tap 420. As will be appreciated, differently defined gestures may be registered depending upon the length of time that the touch remains stationary prior to contact cessation or movement on the screen.
With reference to Fig. 4C, a drag gesture 400 on the screen 104, 108 is an initial contact (represented by circle 428) with contact movement 436 in a selected direction. The initial contact 428 may remain stationary on the screen 104, 108 for a certain amount of time, represented by the border 432. The drag gesture typically requires the user to contact an icon, window, or other displayed image at a first location followed by movement of the contact in a drag direction to a new second location desired for the selected displayed image. The contact movement need not be in a straight line but may follow any path of movement, so long as the contact is substantially continuous from the first to the second location.
With reference to Fig. 4D, a flick gesture 404 on the screen 104, 108 is an initial contact (represented by circle 428) with a truncated contact movement 436 (relative to a drag gesture) in a selected direction. In embodiments, a flick has a higher exit velocity for the last movement in the gesture compared to the drag gesture. The flick gesture can, for instance, be a finger snap following the initial contact. Compared to a drag gesture, a flick gesture generally does not require continuous contact with the screen 104, 108 from the first location of a displayed image to a predetermined second location. The contacted displayed image is moved by the flick gesture in the direction of the flick gesture to the predetermined second location. Although both gestures commonly can move a displayed image from a first location to a second location, the temporal duration and distance of travel of the contact on the screen are generally less for a flick than for a drag gesture.
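The drag/flick distinction above reduces to comparing the stroke's duration, travel, and exit velocity. The following sketch makes that concrete; the `Stroke` record and the numeric thresholds are illustrative assumptions rather than values from the patent.

```python
from dataclasses import dataclass

@dataclass
class Stroke:
    duration_s: float          # time from touch-down to lift-off
    distance_px: float         # total contact travel
    exit_velocity_px_s: float  # speed just before lift-off

# Assumed thresholds: a flick is short and ends fast.
FLICK_MIN_EXIT_VELOCITY = 1000.0  # px/s
FLICK_MAX_DURATION_S = 0.3

def classify_stroke(s: Stroke) -> str:
    """A truncated, high exit-velocity movement is a flick; else a drag."""
    if (s.exit_velocity_px_s >= FLICK_MIN_EXIT_VELOCITY
            and s.duration_s <= FLICK_MAX_DURATION_S):
        return "flick"
    return "drag"
```

This matches the text's rule of thumb: a drag tracks the finger to a user-chosen location, while a flick is brief and sends the image to a predetermined location.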
With reference to Fig. 4E, a pinch gesture 408 on the screen 104, 108 is depicted. The pinch gesture 408 may be initiated by a first contact 428a to the screen 104, 108 by, for example, a first finger, and a second contact 428b to the screen 104, 108 by, for example, a second finger. The first and second contacts 428a,b may be detected by a common contact-sensing portion of a common screen 104, 108, by different contact-sensing portions of a common screen 104 or 108, or by different contact-sensing portions of different screens. The first contact 428a is held for a first amount of time, represented by the border 432a, and the second contact 428b is held for a second amount of time, represented by the border 432b. The first and second amounts of time are generally substantially the same, and the first and second contacts 428a,b generally occur substantially simultaneously. The first and second contacts 428a,b generally also include corresponding first and second contact movements 436a,b, respectively. The first and second contact movements 436a,b are generally in opposing directions. Stated another way, the first contact movement 436a is toward the second contact 436b, and the second contact movement 436b is toward the first contact 436a. More simply stated, the pinch gesture 408 may be accomplished by a user's digits touching the screen 104, 108 in a pinching motion.
With reference to Fig. 4F, a spread gesture 410 on the screen 104, 108 is depicted. The spread gesture 410 may be initiated by a first contact 428a to the screen 104, 108 by, for example, a first finger, and a second contact 428b to the screen 104, 108 by, for example, a second finger. The first and second contacts 428a,b may be detected by a common contact-sensing portion of a common screen 104, 108, by different contact-sensing portions of a common screen 104, 108, or by different contact-sensing portions of different screens. The first contact 428a is held for a first amount of time, represented by the border 432a, and the second contact 428b is held for a second amount of time, represented by the border 432b. The first and second amounts of time are generally substantially the same, and the first and second contacts 428a,b generally occur substantially simultaneously. The first and second contacts 428a,b generally also include corresponding first and second contact movements 436a,b, respectively. The first and second contact movements 436a,b are generally in a common direction. Stated another way, the first and second contact movements 436a,b move away from the first and second contacts 428a,b. More simply stated, the spread gesture 410 may be accomplished by a user's digits touching the screen 104, 108 in a spreading motion.
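Telling a pinch from a spread, per the two paragraphs above, comes down to whether the two contact movements bring the contacts together or apart. A minimal geometric sketch, with illustrative names (points are `(x, y)` pixel tuples):

```python
import math

def _dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def classify_two_finger(start1, end1, start2, end2) -> str:
    """Compare finger separation at the start and end of the gesture."""
    before = _dist(start1, start2)
    after = _dist(end1, end2)
    if after < before:
        return "pinch"   # contacts moved toward each other
    if after > before:
        return "spread"  # contacts moved apart
    return "none"
```

Because the rule uses only the change in separation, it works whether the two contacts land on the same contact-sensing portion or on different screens, as the text allows.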
The above gestures may be combined in any manner, such as those shown in Figs. 4G and 4H, to produce a determined functional result. For example, in Fig. 4G a tap gesture 420 is combined with a drag or flick gesture 412 in a direction away from the tap gesture 420. In Fig. 4H, a tap gesture 420 is combined with a drag or flick gesture 412 in a direction toward the tap gesture 420.
The functional result of receiving a gesture can vary depending on a number of factors, including the state of device 100, the display 110, 114 or screen 104, 108, the context associated with the gesture, or the sensed location of the gesture. The state of the device commonly refers to one or more of the configuration of device 100, the display orientation, and the user and other inputs received by device 100. Context commonly refers to one or more of the particular application(s) selected by the gesture and the portion(s) of the application currently executing, whether the application is a single-screen or multi-screen application, and whether the application is a multi-screen application displaying one or more windows in one or more screens or in one or more stacks. The sensed location of the gesture commonly refers to whether the sensed set(s) of gesture location coordinates are on a touch-sensitive display 110, 114 or on a gesture capture region 120, 124, whether the sensed set(s) of gesture location coordinates are associated with a common or different display or screen 104, 108, and/or which portion of the gesture capture region contains the sensed set(s) of gesture location coordinates.
A tap, when received by a touch-sensitive display 110, 114, can be used, for instance, to select an icon to initiate or terminate execution of a corresponding application, to maximize or minimize a window, to reorder windows in a stack, and to provide user input, such as via a keyboard display or other displayed image. A drag, when received by a touch-sensitive display 110, 114, can be used, for instance, to relocate an icon or window to a desired location within a display, to reorder a stack on a display, or to span both displays (such that the selected window simultaneously occupies a portion of each display). A flick, when received by a touch-sensitive display 110, 114 or a gesture capture region 120, 124, can be used to relocate a window from a first display to a second display or to span both displays (such that the selected window simultaneously occupies a portion of each display). Unlike the drag gesture, however, the flick gesture is generally not used to move the displayed image to a specific user-selected location, but rather to a default location that is not configurable by the user.
The pinch gesture, when received by a touch-sensitive display 110, 114 or a gesture capture region 120, 124, can be used to maximize or otherwise increase the displayed area or size of a window (typically when received entirely by a common display), to switch the windows displayed at the top of the stack on each display to the top of the stack of the other display (typically when received by different displays or screens), or to display an application manager (a "pop-up window" that displays the windows in the stack). The spread gesture, when received by a touch-sensitive display 110, 114 or a gesture capture region 120, 124, can be used to minimize or otherwise decrease the displayed area or size of a window, to switch the windows displayed at the top of the stack on each display to the top of the stack of the other display (typically when received by different displays or screens), or to display an application manager (typically when received by an off-screen gesture capture region on the same or a different screen).
The combined gestures of Fig. 4G, when received by a common display capture region in a common display or screen 104, 108, can be used to hold a first window stack location in a first stack constant for a display receiving the gesture, while reordering a second window stack location in a second window stack to include a window in the display receiving the gesture. The combined gestures of Fig. 4H, when received by different display capture regions in a common display or screen 104, 108 or in different displays or screens, can be used to hold a first window stack location in a first window stack constant for a display receiving the tap portion of the gesture, while reordering a second window stack location in a second window stack to include a window in the display receiving the flick or drag gesture. Although specific gestures and gesture capture regions in the preceding examples have been associated with corresponding sets of functional results, it is to be appreciated that these associations can be redefined in any manner to produce differing associations between gestures and/or gesture capture regions and/or functional results.
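The gesture-to-function associations described in the last few paragraphs can be modeled as a dispatch table keyed by the gesture type and the region in which it was sensed. Only a few of the described combinations are filled in below; the entry names are illustrative assumptions, and, as the text notes, the table is meant to be freely redefinable.

```python
DISPATCH = {
    ("tap", "display"): "select_icon",
    ("drag", "display"): "relocate_window",
    ("flick", "display"): "move_window_to_default_location",
    ("flick", "gesture_capture"): "move_window_to_other_display",
    ("pinch", "display"): "maximize_window",
    ("spread", "gesture_capture"): "show_application_manager",
}

def functional_result(gesture: str, region: str) -> str:
    """Look up the functional result for a sensed gesture, if any."""
    return DISPATCH.get((gesture, region), "ignored")
```

A fuller implementation would also key on the device state and context factors listed earlier (configuration, orientation, focused application).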
Firmware and software:
The memory 508 may store, and the processor 504 may execute, one or more software components. These components can include at least one operating system (OS) 516a and/or 516b, a framework 520, and/or one or more applications 564a and/or 564b from an application store 560. The processor 504 may receive inputs from drivers 512, as previously described in conjunction with Fig. 2. The OS 516 can be any software, consisting of programs and data, that manages computer hardware resources and provides common services for the execution of various applications 564. The OS 516 can be any operating system and, at least in some embodiments, one dedicated to mobile devices, including, but not limited to, Linux, ANDROID™, iPhone OS (IOS™), WINDOWS PHONE 7™, etc. The OS 516 is operable to provide functionality to the phone by executing one or more operations, as described herein.
Using 564 can be to carry out any high layer software of specific function for the user.Use 564 and can comprise program, E-Mail client application for example, web page browsing program, text application, game, media play procedure, office procedure group etc.Use 564 and can be kept in application memory 560, application memory 560 can represent to store uses 564 any storer or data-carrier store, and management software is associated.In case be performed, use 564 and may operate in the zones of different of storer 508.
The framework 520 can be any software or data that allows the multiple tasks running on the device to interact. In embodiments, at least portions of the framework 520 and the discrete components described hereinafter may be considered part of the OS 516 or of an application 564; however, these portions will be described as part of the framework 520, and those components are not so limited. The framework 520 can include, but is not limited to, a Multi-Display Management (MDM) module 524, a Surface Cache module 528, a Window Management module 532, an Input Management module 536, a Task Management module 540, a Display Controller, one or more frame buffers 548, a task stack 552, one or more window stacks 550 (which is a logical arrangement of windows and/or desktops in a display area), and/or an event buffer 556.
The MDM module 524 includes one or more modules operable to manage the display of applications or other data on the screens of the device. An embodiment of the MDM module 524 is described in conjunction with Fig. 5B. In embodiments, the MDM module 524 receives inputs from the OS 516, the drivers 512, and the applications 564. These inputs assist the MDM module 524 in determining how to configure and allocate the displays according to the applications' preferences and requirements, and the user's actions. Once a determination of the display configuration is made, the MDM module 524 can bind the applications 564 to the display configuration. The configuration may then be provided to one or more other components to generate the display.
The Surface Cache module 528 includes any memory or storage, and the software associated therewith, to store or cache one or more images from the display screens. Each display screen may have associated with it a series of active and non-active windows (or other display objects, such as a desktop display). The active window (or other display object) is the one currently being displayed. A non-active window (or other display object) was opened and/or displayed at some time but is now "behind" the active window (or other display object). To enhance the user experience, before a window (or other display object) is covered by another active window (or other display object), a "screen shot" of its last generated image can be stored. The Surface Cache module 528 may be operable to store the last active image of a window (or other display object) rather than the image currently being displayed. Thus, the Surface Cache module 528 stores the images of non-active windows (or other display objects) in a data store (not shown).
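The caching behavior described above can be sketched as a small map from window identifiers to last rendered images. This is a minimal illustrative sketch; the class and method names are hypothetical, as the patent does not specify an implementation.

```python
class SurfaceCache:
    """Minimal sketch of the Surface Cache module 528: keeps the last
    rendered image of a window once it becomes non-active (hypothetical API)."""

    def __init__(self):
        self._snapshots = {}  # window id -> last active image

    def on_deactivate(self, window_id, last_image):
        # Store a "screen shot" of the window before it is covered.
        self._snapshots[window_id] = last_image

    def on_activate(self, window_id):
        # The window renders live again; the stale snapshot is discarded.
        return self._snapshots.pop(window_id, None)

cache = SurfaceCache()
cache.on_deactivate(7, "bitmap-of-window-7")
print(cache.on_activate(7))  # the cached image, handed back exactly once
print(cache.on_activate(7))  # None: an active window has no snapshot
```

A real implementation would of course store bitmaps rather than strings; the point is only that the cache holds images for non-active windows and releases them on reactivation.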
In embodiments, the Window Management module 532 is operable to manage the windows (or other display objects) that are active or non-active on each of the screens. Based on information from the MDM module 524, the OS 516, or other components, the Window Management module 532 determines when a window (or other display object) is active or non-active. The Window Management module 532 may then put a non-visible window (or other display object) in a "non-active state" and, in conjunction with the Task Management module 540, suspend the application's operation. Further, the Window Management module 532 may assign a screen identifier to the window (or other display object), or manage one or more other items of data associated with the window (or other display object). The Window Management module 532 may also provide the stored information to the application 564, the Task Management module 540, or other components interacting with or associated with the window (or other display object).
The Input Management module 536 is operable to manage events that occur with the device. An event is any input into the window environment, for example, a user interface interaction with the user. The Input Management module 536 receives the events and logically stores the events in an event buffer 556. Events can include such user interface interactions as a "down event," which occurs when a screen 104, 108 receives a touch signal from a user, a "move event," which occurs when the screen 104, 108 determines that a user's finger is moving across the screen, an "up event," which occurs when the screen 104, 108 determines that the user has stopped touching the screen 104, 108, etc. These events can be received, stored, and forwarded to other modules by the Input Management module 536.
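As a rough illustration of the event flow just described (the class name and event record layout are assumptions, not the patent's), the down/move/up events could be queued and drained as follows:

```python
from collections import deque

# Hypothetical sketch of the Input Management module 536 and event buffer 556:
# each event records its type, the screen that produced it, and a position.
class InputManager:
    def __init__(self):
        self.event_buffer = deque()  # stands in for event buffer 556

    def receive(self, event_type, screen, position):
        # "down", "move", and "up" mirror the interactions described above.
        assert event_type in ("down", "move", "up")
        self.event_buffer.append({"type": event_type,
                                  "screen": screen,
                                  "position": position})

    def drain(self):
        # Forward stored events to other modules (e.g., the Gesture module).
        while self.event_buffer:
            yield self.event_buffer.popleft()

im = InputManager()
im.receive("down", 104, (10, 10))
im.receive("move", 104, (40, 10))
im.receive("up", 104, (80, 10))
print([e["type"] for e in im.drain()])  # ['down', 'move', 'up']
```

The FIFO buffer preserves the order in which the touch interactions occurred, which is what allows downstream consumers to reconstruct a gesture from the sequence.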
A task can be an application component that provides a screen with which the user can interact in order to do something, such as make a call, take a photo, send an email, or view a map. Each task may be given a window in which to draw a user interface. The window typically fills the display 110, 114, but may be smaller than the display 110, 114 and float on top of other windows. An application usually consists of multiple activities that are loosely bound to each other. Typically, one task in an application is specified as the "main" task, which is presented to the user when the application is launched for the first time. Each task can then start another task to perform different actions.
The Task Management module 540 is operable to manage the operation of one or more applications 564 that may be executed by the device. Thus, the Task Management module 540 can receive signals to execute an application stored in the application store 560. The Task Management module 540 may then instantiate one or more tasks or components of the application 564 to begin operation of the application 564. Further, the Task Management module 540 may suspend the application 564 based on user interface changes. Suspending the application 564 may maintain application data in memory but may limit or stop the application 564's access to processor cycles. Once the application becomes active again, the Task Management module 540 can again provide access to the processor.
The Display Controller 544 is operable to render and output the display(s) for the multi-screen device. In embodiments, the Display Controller 544 creates and/or manages one or more frame buffers 548. A frame buffer 548 can be a display output that drives a display from a portion of memory containing a complete frame of display data. One frame buffer may be a composite frame buffer that represents the entire display space of both screens. This composite frame buffer can appear as a single frame to the OS 516. The Display Controller 544 can sub-divide this composite frame buffer as required for use by each of the displays 110, 114. Thus, by using the Display Controller 544, the device 100 can have multiple screen displays without changing the underlying software of the OS 516.
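The sub-division of a composite frame buffer can be illustrated with a toy pixel array: the OS renders one logical frame covering both screens, and the controller carves out one slice per physical display. The dimensions and function names here are invented for illustration and bear no relation to the device's actual resolution.

```python
# Sketch of the Display Controller 544 splitting a composite frame buffer 548
# between two displays 110, 114 (toy dimensions, hypothetical names).
COMPOSITE_W, DISPLAY_W, H = 8, 4, 2

def render_composite():
    # One logical frame spanning the entire display space of both screens;
    # each "pixel" is just its x coordinate so the split is easy to see.
    return [[x for x in range(COMPOSITE_W)] for _ in range(H)]

def subdivide(frame):
    # Carve the composite frame into a per-display slice for each screen.
    left = [row[:DISPLAY_W] for row in frame]    # driven to display 110
    right = [row[DISPLAY_W:] for row in frame]   # driven to display 114
    return left, right

frame = render_composite()
primary, secondary = subdivide(frame)
print(primary[0])    # [0, 1, 2, 3]
print(secondary[0])  # [4, 5, 6, 7]
```

Because the OS only ever sees the single composite frame, the split into physical outputs stays entirely inside the controller, which is the point made above about leaving the OS 516 unchanged.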
The Application Manager 562 can be a service that provides the presentation layer for the window environment. Thus, the Application Manager 562 provides the graphical model for rendering by the Window Management module 532. Likewise, the Desktop 566 provides the presentation layer for the application store 560. Thus, the desktop provides a graphical model of a surface having selectable application icons for the applications 564 in the application store 560, which can be provided to the Window Management module 532 for rendering.
Fig. 5B shows an embodiment of the MDM module 524. The MDM module 524 is operable to determine the state of the environment for the device, including, but not limited to, the orientation of the device, which applications 564 are executing, how the applications 564 are to be displayed, what actions the user is conducting, the tasks being displayed, etc. To configure the display, the MDM module 524 interprets these environmental factors and determines a display configuration, as described in conjunction with Figs. 6A-6J. Then, the MDM module 524 can bind the applications 564 or other device components to the displays. The configuration may then be sent to the Display Controller 544 and/or the OS 516 to generate the display. The MDM module 524 can include one or more of, but is not limited to, a Display Configuration module 568, a Preferences module 572, a Device State module 574, a Gesture module 576, a Requirements module 580, an Event module 584, and/or a Binding module 588.
The Display Configuration module 568 determines the layout for the display. In embodiments, the Display Configuration module 568 can determine the environmental factors. The environmental factors may be received from one or more other modules of the MDM module 524 or from other sources. The Display Configuration module 568 can then determine from the list of factors the best configuration for the display. Some embodiments of the possible configurations and the factors associated therewith are described in conjunction with Figs. 6A-6F.
The Preferences module 572 is operable to determine display preferences for an application 564 or other component. For example, an application can have a preference for a single display or a dual display. The Preferences module 572 can determine or receive the application preferences and store them. As the configuration of the device changes, the preferences may be reviewed to determine whether a better display configuration can be achieved for the application 564.
The Device State module 574 is operable to determine or receive the state of the device. The state of the device can be as described in conjunction with Figs. 3A and 3B. The state of the device can be used by the Display Configuration module 568 to determine the configuration for the display. As such, the Device State module 574 may receive inputs and interpret the state of the device. The state information is then provided to the Display Configuration module 568.
The Gesture module 576 is operable to determine whether the user is conducting any actions on the user interface. Thus, the Gesture module 576 can receive task information from the task stack 552 or the Input Management module 536. These gestures may be as defined in conjunction with Figs. 4A through 4H. For example, moving a window causes the display to render a series of display frames that illustrate the window moving. The gestures associated with such user interface interactions can be received and interpreted by the Gesture module 576. The information about the user gestures is then sent to the Task Management module 540 to modify the display binding of the task.
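A gesture interpreter of this kind might, for example, reduce a buffered down/move/up sequence into a named gesture before notifying the Task Management module. This is a sketch under invented assumptions: the threshold value, gesture labels, and event record format are not specified anywhere in the patent.

```python
# Hypothetical sketch of the Gesture module 576: collapse a sequence of
# down/move/up events into a gesture label (the 20-pixel threshold is made up).
DRAG_THRESHOLD = 20  # pixels of travel before a touch counts as a drag

def interpret(events):
    down = next(e for e in events if e["type"] == "down")
    up = next(e for e in events if e["type"] == "up")
    dx = up["position"][0] - down["position"][0]
    dy = up["position"][1] - down["position"][1]
    travel = (dx * dx + dy * dy) ** 0.5
    return "drag" if travel >= DRAG_THRESHOLD else "tap"

tap = [{"type": "down", "position": (5, 5)}, {"type": "up", "position": (6, 5)}]
drag = [{"type": "down", "position": (5, 5)}, {"type": "up", "position": (90, 5)}]
print(interpret(tap))   # tap
print(interpret(drag))  # drag
```

A fuller interpreter would also consider timing (tap versus long press) and velocity (drag versus flick), which is how the gestures of Figs. 4A-4H are distinguished in the description above.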
Similar to the Preferences module 572, the Requirements module 580 is operable to determine display requirements for an application 564 or other component. An application can have a set display requirement that must be observed. Some applications require a particular display orientation; for example, the application "Angry Birds" can only be displayed in landscape orientation. This type of display requirement can be determined or received by the Requirements module 580. As the orientation of the device changes, the Requirements module 580 can reassert the display requirements for the application 564. The Display Configuration module 568 can generate a display configuration that is in accordance with the application display requirements, as provided by the Requirements module 580.
Similar to the Gesture module 576, the Event module 584 is operable to determine one or more events occurring with an application or other component that can affect the user interface. Thus, the Event module 584 can receive event information from the event buffer 556 or the Task Management module 540. These events can change how the tasks are bound to the displays. For example, an email application receiving a new email can cause the display to render the new message on a secondary screen. The events associated with such application execution can be received and interpreted by the Event module 584. The information about the events may then be sent to the Display Configuration module 568 to modify the configuration of the display.
The Binding module 588 is operable to bind the applications 564 or other components to the configuration determined by the Display Configuration module 568. A binding associates, in memory, the display configuration for each application with the display and mode of the application. Thus, the Binding module 588 can associate an application with a display configuration for the application (e.g., landscape, portrait, multi-screen, etc.). Then, the Binding module 588 may assign a display identifier to the display. The display identifier associates the application with a particular screen of the device. This binding is then stored and provided to the Display Controller 544, the OS 516, or other components to properly render the display. The binding is dynamic and can change or be updated based on configuration changes associated with events, gestures, state changes, application preferences or requirements, etc.
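A binding as described might be held as a small record associating an application with its configuration and display identifier. The function names and record shape below are hypothetical; they only illustrate the "associate, store, then update dynamically" behavior described above.

```python
# Sketch of the Binding module 588: a binding ties an application to a
# display configuration and a display identifier (hypothetical names).
bindings = {}

def bind(app, configuration, display_id):
    # Store the association so the Display Controller / OS can render it.
    bindings[app] = {"configuration": configuration, "display": display_id}

def rebind(app, **changes):
    # Bindings are dynamic: events, gestures, or state changes update them.
    bindings[app].update(changes)

bind("email", "portrait", 110)
rebind("email", configuration="landscape")  # e.g., after a device rotation
print(bindings["email"])  # {'configuration': 'landscape', 'display': 110}
```

Keeping the binding as a separate record, rather than state inside the application, is what lets the Display Controller re-render correctly after a rotation without the application itself being involved.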
User interface configurations:
With reference now to Figs. 6A-J, various types of output configurations that may be produced by the device 100 will be described hereinafter.
Figs. 6A and 6B depict two different output configurations of the device 100 in a first state. Specifically, Fig. 6A depicts the device 100 in a closed portrait state 304 where data is displayed on the primary screen 104. In this example, the device 100 displays data via the touch-sensitive display 110 in a first portrait configuration 604. As can be appreciated, the first portrait configuration 604 may display only a desktop or operating system home screen. Alternatively, one or more windows may be presented in a portrait orientation while the device 100 is displaying data in the first portrait configuration 604.
Fig. 6B depicts the device 100 still in the closed portrait state 304, but instead data is displayed on the secondary screen 108. In this example, the device 100 displays data via the touch-sensitive display 114 in a second portrait configuration 608.
It may be possible to display similar or different data in either the first or second portrait configuration 604, 608. It may also be possible to transition between the first portrait configuration 604 and the second portrait configuration 608 by providing the device 100 with a user gesture (e.g., a double-tap gesture), a menu selection, or other means. Other suitable gestures may also be employed to transition between configurations. Furthermore, it may also be possible to transition the device 100 from the first or second portrait configuration 604, 608 to any other configuration described herein, depending upon which state the device 100 is moved to.
An alternative output configuration may be accommodated by the device 100 in a second state. Specifically, Fig. 6C depicts a third portrait configuration where data is displayed simultaneously on both the primary screen 104 and the secondary screen 108. The third portrait configuration may be referred to as a Dual-Portrait (PD) output configuration. In the PD output configuration, the touch-sensitive display 110 of the primary screen 104 depicts data in the first portrait configuration 604 while the touch-sensitive display 114 of the secondary screen 108 simultaneously depicts data in the second portrait configuration 608. The simultaneous presentation of the first portrait configuration 604 and the second portrait configuration 608 may occur when the device 100 is in an open portrait state 320. In this configuration, the device 100 may display one application window in one display 110 or 114, two application windows (one in each of displays 110 and 114), one application window and one desktop, or one desktop. Other configurations may also be possible. It should be appreciated that it may also be possible to transition the device 100 from the simultaneous display of configurations 604, 608 to any other configuration described herein, depending upon which state the device 100 is moved to. Furthermore, while in this state, an application's display preference may place the device into a bilateral mode, in which both displays are active to display different windows of the same application. For example, a camera application may display a viewfinder and controls on one side, while the other side displays a mirrored preview that can be seen by the photo subjects. Games involving simultaneous play by two players may also take advantage of the bilateral mode.
Figs. 6D and 6E depict two further output configurations of the device 100 in a third state. Specifically, Fig. 6D depicts the device 100 in a closed landscape state 340 where data is displayed on the primary screen 104. In this example, the device 100 displays data via the touch-sensitive display 110 in a first landscape configuration 612. Much like the other configurations described herein, the first landscape configuration 612 may display a desktop, a home screen, one or more windows displaying application data, etc.
Fig. 6E depicts the device 100 still in the closed landscape state 340, but instead data is displayed on the secondary screen 108. In this example, the device 100 displays data via the touch-sensitive display 114 in a second landscape configuration 616. It may be possible to display similar or different data in either the first or second landscape configuration 612, 616. It may also be possible to transition between the first landscape configuration 612 and the second landscape configuration 616 by providing the device 100 with one or both of a twist and tap gesture or a flip and slide gesture. Other suitable gestures may also be employed to transition between configurations. Furthermore, it may also be possible to transition the device 100 from the first or second landscape configuration 612, 616 to any other configuration described herein, depending upon which state the device 100 is moved to.
Fig. 6F depicts a third landscape configuration where data is displayed simultaneously on both the primary screen 104 and the secondary screen 108. The third landscape configuration may be referred to as a Dual-Landscape (LD) output configuration. In the LD output configuration, the touch-sensitive display 110 of the primary screen 104 depicts data in the first landscape configuration 612 while the touch-sensitive display 114 of the secondary screen 108 simultaneously depicts data in the second landscape configuration 616. The simultaneous presentation of the first landscape configuration 612 and the second landscape configuration 616 may occur when the device 100 is in an open landscape state. It should be appreciated that it may also be possible to transition the device 100 from the simultaneous display of configurations 612, 616 to any other configuration described herein, depending upon which state the device 100 is moved to.
Figs. 6G and 6H depict two views of the device 100 in yet another state. Specifically, the device 100 is depicted as being in an easel state 312. Fig. 6G shows that a first easel output configuration 618 may be displayed on the touch-sensitive display 110. Fig. 6H shows that a second easel output configuration 620 may be displayed on the touch-sensitive display 114. The device 100 may be configured to depict either the first easel output configuration 618 or the second easel output configuration 620 individually. Alternatively, both easel output configurations 618, 620 may be presented simultaneously. In some embodiments, the easel output configurations 618, 620 may be similar or identical to the landscape output configurations 612, 616. The device 100 may also be configured to display one or both of the easel output configurations 618, 620 while in a modified easel state 316. It should be appreciated that the simultaneous utilization of the easel output configurations 618, 620 may facilitate two-person games (e.g., Battleship, chess, checkers, etc.), multi-user conferences (where two or more users share the same device 100), and other applications. As can be appreciated, it may also be possible to transition the device 100 from the display of one or both configurations 618, 620 to any other configuration described herein, depending upon which state the device 100 is moved to.
Fig. 6I depicts yet another output configuration that may be provided while the device 100 is in an open portrait state 320. Specifically, the device 100 may be configured to present a single continuous image across both touch-sensitive displays 110, 114 in a portrait configuration referred to herein as a Portrait-Max (PMax) configuration 624. In this configuration, data (e.g., a single image, application, window, icon, video, etc.) may be split and displayed partially on one of the touch-sensitive displays while the other portion of the data is displayed on the other touch-sensitive display. The PMax configuration 624 may facilitate a larger display and/or better resolution for displaying a particular image on the device 100. Similar to the other output configurations, it may be possible to transition the device 100 from the PMax configuration 624 to any other output configuration described herein, depending upon which state the device 100 is moved to.
Fig. 6J depicts yet another output configuration that may be provided while the device 100 is in an open landscape state 348. Specifically, the device 100 may be configured to present a single continuous image across both touch-sensitive displays 110, 114 in a landscape configuration referred to herein as a Landscape-Max (LMax) configuration 628. In this configuration, data (e.g., a single image, application, window, icon, video, etc.) may be split and displayed partially on one of the touch-sensitive displays while the other portion of the data is displayed on the other touch-sensitive display. The LMax configuration 628 may facilitate a larger display and/or better resolution for displaying a particular image on the device 100. Similar to the other output configurations, it may be possible to transition the device 100 from the LMax configuration 628 to any other output configuration described herein, depending upon which state the device 100 is moved to.
The device 100 manages desktops and/or windows with at least one window stack 1700, 1728, as shown in Figs. 10A and 10B. A window stack 1700, 1728 is a logical arrangement of active and/or inactive windows for a multi-screen device. For example, the window stack 1700, 1728 may be logically similar to a deck of cards, where one or more windows or desktops are arranged in order, as shown in Figs. 10A and 10B. An active window is a window that is currently being displayed on at least one of the touch-sensitive displays 110, 114. For example, windows 1704 and 1708 are active windows and are displayed on touch-sensitive displays 110 and 114. An inactive window is a window that was opened and displayed but is now "behind" an active window and not being displayed. In embodiments, an inactive window may be for an application that is suspended, and thus, the window is not displaying active content. For example, windows 1712, 1716, 1720, and 1724 are inactive windows.
A window stack 1700, 1728 may have various arrangements or organizational structures. In the embodiment shown in Fig. 10A, the device 100 includes a first stack 1760 associated with the first touch-sensitive display 110 and a second stack 1764 associated with the second touch-sensitive display 114. Thus, each touch-sensitive display 110, 114 can have an associated window stack 1760, 1764. These two window stacks 1760, 1764 may have different numbers of windows arranged in the respective stacks 1760, 1764. Further, the two window stacks 1760, 1764 can also be identified and managed separately. Thus, the first window stack 1760 can be arranged in order from a first window 1704 to a next window 1720 to a last window 1724 and finally to a desktop 1722, which, in embodiments, is at the "bottom" of the window stack 1760. In embodiments, the desktop 1722 is not always at the "bottom," as application windows can be arranged in the window stack below the desktop 1722, and the desktop 1722 can be brought to the "top" of the stack, over the other windows, when the desktop is displayed. Likewise, the second stack 1764 can be arranged from a first window 1708 to a next window 1712 to a last window 1716 and finally to a desktop 1718, which, in embodiments, is a single desktop area that sits, together with desktop 1722, under all the windows in both window stack 1760 and window stack 1764. A logical data structure for managing the two window stacks 1760, 1764 may be as described in conjunction with Fig. 11.
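The dual-stack arrangement of Fig. 10A can be modeled as two independent ordered lists, each topped by its active window and normally bottomed by a desktop. The list contents mirror the reference numerals of the figure; the `activate` helper and insert-at-top policy are illustrative assumptions.

```python
# Sketch of the dual window stacks of Fig. 10A: position 0 holds the active
# window, and the desktop normally sits at the "bottom" of each stack.
stack_1760 = ["window 1704", "window 1720", "window 1724", "desktop 1722"]
stack_1764 = ["window 1708", "window 1712", "window 1716", "desktop 1718"]

def activate(stack, window):
    # Bringing a window to the front moves it to the top of its own stack;
    # the two stacks are identified and managed separately.
    stack.remove(window)
    stack.insert(0, window)

activate(stack_1760, "window 1724")
print(stack_1760[0])   # window 1724 is now the active window
print(stack_1760[-1])  # the desktop remains at the bottom
```

Because each display owns its own list, reordering one stack (as in the gestures of Figs. 4G and 4H) leaves the other stack's positions untouched.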
Another arrangement for a window stack 1728 is shown in Fig. 10B. In this embodiment, there is a single window stack 1728 for both touch-sensitive displays 110, 114. Thus, the window stack 1728 is arranged from a desktop 1758 to a first window 1744 to a last window 1756. A window can be arranged in a position among all windows without an association to a specific touch-sensitive display 110, 114. In this embodiment, every window is in the order of windows, and at least one window is identified as being active. For example, a single window may be rendered in two portions 1732 and 1736 that are displayed on the first touch-sensitive screen 110 and the second touch-sensitive screen 114. This single window may occupy only a single position in the window stack 1728 although it is displayed on both displays 110, 114.
Yet another arrangement of a window stack 1760 is shown in Figs. 10C through 10E. The window stack 1760 is shown in three "elevation" views. In Fig. 10C, the top of the window stack 1760 is shown. Two sides of the window stack 1760 are shown in Figs. 10D and 10E. In this embodiment, the window stack 1760 resembles a stack of bricks, with the windows stacked on each other. Looking from the top of the window stack 1760 in Fig. 10C, only the top-most windows in the window stack 1760 are seen in different portions of the composite display 1764. The composite display 1764 represents a logical model for the entire display area of the device 100, which can include touch-sensitive display 110 and touch-sensitive display 114. A desktop 1786 or a window can occupy part or all of the composite display 1764.
In the embodiment shown, the desktop 1786 is the lowest display or "brick" in the window stack 1760. Thereupon, window 1 1782, window 2 1782, window 3 1768, and window 4 1770 are layered. Window 1 1782, window 3 1768, window 2 1782, and window 4 1770 occupy only a portion of the composite display 1764. Thus, another portion of the stack 1760 includes window 8 1774 and windows 5 through 7 shown in section 1790. Only the top window in any portion of the composite display 1764 is actually rendered and displayed. Thus, as shown in the top view of Fig. 10C, window 4 1770, window 8 1774, and window 3 1768 are displayed as being at the top of the display in different portions of the window stack 1760. A window can be dimensioned to occupy only a portion of the composite display 1760, thereby "revealing" windows lower in the window stack 1760. For example, window 3 1768 is lower in the stack than both window 4 1770 and window 8 1774 but is still displayed. A logical data structure for managing the window stack can be as described in conjunction with Fig. 11.
When a new window is opened, the newly activated window is generally positioned at the top of the stack. However, where and how the window is positioned within the stack can be a function of the orientation of the device 100, the context of what programs, functions, software, etc. are being executed on the device 100, how the stack is positioned when the new window is opened, etc. To insert the window in the stack, the position of the window in the stack is determined, and the touch-sensitive display 110, 114 with which the window is associated may also be determined. With this information, a logical data structure for the window can be created and stored. When user interface or other events or tasks change the arrangement of windows, the window stack(s) can be changed to reflect the change in arrangement. It should be noted that these same concepts described above can be used to manage the one or more desktops for the device 100.
Fig. 11 shows a logical data structure 1800 for managing the arrangement of windows or desktops. The logical data structure 1800 can be any data structure used to store data, whether an object, record, file, etc. The logical data structure 1800 can be stored in any type of database or data storage system, regardless of protocol or standard. In embodiments, the logical data structure 1800 includes one or more portions, fields, attributes, etc. that store data in a logical arrangement that allows easy storage and retrieval of the information. Hereinafter, these one or more portions, fields, attributes, etc. shall be described simply as fields. The fields can store data for a window identifier 1804, dimensions 1808, a stack position identifier 1812, a display identifier 1816, and/or an active indicator 1820. Each window in a window stack can have an associated logical data structure 1800. While only a single logical data structure 1800 is shown in Fig. 11, there may be more or fewer logical data structures 1800 used with a window stack (based on the number of windows or desktops in the stack), as represented by ellipses 1824. Further, there may be more or fewer fields than those shown in Fig. 11, as represented by ellipses 1828.
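The fields of the logical data structure 1800 map naturally onto a small record type. The field names below follow Fig. 11; the class itself, its types, and the sample values are illustrative assumptions only.

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class WindowRecord:
    """Sketch of logical data structure 1800 (one instance per window)."""
    window_id: str                         # window identifier 1804
    dimensions: Tuple[int, int, int, int]  # dimensions 1808 (x, y, w, h)
    stack_position: int                    # stack position identifier 1812
    display_id: Optional[int]              # display identifier 1816 (110, 114, ...)
    active: bool = False                   # active indicator 1820

# A window occupying only part of the composite display, as in Fig. 10C:
w4 = WindowRecord("1770", (0, 0, 400, 300), 1, 110, active=True)
print(w4.stack_position, w4.active)  # 1 True
```

Representing dimensions as one coordinate plus width and height is one of the two encodings the description allows; corner coordinates would serve equally well.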
A window identifier 1804 can include any identifier (ID) that uniquely identifies the associated window in relation to other windows in a window stack. The window identifier 1804 can be a globally unique identifier (GUID), a numeric ID, an alphanumeric ID, or another type of identifier. In embodiments, the window identifier 1804 can be one, two, or any number of digits based on the number of windows that can be opened. In alternative embodiments, the size of the window identifier 1804 may change based on the number of windows opened. While the window is open, the window identifier 1804 may be static and remain unchanged.
Dimensions 1808 can include dimensions for a window in the composite display 1760. For example, the dimensions 1808 can include coordinates for two or more corners of the window, or may include one coordinate and dimensions for the width and height of the window. These dimensions 1808 can delineate what portion of the composite display 1760 the window may occupy, which may be the entire composite display 1760 or only part of the composite display 1760. For example, window 4 1770 may have dimensions 1808 that indicate that the window 1770 will occupy only part of the display area of the composite display 1760, as shown in Figs. 10C through 10E. As windows are moved or inserted in the window stack, the dimensions 1808 may change.
A stack position identifier 1812 can be any identifier that can identify the position of the window in the stack, or may be inferred from the window's control record within a data structure, such as a list or a stack. The stack position identifier 1812 can be a GUID, a numeric ID, an alphanumeric ID, or another type of identifier. Each window or desktop can include a stack position identifier 1812. For example, as shown in Fig. 10A, window 1 1704 in stack 1 1760 can have a stack position identifier 1812 of 1, identifying window 1704 as the first window in the stack 1760 and the active window. Similarly, window 6 1724 can have a stack position identifier 1812 of 3, representing that window 1724 is the third window in the stack 1760. Window 2 1708 can also have a stack position identifier 1812 of 1, representing that window 1708 is the first window in the second stack 1764. As shown in Fig. 10B, window 1 1744 can have a stack position identifier 1812 of 1, window 3, rendered in portions 1732 and 1736, can have a stack position identifier 1812 of 3, and window 6 1756 can have a stack position identifier 1812 of 6. Thus, depending on the type of stack, the stack position identifier 1812 can represent a window's location in the stack.
Display identifier 1816 can identification window or desktop be associated with specific display, for example, the first display 110 or second display 114, the perhaps combining display 1760 that forms of these two displays.Although do not need this display identifier 1816 for many stack system, as shown in Figure 10 A, whether display identifier 1816 can indicate the window in the continuous storehouse of Figure 10 B to be presented on specific display.Therefore, in Figure 10 B, window 3 can have two parts 1732 and 1736.First 1732 can have the display identifier 1816 for the first display, and second portion 1736 can have the display identifier 1816 for second display 114 simultaneously.Yet in the embodiment of alternative, window can have two display identifiers 1816, and its expression can at display 110,114 both upper these windows that show, perhaps have a display identifier 1816 of identification combining display.In the embodiment of another alternative, window can have individual monitor identifier 1816, to be illustrated in display 110,114 both upper display windows.
Similar to the display identifier 1816, an active indicator 1820 may not be needed for the dual-stack system of Fig. 10A, because the window in stack position 1 is active and displayed. In the system of Fig. 10B, the active indicator 1820 can indicate which window or windows in the stack are being displayed. Thus, in Fig. 10B, window 3 can have two portions 1732 and 1736. The first portion 1732 can have an active indicator 1820, and the second portion 1736 can likewise have an active indicator 1820. However, in alternative embodiments, window 3 can have a single active indicator 1820. The active indicator 1820 can be a simple flag or bit indicating that the window is active or displayed.
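The three identifiers described above can be thought of as fields of one per-window record in the logical data structure. The following is a hypothetical sketch in Python, not the patent's actual implementation; all type and field names are invented for illustration, with the numeric labels from the figures noted in comments.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class WindowRecord:
    """Hypothetical sketch of one window's entry in the logical data
    structure (field labels follow Figs. 10A-10B)."""
    window_id: str                    # any unique ID: GUID, numeric, alphanumeric
    stack_position: int               # stack position identifier 1812 (1 = top)
    display_id: Optional[str] = None  # display identifier 1816: e.g. "display_110",
                                      # "display_114", or "composite_1760"
    active: bool = False              # active indicator 1820: window is displayed

# Example mirroring Fig. 10A: window 1 is first in stack 1 and active,
# window 6 is third in the same stack and not displayed.
window1 = WindowRecord("window_1_1704", stack_position=1,
                       display_id="display_110", active=True)
window6 = WindowRecord("window_6_1724", stack_position=3,
                       display_id="display_110", active=False)
```

Under this sketch, a serial stack (Fig. 10B) would simply hold one such record per stack position, while the dual-stack arrangement (Fig. 10A) would keep one list per display and could omit `display_id`.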
Fig. 12 shows an embodiment of a method 1900 for creating a window stack. Fig. 12 also shows a general order for the steps of method 1900. Generally, method 1900 starts at a start operation 1904 and ends at an end operation 1928. Method 1900 can include more or fewer steps, or the steps can be arranged in an order different from that shown in Fig. 12. Method 1900 can be executed as a set of computer-executable instructions encoded or stored on a computer-readable medium and executed by a computer system. Hereinafter, method 1900 shall be explained with reference to the systems, components, modules, software, data structures, user interfaces, etc. described in conjunction with Figs. 1-11.
A multi-screen device 100 can receive activation of a window, in step 1908. In embodiments, the multi-screen device 100 can receive activation of a window by receiving input from a touch-sensitive display 110 or 114, a configurable area 112 or 116, a gesture capture region 120 or 124, or some other hardware sensor operable to receive user interface input. The processor may execute the task management module 540, which can receive the input. The task management module 540 can interpret the input as a request to execute an application task that will open a window in the window stack.
In embodiments, the task management module 540 places the user interface interaction in the task stack 552 to be acted upon by the display configuration module 568 of the multi-display management module 524. Further, the task management module 540 waits for information from the multi-display management module 524 before sending an instruction to the window management module 532 to create the window in the window stack.
Upon receiving the instruction from the task management module 540, the multi-display management module 524 determines to which touch portion of the composite display 1760 the newly activated window should be associated, in step 1912. For example, window 4 1770 is associated with a portion of the composite display 1764. In embodiments, the device state module 574 of the multi-display management module 524 can determine how the device is oriented or in what state the device is, for example, open, closed, portrait, etc. Further, the preferences module 572 and/or requirements module 580 can determine how the window is to be displayed. The gesture module 576 can determine the user's intentions about how the window is to be opened, based on the type of gesture and the location where the gesture was made.
The display configuration module 568 can use the input from these modules and evaluate the current window stack 1760 to determine, based on a visibility algorithm, the best place and best dimensions at which to open the window. Thus, the display configuration module 568 determines the best place to put the window at the top of the window stack 1760, in step 1916. In embodiments, the visibility algorithm determines, for all portions of the composite display, which windows are at the top of the stack. For example, the visibility algorithm determines that window 3 1768, window 4 1770, and window 8 1774 are at the top of the stack 1760, as viewed in Figs. 10C-10E. Upon determining where to open the window, the display configuration module 568 can assign a display identifier 816 and possibly a dimension 808 to the window. The display identifier 816 and dimension 808 are then returned to the task management module 540. The task management module 540 can then assign the window a stack position identifier 812 indicating the window's position at the top of the window stack.
In embodiments, the task management module 540 sends the window stack information and an instruction to render the window to the window management module 532. The window management module 532 and the task management module 540 can create the logical data structure 800, in step 1924. Both the task management module 540 and the window management module 532 can create and manage copies of the window stack. These copies of the window stack can be synchronized, or kept similar, through communications between the window management module 532 and the task management module 540. Thus, based on the information determined by the multi-display management module 524, the window management module 532 and the task management module 540 can assign a dimension 808; a stack position identifier 812 (e.g., window 1 1782, window 4 1770, etc.); a display identifier 816 (e.g., touch-sensitive display 1 110, touch-sensitive display 2 114, a composite display identifier, etc.); and an active indicator 820, which is generally always set when the window is at the "top" of the stack. Both the window management module 532 and the task management module 540 can then store the logical data structure 800. Further, the window management module 532 and the task management module 540 can thereafter manage the window stack and the logical data structure 800.
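The core of steps 1916-1924, placing a new window at the top of the stack, demoting the existing entries, and keeping two module-local copies in step, can be sketched roughly as follows. This is an illustrative reconstruction under assumed names; the patent does not prescribe this code, and the field labels (808, 812, 816, 820) are carried over only as comments.

```python
def open_window(stack, window_id, display_id):
    """Hypothetical sketch: place a new window at the top of a window
    stack and return its logical record. `stack` is a list of dicts
    ordered top-to-bottom."""
    record = {
        "id": window_id,
        "dimensions": None,         # size 808, chosen by the visibility algorithm
        "stack_position": 1,        # stack position identifier 812: top of stack
        "display_id": display_id,   # display identifier 816
        "active": True,             # active indicator 820: set while on top
    }
    for entry in stack:             # demote every existing window one position
        entry["stack_position"] += 1
        entry["active"] = False
    stack.insert(0, record)
    return record

# Two modules (task management and window management) each keep a copy;
# synchronizing amounts to replaying the same operation on both copies.
task_copy, window_copy = [], []
for copy in (task_copy, window_copy):
    open_window(copy, "window_4_1770", "display_110")
```

Here the "visibility algorithm" is left abstract (`dimensions=None`); in the described system it would supply the size 808 before the record is stored.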
Fig. 13 depicts a further window stack configuration. A plurality of windows 1, 2, 3, 4, 5, 6, 7, and 8 is depicted, regardless of whether they belong to the same or different multi-screen or single-screen applications. Currently, touch-sensitive display 110 has window 4 in the active display position, while touch-sensitive display 114 currently has window 5 in the active display position. From top to bottom, the stack for touch-sensitive display 110 has window 4 in the active display position, with windows 3, 2, and 1 positioned behind it. From top to bottom, the stack for touch-sensitive display 114 has window 5 in the active display position, with windows 6, 7, and 8 positioned behind it.
Desktops D1, D2, D3, D4, D5, and D6 are positioned behind the window stacks. The desktops can be viewed as a desktop stack distinct from the window stacks. In this manner, touch-sensitive display 110 has a corresponding desktop stack comprising desktops D3, D2, and D1, where desktop D1 is in the bottom stack position 2300 and desktop D3 is in the top stack position such that it could be displayed with window 4 (depending on window 4's position and dimensions, whether maximized or minimized). Touch-sensitive display 114 has a corresponding desktop stack comprising desktops D4, D5, and D6, where desktop D6 is in the bottom stack position 2304 and desktop D4 is in the top stack position such that it could be displayed with window 5 (depending on window 5's position and dimensions, whether maximized or minimized). Conceptually, in this example, the desktops can be viewed as a canvas divided into six sections, of which two sections can be displayed on the touch-sensitive displays 110, 114 at any one time. When the device 100 is in the closed state, this conceptual model persists in one configuration: only one window and desktop stack (corresponding to the primary screen) can be seen, while the other window and desktop stack is virtual; that is, they are maintained in memory but cannot be seen, because the secondary screen is not enabled.
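The Fig. 13 arrangement, a window stack layered over a separate desktop stack for each display, with only the top of each stack rendered, might be modeled as below. The structure and names are assumptions for illustration, not the patent's implementation.

```python
# Hypothetical model of the Fig. 13 stacks, listed top to bottom.
display_110 = {
    "windows":  ["window_4", "window_3", "window_2", "window_1"],
    "desktops": ["D3", "D2", "D1"],   # D1 at the bottom stack position 2300
}
display_114 = {
    "windows":  ["window_5", "window_6", "window_7", "window_8"],
    "desktops": ["D4", "D5", "D6"],   # D6 at the bottom stack position 2304
}

def visible_window(display, device_open=True, primary=True):
    """Only the active display position (the stack top) is rendered.
    On a closed device, the secondary display's stacks are virtual:
    maintained in memory but not shown."""
    if not device_open and not primary:
        return None
    return display["windows"][0]
```

For example, `visible_window(display_110)` yields `window_4`, while on a closed device the secondary display's call yields nothing, matching the "virtual stack" behavior described above.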
The displayed image transition indicator will now be considered. It is used before a display image (e.g., a window or desktop) is displayed, when the display image is to be moved from an initial to a target touch-sensitive display 110, 114. In response to a user gesture, the displayed image transition indicator previews the user-commanded movement of the display image. For example, when a window-movement gesture is received from the user, the transition indicator unfurls or slides out from behind the window (which is to be moved) and moves along the projected path of travel of the window, or toward the target touch-sensitive display, to the final window destination. The portion of the target touch-sensitive display to be occupied by the window after the movement is covered by the transition indicator. Upon completion of the transition indicator's movement, or at some other point in the transition indicator's movement, the window is moved to occupy the portion of the target touch-sensitive display occupied by the transition indicator. In one configuration, the displayed image transition indicator moves at substantially the same rate as the user gesture causing the display object movement (or at a rate that is a linear or other function of that rate). In one configuration, the displayed image transition indicator is used for a multi-screen application that needs to be expanded onto two screens or touch-sensitive displays. In this configuration, the displayed image transition indicator is not used to move a display image associated with a single-screen application; in that event, the actual application output display image is moved, without a transition indicator.
Typically, the transition indicator is a display image of generally similar size and shape to the display image being moved, but it cannot receive or provide, respectively, dynamic user input or output. Typically, though not necessarily, it is a substantially monochromatic display image having an appearance different from the display image to be moved. The transition indicator can display a trademark or other trademarked image associated with a manufacturer, wholesaler, or retailer of the device 100. In other configurations, the transition indicator is user-configurable. The user can select a displayed color or color set, pattern, design, icon, photograph, or other image as the transition indicator. The user can thereby personalize the transition indicator to suit his or her preferences, whereby different users have different transition indicators on their respective devices 100. Additionally, the user can select one or more audible sounds to be played at one or more selected points of the transition indicator's progression toward the target touch-sensitive display. For example, a user-selected sound can be played to announce commencement of the transition indicator's movement, at an intermediate point along the transition indicator's path of travel, and/or when the transition indicator is at its destination in the target touch-sensitive display. The user can also elect to disable the transition indicator, to resize the transition indicator so that it is smaller or larger than the display image being relocated, and/or to re-time the transition indicator's movement so that it moves faster or slower than a default setting.
In one configuration, the transition indicator is available typically for multi-screen applications rather than single-screen applications. In one configuration, the transition indicator is invoked only when the gesture commanding movement of the display image is received by a gesture capture region 120, 124. In one configuration, the transition indicator is invoked only when the gesture commanding movement of the display image is received by a touch-sensitive display 110, 114. In other configurations, the transition indicator is associated only with certain display image movements or transitions.
Various examples will now be discussed with reference to Figs. 7-8.
In Fig. 7A, the touch-sensitive displays 110, 114 are in a portrait display orientation and display window 1 and the second desktop D2, respectively. A gesture 700 is received by a gesture capture region 120, 124. Alternatively, the gesture 700 is received by a touch-sensitive display 110, 114. The gesture can be any suitable gesture, including, without limitation, those discussed above. By the gesture 700, the user indicates his or her command to move window 1 from the (initial) touch-sensitive display 110 to the (target) touch-sensitive display 114.
Referring to Fig. 7B, in continued response to receipt of the gesture, the transition indicator 704 commences left-to-right or right-to-left movement (depending on the orientation of the device 100 and the touch-sensitive displays), typically from a point appearing to be behind window 1, and typically begins moving at substantially the same rate as the gesture. Typically, the transition indicator 704 is not in the previously displayed image stack associated with the (initial) touch-sensitive display 110. In other words, when the gesture is received, the transition indicator is not displayed in an active or inactive display position of touch-sensitive display 110 or 114. When and as the transition indicator 704 unfurls or moves, the seam 708 between the first and second touch-sensitive displays 110, 114 and their respective display images is substantially darkened to display the transition indicator background. In other words, the transition indicator 704 moves outwardly in one direction from the seam 708 and eventually covers, typically substantially completely covers, the touch-sensitive display 114 and the second desktop D2. As shown in Figs. 7B and 7C, the transition indicator begins to cover the second desktop D2.
In Fig. 7D, the transition indicator 704 has fully covered the second desktop D2 in touch-sensitive display 114, while window 1 remains in place in touch-sensitive display 110. In other words, window 1 is in the active display position of touch-sensitive display 110 while the transition indicator 704 is in the active display position of touch-sensitive display 114. The first and second desktops D1 and D2 are in inactive display positions of touch-sensitive displays 110 and 114, respectively. In other configurations, the transition indicator covers only a portion of the target touch-sensitive display (114) before the window movement is initiated or commences.
When the transition indicator 704 has moved to and occupies some or all of the target touch-sensitive display (in this example, touch-sensitive display 114), such that the prior display image (in this example, the second desktop D2) is partly or completely darkened or covered by the transition indicator 704, the first window 1 moves to occupy the target touch-sensitive display 114, as shown progressively in Figs. 7E and 7F. Until the transition indicator's arrival at its fixed placement triggers the move, the first window 1 continues to be the display image of the source touch-sensitive display (in this example, touch-sensitive display 110). In other words, window 1 remains in the active display position of touch-sensitive display 110 until the transition indicator completes its movement to the target touch-sensitive display 114. At that point, window 1 moves gradually to cover the transition indicator 704 and occupy the active display position of touch-sensitive display 114, exposing the first desktop D1. When window 1 has completed its movement to the target touch-sensitive display 114, the first desktop D1 is in the active display position of touch-sensitive display 110, and the second desktop D2 is in an inactive display position of touch-sensitive display 114.
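The two-phase choreography of Figs. 7B-7F, where the indicator moves first and fully covers the target and only then does the window move, can be sketched as a simple state sequence. This is an illustrative reconstruction; the patent describes the behavior, not this code, and all names are assumed.

```python
def move_window_with_indicator(source, target):
    """Hypothetical sketch of the Fig. 7 sequence. `source` and `target`
    are dicts holding each display's 'active' and 'inactive' display
    positions. Returns the (source_active, target_active) pair after
    each phase."""
    states = []
    window = source["active"]                 # e.g. "window_1"

    # Phase 1 (Figs. 7B-7D): the indicator slides from behind the window
    # and covers the target; the prior image (D2) becomes inactive.
    target["inactive"] = target["active"]
    target["active"] = "transition_indicator"
    states.append((source["active"], target["active"]))

    # Phase 2 (Figs. 7E-7F): only after the indicator fully occupies the
    # target does the window move, covering the indicator and exposing
    # the source's desktop.
    source["active"] = source["inactive"]
    source["inactive"] = None
    target["active"] = window
    states.append((source["active"], target["active"]))
    return states

src = {"active": "window_1", "inactive": "D1"}
tgt = {"active": "D2", "inactive": None}
states = move_window_with_indicator(src, tgt)
```

After phase 1 the pair is `("window_1", "transition_indicator")`; after phase 2 it is `("D1", "window_1")`, matching the final arrangement described above, with D2 left inactive on the target.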
Figs. 8A-E show the above steps for the device 100 in a landscape display orientation, in which the first window is being maximized to cover at least portions of both the first and second touch-sensitive displays 110 and 114. In Fig. 8A, the touch-sensitive displays 110, 114 display window 1 and the second desktop D2, respectively. A gesture 700 is received by a gesture capture region 120, 124. Alternatively, the gesture 700 is received by a touch-sensitive display 110, 114. The gesture can be any suitable gesture, including, without limitation, those discussed above. By the gesture 700, the user indicates his or her command to move window 1 from the (initial) touch-sensitive display 110 to the (target) touch-sensitive display 114.
Referring to Fig. 8B, in response to receipt of the gesture, the transition indicator 704 commences top-to-bottom or bottom-to-top movement from a point appearing to be behind window 1 (depending on the orientation of the device 100 and the touch-sensitive displays). When and as the transition indicator 704 unfurls or moves, the seam 708 between the first and second touch-sensitive displays 110, 114 and their respective display images is substantially darkened to display the transition indicator background. As shown in Fig. 8B, the transition indicator begins to cover the viewable area of the second desktop D2.
In Fig. 8C, the transition indicator 704 has partly or completely covered the second desktop D2 in touch-sensitive display 114, while window 1 remains in place in touch-sensitive display 110. In other words, window 1 is in the active display position of touch-sensitive display 110 while the transition indicator 704 is in the active display position of touch-sensitive display 114. The first and second desktops D1 and D2 are in inactive display positions of touch-sensitive displays 110 and 114, respectively.
When the transition indicator 704 has moved to and occupies some or all of the target touch-sensitive display (in this example, touch-sensitive display 114), such that the prior display image (in this example, the second desktop D2) is substantially darkened or covered by the transition indicator 704, the first window 1 moves to occupy both the target touch-sensitive display 114 and the initial touch-sensitive display 110, as shown progressively in Figs. 8D and 8E. Until then, the first window 1 continues to be the display image of the source touch-sensitive display (in this example, touch-sensitive display 110). In other words, window 1 remains in the active display position of touch-sensitive display 110 until the transition indicator completes its movement to the target touch-sensitive display 114. At that point, window 1 moves to cover the transition indicator 704, thereby occupying the active display positions of both the initial and target touch-sensitive displays 110 and 114.
In the various examples, the middleware 520, particularly one or more of the multi-display management (MDM) class 524, the surface cache class 528, the window management class 532, the activity management class 536, and the application management class 540, individually or collectively detects receipt of the user gesture (step 900 of Fig. 9) and determines that the received gesture commands movement of a display image, such as a window or desktop, to a target touch-sensitive display. In response, the middleware 520 causes movement of the transition indicator 704 from the initial touch-sensitive display to the target touch-sensitive display (step 1904). When the transition indicator 704 covers a selected extent of the target touch-sensitive display, the middleware 520 moves the display image to cover the target touch-sensitive display (and cover the transition indicator). The logic ends at step 1912.
The exemplary systems and methods of this disclosure have been described in relation to communication devices. However, to avoid unnecessarily obscuring the present disclosure, the preceding description omits a number of known structures and devices. This omission is not to be construed as a limitation of the scope of the claims. Specific details are set forth to provide an understanding of the present disclosure. It should, however, be appreciated that the present disclosure may be practiced in a variety of ways beyond the specific details set forth herein.
Furthermore, while the exemplary aspects, embodiments, and/or configurations illustrated herein show the various components of the system collocated, certain components of the system can be located remotely, at distant portions of a distributed network, such as a LAN and/or the Internet, or within a dedicated system. Thus, it should be appreciated that the components of the system can be combined into one or more devices, such as a communication device, or collocated on a particular node of a distributed network, such as an analog and/or digital telecommunications network, a packet-switched network, or a circuit-switched network. It will be appreciated from the preceding description, and for reasons of computational efficiency, that the components of the system can be arranged at any location within a distributed network of components without affecting the operation of the system. For example, the various components can be located in a switch such as a PBX and media server, in a gateway, in one or more communication devices, at one or more users' premises, or in some combination thereof. Similarly, one or more functional portions of the system could be distributed between a telecommunications device and an associated computing device.
Furthermore, it should be appreciated that the various links connecting the elements can be wired or wireless links, or any combination thereof, or any other known or later-developed element that is capable of supplying and/or communicating data to and from the connected elements. These wired or wireless links can also be secure links and may be capable of communicating encrypted information. Transmission media used as links, for example, can be any suitable carrier for electrical signals, including coaxial cables, copper wire and fiber optics, and may take the form of acoustic or light waves, such as those generated during radio-wave and infrared data communications.
Also, while the flowcharts have been discussed and illustrated in relation to a particular sequence of events, it should be appreciated that changes, additions, and omissions to this sequence can occur without materially affecting the operation of the disclosed embodiments, configurations, and aspects.
A number of variations and modifications of the disclosure can be used. It would be possible to provide for some features of the disclosure without providing others.
For example, in one alternative embodiment, the transition indicator previews the movement of display images other than windows and desktops.
In other embodiments, the transition indicator previews the maximization of a window from occupying only one touch-sensitive display to covering both touch-sensitive displays.
In another alternative embodiment, during a transition, the transition indicator covers an entire touch-sensitive display when the device 100 is closed and only the primary screen is active, and/or covers both touch-sensitive displays when the device 100 is open. The latter may occur, for example, when a window is maximized or opened to cover both touch-sensitive displays, or when a transition affects both the primary and secondary screens. When a window is maximized on the single touch-sensitive display of a closed device, or on both touch-sensitive displays of an open device, the transition indicator can be moved off the touch-sensitive display from an edge.
In other embodiments, the disclosure applies to display image transitions other than window movement. In such a transition, a touch-sensitive display at least partially changes its display image. The change of display image is indicated by covering at least a portion of the display with the transition indicator after removal of the prior display image and before presentation of the new display image.
In yet another embodiment, the systems and methods of this disclosure can be implemented in conjunction with a special purpose computer, a programmed microprocessor or microcontroller and peripheral integrated circuit elements, an ASIC or other integrated circuit, a digital signal processor, a hard-wired electronic or logic circuit such as a discrete element circuit, a programmable logic device or gate array such as a PLD, PLA, FPGA, or PAL, a special purpose computer, any comparable means, or the like. In general, any device or means capable of implementing the methodology illustrated herein can be used to implement the various aspects of this disclosure. Exemplary hardware that can be used for the disclosed embodiments, configurations, and aspects includes computers, handheld devices, telephones (e.g., cellular, Internet-enabled, digital, analog, hybrids, and others), and other hardware known in the art. Some of these devices include processors (e.g., a single or multiple microprocessors), memory, nonvolatile storage, input devices, and output devices. Furthermore, alternative software implementations including, but not limited to, distributed processing or component/object distributed processing, parallel processing, or virtual machine processing can also be constructed to implement the methods described herein.
In yet another embodiment, the disclosed methods may be readily implemented in conjunction with software using object or object-oriented software development environments that provide portable source code usable on a variety of computer or workstation platforms. Alternatively, the disclosed system may be implemented partially or fully in hardware using standard logic circuits or VLSI design. Whether software or hardware is used to implement the systems in accordance with this disclosure depends on the speed and/or efficiency requirements of the system, the particular function, and the particular software or hardware systems or microprocessor or microcomputer systems being utilized.
In yet another embodiment, the disclosed methods may be partially implemented in software that can be stored on a storage medium and executed on a programmed general-purpose computer with the cooperation of a controller and memory, a special purpose computer, a microprocessor, or the like. In these instances, the systems and methods of this disclosure can be implemented as a program embedded on a personal computer, such as an applet, a JAVA or CGI script, as a resource residing on a server or computer workstation, as a routine embedded in a dedicated measurement system, as a system component, or the like. The system can also be implemented by physically incorporating the system and/or method into a software and/or hardware system.
Although the present disclosure describes components and functions implemented in the aspects, embodiments, and/or configurations with reference to particular standards and protocols, the aspects, embodiments, and/or configurations are not limited to such standards and protocols. Other similar standards and protocols not mentioned herein exist and are considered to be included in the present disclosure. Moreover, the standards and protocols mentioned herein, and other similar standards and protocols not mentioned herein, are periodically superseded by faster or more effective equivalents having essentially the same functions. Such replacement standards and protocols having the same functions are considered equivalents included in the present disclosure.
The present disclosure, in various aspects, embodiments, and/or configurations, includes components, methods, processes, systems, and/or apparatus substantially as depicted and described herein, including various aspects, embodiments, configurations, combinations, and/or subsets thereof. Those of skill in the art will understand how to make and use the disclosed aspects, embodiments, and/or configurations after understanding the present disclosure. The present disclosure, in various aspects, embodiments, and/or configurations, includes providing devices and processes in the absence of items not depicted and/or described herein, including in the absence of such items as may have been used in previous devices or processes, for example, for improving performance, achieving ease of implementation, and/or reducing cost of implementation.
The foregoing discussion has been presented for purposes of illustration and description. The foregoing is not intended to limit the disclosure to the form or forms disclosed herein. In the foregoing description, for example, various features of the disclosure are grouped together in one or more aspects, embodiments, and/or configurations for the purpose of streamlining the disclosure. The features of the aspects, embodiments, and/or configurations of the disclosure may be combined in alternate aspects, embodiments, and/or configurations other than those discussed above. This method of disclosure is not to be interpreted as reflecting an intention that the claims require more features than are expressly recited in each claim. Rather, as the following claims reflect, inventive aspects lie in less than all features of a single foregoing disclosed aspect, embodiment, and/or configuration. Thus, the following claims are hereby incorporated into this description, with each claim standing on its own as a separate preferred embodiment of the disclosure.
Moreover, although this description has included one or more aspects, embodiments, and/or configurations and certain variations and modifications, other variations, combinations, and modifications are within the scope of the disclosure, for example, as may be within the skill and knowledge of those in the art after understanding the present disclosure. It is intended to obtain rights which include alternative aspects, embodiments, and/or configurations to the extent permitted, including alternate, interchangeable, and/or equivalent structures, functions, ranges, or steps to those claimed, whether or not such alternate, interchangeable, and/or equivalent structures, functions, ranges, or steps are disclosed herein, and without intending to publicly dedicate any patentable subject matter.

Claims (21)

1. A method, comprising:
receiving, by at least one of a gesture capture region and a touch-sensitive display, a gesture, the gesture indicating that a display image is to be moved from a first touch-sensitive display to a second touch-sensitive display;
in response to and prior to movement of the display image to the second touch-sensitive display, moving, by a microprocessor, a transition indicator from the first touch-sensitive display towards the second touch-sensitive display to a selected position to be occupied by the display image; and
thereafter moving, by the microprocessor, the display image from the first touch-sensitive display towards the second touch-sensitive display and to the selected position.
2. The method of claim 1, wherein the display image is a desktop, and wherein the gesture is received by the touch-sensitive display.
3. The method of claim 1, wherein the display image is a window, wherein the gesture is received by the gesture capture region, and wherein the gesture capture region is incapable of displaying any display image.
4. The method of claim 1, wherein the transition indicator moves to the selected position along a path of travel of the display image to preview the movement of the display image, and wherein a size and shape of the transition indicator are substantially the same as those of the display image.
5. The method of claim 1, wherein the transition indicator is incapable of receiving or providing, respectively, dynamic user input or output, wherein the transition indicator has an appearance different from the display image, and wherein, prior to the display image being moved to fully cover the second touch-sensitive display, the transition indicator covers only a portion of the second touch-sensitive display.
6. The method of claim 1, wherein, when the gesture is received, the transition indicator is not in a displayed-image stack associated with the first and second touch-sensitive displays, wherein, before the movement of the displayed image begins, the displayed image and the transition indicator are simultaneously in active display positions of the first touch-sensitive display and the second touch-sensitive display, respectively, and wherein the transition indicator comprises a user-configured color, pattern, design, and/or photograph.
7. The method of claim 1, wherein the transition indicator is a graphic, wherein the window is controlled by a multi-screen application, and wherein, during the movement from the first touch-sensitive display to the second touch-sensitive display, the transition indicator is substantially unresponsive to user commands or requests.
8. A non-transitory computer-readable medium comprising microprocessor-executable instructions operable to perform at least the following steps:
receiving, by at least one of a gesture capture region and a touch-sensitive display, a gesture indicating that a displayed image is to be moved from a first touch-sensitive display to a second touch-sensitive display;
in response to, and prior to, the movement of the displayed image to the second touch-sensitive display, moving a transition indicator from the first touch-sensitive display toward the second touch-sensitive display to a selected position to be occupied by the displayed image; and
thereafter moving the displayed image from the first touch-sensitive display to the second touch-sensitive display to the selected position.
9. The medium of claim 8, wherein the displayed image is a desktop, and wherein the gesture is received by the touch-sensitive display.
10. The medium of claim 8, wherein the displayed image is a window, wherein the gesture is received by the gesture capture region, and wherein the gesture capture region is incapable of displaying any displayed image.
11. The medium of claim 8, wherein the transition indicator moves to the selected position along the path of travel of the displayed image, thereby previewing the movement of the displayed image, and wherein the transition indicator is substantially identical in size and shape to the displayed image.
12. The medium of claim 8, wherein the transition indicator cannot receive or provide dynamic user input or output, respectively, wherein the transition indicator has an appearance different from that of the displayed image, and wherein the transition indicator covers only a portion of the second touch-sensitive display before the displayed image is moved to cover the second touch-sensitive display completely.
13. The medium of claim 8, wherein, when the gesture is received, the transition indicator is not in a displayed-image stack associated with the first and second touch-sensitive displays, wherein, before the movement of the displayed image begins, the displayed image and the transition indicator are simultaneously in active display positions of the first touch-sensitive display and the second touch-sensitive display, respectively, and wherein the transition indicator comprises a user-configured color, pattern, design, and/or photograph.
14. The medium of claim 8, wherein the transition indicator is a graphic, wherein the window is controlled by a multi-screen application, and wherein, during the movement from the first touch-sensitive display to the second touch-sensitive display, the transition indicator is substantially unresponsive to user commands or requests.
15. A dual-screen communication device, comprising:
a gesture capture region to receive a gesture;
a first touch-sensitive display to receive a gesture and display a displayed image, wherein the displayed image is at least one of a window of an application and a desktop;
a second touch-sensitive display to receive a gesture and display a displayed image; and
middleware operable to perform at least one of the following operations:
receiving a gesture indicating that a displayed image is to be expanded from the first touch-sensitive display to cover at least most of the second touch-sensitive display;
in response to, and prior to, the expansion of the displayed image to the second touch-sensitive display, expanding a transition indicator to cover at least most of the second touch-sensitive display; and
thereafter expanding the displayed image to the second touch-sensitive display.
16. The device of claim 15, wherein the displayed image is a desktop, and wherein the gesture is received by the touch-sensitive display.
17. The device of claim 15, wherein the displayed image is a minimized window, wherein the gesture is received by the gesture capture region, wherein the gesture capture region is incapable of displaying any displayed image, and wherein the window is maximized to cover at least most of the first and second touch-sensitive displays.
18. The device of claim 15, wherein the transition indicator moves to a selected position along the path of travel of the displayed image, thereby previewing the expansion of the displayed image, and wherein the transition indicator is substantially identical in size and shape to the displayed image.
19. The device of claim 15, wherein the transition indicator cannot receive or provide dynamic user input or output, respectively, and wherein the transition indicator has an appearance different from that of the displayed image.
20. The device of claim 15, wherein, when the gesture is received, the transition indicator is not in a displayed-image stack associated with the first and second touch-sensitive displays, wherein, before the expansion of the displayed image begins, the displayed image and the transition indicator are simultaneously in active display positions of the first touch-sensitive display and the second touch-sensitive display, respectively, and wherein the transition indicator comprises a user-configured color, pattern, design, and/or photograph.
21. The device of claim 15, wherein the transition indicator is a graphic, wherein the window is controlled by a multi-screen application, and wherein, during the expansion from the first touch-sensitive display to the second touch-sensitive display, the transition indicator is substantially unresponsive to user commands or requests.
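Read as an algorithm, claims 1 and 8 describe a two-phase handoff: in response to the gesture, the non-interactive transition indicator first travels to the selected position on the second display to preview the move, and only thereafter does the displayed image itself follow. A minimal Python sketch of that ordering (all names and structures here are hypothetical illustrations, not taken from the patent):

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class Display:
    """A touch-sensitive display and the displayed images it currently holds."""
    name: str
    contents: List[str] = field(default_factory=list)

@dataclass
class TransitionIndicator:
    """Placeholder graphic; it previews the move but accepts no user input."""
    position: Optional[str] = None  # name of the display it currently occupies

def move_window(window: str, first: Display, second: Display,
                indicator: TransitionIndicator, log: List[str]) -> None:
    """Two-phase move: indicator first, displayed image thereafter."""
    # Phase 1: prior to moving the displayed image, the transition
    # indicator moves to the selected position on the second display.
    indicator.position = second.name
    log.append(f"indicator -> {second.name}")
    # Phase 2: thereafter, the displayed image is moved to that position.
    first.contents.remove(window)
    second.contents.append(window)
    log.append(f"{window} -> {second.name}")

# Usage: a gesture requests moving "browser" from display1 to display2.
d1, d2 = Display("display1", ["browser"]), Display("display2")
indicator, events = TransitionIndicator(), []
move_window("browser", d1, d2, indicator, events)
```

After the call, `events` records the indicator arriving before the window does, which is the ordering the claims require.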
CN201210458810.2A 2011-09-01 2012-09-03 Method of moving windows between multi-screen devices and dual-screen communication device Active CN103116460B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810310376.0A CN108228035B (en) 2011-09-01 2012-09-03 Method for moving window between multi-screen devices and dual-display communication device

Applications Claiming Priority (5)

Application Number Priority Date Filing Date Title
US13/223,778 US20120081309A1 (en) 2010-10-01 2011-09-01 Displayed image transition indicator
US13/223,778 2011-09-01
US38911710A 2011-10-01 2011-10-01
US38908710A 2011-10-01 2011-10-01
US38900010A 2011-10-01 2011-10-01

Related Child Applications (1)

Application Number Title Priority Date Filing Date
CN201810310376.0A Division CN108228035B (en) 2011-09-01 2012-09-03 Method for moving window between multi-screen devices and dual-display communication device

Publications (2)

Publication Number Publication Date
CN103116460A true CN103116460A (en) 2013-05-22
CN103116460B CN103116460B (en) 2018-05-04

Family

ID=48428813

Family Applications (2)

Application Number Title Priority Date Filing Date
CN201210458810.2A Active CN103116460B (en) Method of moving windows between multi-screen devices and dual-screen communication device
CN201810310376.0A Active CN108228035B (en) 2011-09-01 2012-09-03 Method for moving window between multi-screen devices and dual-display communication device

Family Applications After (1)

Application Number Title Priority Date Filing Date
CN201810310376.0A Active CN108228035B (en) 2011-09-01 2012-09-03 Method for moving window between multi-screen devices and dual-display communication device

Country Status (1)

Country Link
CN (2) CN103116460B (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10664162B2 (en) 2013-11-18 2020-05-26 Red Hat, Inc. Multiple display management
CN114945899A (en) * 2020-01-10 2022-08-26 微软技术许可有限责任公司 Conditional window model for foldable computing devices

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109375890B (en) * 2018-09-17 2022-12-09 维沃移动通信有限公司 Screen display method and multi-screen electronic equipment
US11157047B2 (en) * 2018-11-15 2021-10-26 Dell Products, L.P. Multi-form factor information handling system (IHS) with touch continuity across displays
CN110618769B (en) * 2019-08-22 2021-11-19 华为技术有限公司 Application window processing method and device

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040066408A1 (en) * 2002-10-08 2004-04-08 Microsoft Corporation Intelligent windows bumping method and system
US20090278806A1 (en) * 2008-05-06 2009-11-12 Matias Gonzalo Duarte Extended touch-sensitive control area for electronic device
US20100085274A1 (en) * 2008-09-08 2010-04-08 Qualcomm Incorporated Multi-panel device with configurable interface
US7739604B1 (en) * 2002-09-25 2010-06-15 Apple Inc. Method and apparatus for managing windows
CN101827503A (en) * 2009-03-03 2010-09-08 Lg电子株式会社 Portable terminal

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5434965A (en) * 1992-12-23 1995-07-18 Taligent, Inc. Balloon help system
JP2011022842A (en) * 2009-07-16 2011-02-03 Sony Corp Display apparatus, display method, and program
CN101847075A (en) * 2010-01-08 2010-09-29 宏碁股份有限公司 Multi-screen electronic device and image display method thereof


Also Published As

Publication number Publication date
CN108228035A (en) 2018-06-29
CN108228035B (en) 2021-05-04
CN103116460B (en) 2018-05-04

Similar Documents

Publication Publication Date Title
CN103262010B (en) Appeared by the desktop utilizing gesture to move logic display storehouse
CN103076967B (en) Change bifocal method and double screen communication equipment for responding gesture
CN103270505B (en) Intelligent plate divides screen
CN102999309B (en) Multihead display controls
JP6073792B2 (en) Method and system for viewing stacked screen displays using gestures
JP2014508977A6 (en) Smart pad split screen
CN103116460A (en) Conversion indicator of display image
CN103282955B (en) Show desktop when the device is opened
JP6073793B2 (en) Desktop display simultaneously with device release

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant