CN103282955A - Displaying the desktop upon device open - Google Patents

Displaying the desktop upon device open

Info

Publication number
CN103282955A
CN103282955A (application CN2011800580177A)
Authority
CN
China
Prior art keywords
window
display
equipment
desktop
screen
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN2011800580177A
Other languages
Chinese (zh)
Other versions
CN103282955B (en)
Inventor
S. Sirpal
P. E. Reeves
A. De Paz
R. W. Schrock
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Flex Electronics Id Co ltd
Original Assignee
Flex Electronics Id Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Flex Electronics Id Co ltd
Priority claimed from PCT/US2011/053953 (external priority: WO2012044806A2)
Publication of CN103282955A
Application granted
Publication of CN103282955B
Legal status: Active
Anticipated expiration

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 1/00 Details not covered by groups G06F 3/00 - G06F 13/00 and G06F 21/00
    • G06F 1/16 Constructional details or arrangements
    • G06F 1/1613 Constructional details or arrangements for portable computers
    • G06F 1/1615 Constructional details or arrangements for portable computers with several enclosures having relative motions, each enclosure supporting at least one I/O or computing function
    • G06F 1/1616 Constructional details or arrangements for portable computers with several enclosures having relative motions, with folding flat displays, e.g. laptop computers or notebooks having a clamshell configuration, with body parts pivoting to an open position around an axis parallel to the plane they define in closed position
    • G06F 1/1633 Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F 1/1615 - G06F 1/1626
    • G06F 1/1637 Details related to the display arrangement, including those related to the mounting of the display in the housing
    • G06F 1/1641 Details related to the display arrangement, the display being formed by a plurality of foldable display components
    • G06F 1/1647 Details related to the display arrangement, including at least an additional display
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481 Interaction techniques based on GUI based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/0483 Interaction with page-structured environments, e.g. book metaphor
    • G06F 3/0487 Interaction techniques based on GUI using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 Interaction techniques based on GUI using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/04883 Interaction techniques based on GUI using a touch-screen or digitiser for inputting data by handwriting, e.g. gesture or text
    • G06F 3/14 Digital output to display device; Cooperation and interconnection of the display device with other functional units
    • G06F 3/1423 Digital output to display device controlling a plurality of local displays, e.g. CRT and flat panel display

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Computer Hardware Design (AREA)
  • Human Computer Interaction (AREA)
  • General Physics & Mathematics (AREA)
  • Mathematical Physics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

Systems and methods are provided for displaying a desktop on a multi-screen device in response to opening the device. The window stack can change based on the change in the orientation of the device. The system can receive an orientation change that transitions the device from a closed state to an open state. A previously created window in the stack can expand over the area of the two or more displays comprising the device when it is opened. A desktop can expand to fill the display area and be displayed on the second of the displays when the device is opened.

Description

Displaying the Desktop Upon Device Open
Cross-Reference to Related Applications
This application claims the benefit of and priority, under 35 U.S.C. § 119(e), to the following U.S. Provisional Applications: No. 61/389,000, filed October 1, 2010, entitled "Dual Screen Window System"; No. 61/389,117, filed October 1, 2010, entitled "Multi-Operating System Portable Connection Device"; No. 61/389,087, filed October 1, 2010, entitled "Tablet Computer User Interface"; No. 61/458,150, filed November 17, 2010, entitled "Dual Screen Email Client"; and No. XX/XXXXXX, filed September XX, 2011, entitled "Mobile Device". Each of the aforementioned documents is incorporated herein by reference in its entirety for all purposes.
Background
A substantial number of handheld computing devices, such as mobile phones, tablet computers, and e-readers, use a touch screen display not only to present information to the user but also to receive input from user interface commands. While touch screen displays can increase the configurability of a handheld device and provide a wide variety of user interface options, this flexibility typically comes at a cost. The dual use of a touch screen, to provide content and to receive user commands, while flexible for the user, can obscure the display and create visual clutter, leading to user frustration and lost productivity.
The small form factor of handheld computing devices requires a careful balance between the displayed graphics and the area available for receiving input. On the one hand, the small display constrains the display space, which can increase the difficulty of interpreting actions or results. On the other hand, a virtual keyboard or other user interface scheme is superimposed on, or positioned adjacent to, an executing application, requiring the application to be squeezed into an even smaller portion of the display.
This balancing act is particularly difficult for single-display touch screen devices. Single-display touch screen devices are hampered by their limited screen space. When a user enters information into the device through the single display, the ability to interpret information on the display can be severely impeded, particularly when complex interaction between the display and the interface is required.
Summary of the Invention
There is a need for a dual- or multi-display handheld computing device that provides enhanced functionality and/or versatility compared with existing single-display handheld computing devices. These and other needs are addressed by the various aspects, embodiments, and/or configurations of the present disclosure. Also, while the disclosure is presented in terms of exemplary embodiments, it should be appreciated that individual aspects of the disclosure can be separately claimed.
In one embodiment, a computer-readable medium is provided that includes instructions that cause a processor to control a window stack for a multi-screen device. The computer-executable instructions include instructions to receive an orientation change of the multi-screen device, wherein the orientation change transitions the multi-screen device from a closed state to an open state, wherein, in the closed state, a first display is active and, in the open state, both the first display and a second display are active; instructions to determine whether a desktop should be displayed on the second display; and instructions to, upon determining that the desktop should be displayed on the second display, display the desktop on the second display and display a first window on the first display.
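By way of illustration only, the following sketch (written in Java, with hypothetical names such as OpenTransitionHandler and onOrientationChange that are not part of this disclosure) shows one way the instructions described above could be organized; it is a simplified assumption, not the claimed implementation.

    // Hypothetical sketch: reacting to a closed-to-open transition on a two-display device.
    // Class, method, and content names are illustrative assumptions only.
    public class OpenTransitionHandler {
        enum DeviceState { CLOSED, OPEN }

        private DeviceState state = DeviceState.CLOSED;
        private String firstDisplayContent  = "window-1";  // first window shown while closed
        private String secondDisplayContent = "off";       // second display inactive while closed

        // Called when the position sensors report a change of physical state.
        public void onOrientationChange(DeviceState newState, boolean desktopGoesToSecondDisplay) {
            if (state == DeviceState.CLOSED && newState == DeviceState.OPEN) {
                // Both displays become active; decide whether the desktop fills the second one.
                if (desktopGoesToSecondDisplay) {
                    secondDisplayContent = "desktop";
                } else {
                    secondDisplayContent = firstDisplayContent; // e.g. expand the window instead
                }
            }
            state = newState;
        }

        public static void main(String[] args) {
            OpenTransitionHandler h = new OpenTransitionHandler();
            h.onOrientationChange(DeviceState.OPEN, true);
            System.out.println(h.firstDisplayContent + " | " + h.secondDisplayContent);
            // prints: window-1 | desktop
        }
    }
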
In another embodiment, a device includes at least two displays, a memory, and a processor in communication with the memory and with each of the at least two displays. The device is operable to provide a composite display, wherein, in a closed state, the composite display comprises a single touch-sensitive display; to receive an orientation change of the multi-screen device, wherein the orientation change transitions the multi-screen device from the closed state to an open state, wherein, in the open state, the composite display comprises a first portion associated with a first touch-sensitive display and a second portion associated with a second touch-sensitive display; to expand a desktop to cover the composite display; to determine that a first window is to be displayed in the first portion of the composite display and that the desktop is to be displayed in the second portion of the composite display; and to display the desktop on the second touch-sensitive display and the first window on the first touch-sensitive display.
In yet another embodiment, a method of presenting a display on a multi-screen device includes: providing a composite display that spans at least part of a first touch-sensitive display while the multi-screen device is in a closed state; displaying a first window at the top of a window stack while the device is in the closed state; receiving an orientation change of the multi-screen device, wherein the orientation change is a transition from the closed state to an open state; changing the composite display to span at least part of the first touch-sensitive display and a second touch-sensitive display of the multi-screen device, wherein a first portion of the composite display is associated with the first touch-sensitive display and a second portion of the composite display is associated with the second touch-sensitive display; determining that a desktop is associated with the composite display; modifying the desktop to expand over the composite display; determining that the first window is at the top of the window stack for the first portion of the composite display; determining that the desktop is at the top of the window stack for the second portion of the composite display; displaying the first window on the first touch-sensitive display; and displaying the desktop on the second touch-sensitive display.
The present disclosure can provide a number of advantages depending on the particular aspect, embodiment, and/or configuration. The window stack arrangement offers the advantage of maintaining a logical arrangement of windows that the user can easily understand. The "deck" arrangement allows the user to navigate quickly between active and inactive windows across the two displays. These and other advantages will be apparent from the disclosure.
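The "deck" arrangement can be pictured as an ordered stack whose entries record which display portion they occupy. The following sketch is an illustrative assumption of such a structure (the entry fields and display numbering are invented for the example, not taken from the disclosure):

    // Illustrative window-stack ("deck") sketch; names and structure are assumptions.
    import java.util.ArrayDeque;
    import java.util.Deque;

    public class WindowStackDemo {
        record Entry(String name, int displayId, boolean active) {}

        public static void main(String[] args) {
            Deque<Entry> stack = new ArrayDeque<>();
            stack.push(new Entry("desktop",  2, false)); // bottom of the stack
            stack.push(new Entry("window-2", 1, false));
            stack.push(new Entry("window-1", 1, true));  // top: the active window

            // Opening the device reveals the desktop on display 2 while window-1 stays on display 1.
            for (Entry e : stack) {
                System.out.println(e.name() + " -> display " + e.displayId()
                        + (e.active() ? " (active)" : ""));
            }
        }
    }
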
The phrases "at least one", "one or more", and "and/or" are open-ended expressions that are both conjunctive and disjunctive in operation. For example, each of the expressions "at least one of A, B and C", "at least one of A, B, or C", "one or more of A, B, and C", "one or more of A, B, or C", and "A, B, and/or C" means A alone, B alone, C alone, A and B together, A and C together, B and C together, or A, B and C together.
The term "a" or "an" entity refers to one or more of that entity. As such, the terms "a" (or "an"), "one or more", and "at least one" can be used interchangeably herein. It is also to be noted that the terms "comprising", "including", and "having" can be used interchangeably.
The term "automatic" and variations thereof, as used herein, refer to any process or operation done without material human input when the process or operation is performed. However, a process or operation can be automatic even though its performance uses material or immaterial human input, if the input is received before performance of the process or operation. Human input is deemed to be material if such input influences how the process or operation will be performed. Human input that consents to the performance of the process or operation is not deemed "material".
The term "computer-readable medium" as used herein refers to any tangible storage and/or transmission medium that participates in providing instructions to a processor for execution. Such a medium may take many forms, including but not limited to non-volatile media, volatile media, and transmission media. Non-volatile media include, for example, NVRAM or magnetic or optical disks. Volatile media include dynamic memory, such as main memory. Common forms of computer-readable media include, for example, a floppy disk, a flexible disk, a hard disk, magnetic tape or any other magnetic medium, a magneto-optical medium, a CD-ROM, any other optical medium, punch cards, paper tape, any other physical medium with patterns of holes, RAM, PROM, EPROM, FLASH-EPROM, a solid-state medium like a memory card, any other memory chip or cartridge, a carrier wave as described hereinafter, or any other medium from which a computer can read. A digital file attachment to an e-mail or other self-contained information archive or set of archives is considered a distribution medium equivalent to a tangible storage medium. When the computer-readable medium is configured as a database, it is to be understood that the database may be any type of database, such as relational, hierarchical, object-oriented, and/or the like. Accordingly, the disclosure is considered to include a tangible storage medium or distribution medium and prior-art-recognized equivalents and successor media, in which the software implementations of the present disclosure are stored.
The term "desktop" refers to a metaphor used to portray systems. A desktop is generally considered a "surface" that typically includes pictures, called icons, widgets, folders, etc. that can activate and show applications, windows, cabinets, files, folders, documents, and other graphical items. The icons are generally selectable to initiate a task through user interface interaction, allowing a user to execute applications or conduct other operations.
The terms "screen", "touch screen", or "touchscreen" refer to a physical structure that includes one or more hardware components that provide the device with the ability to render a user interface and/or receive user input. A screen can encompass any combination of a gesture capture region, a touch-sensitive display, and/or a configurable area. The device can have one or more physical screens embedded in the hardware. However, a screen may also include an external peripheral device that may be attached to and detached from the device. In embodiments, multiple external devices may be attached to the device. Thus, in embodiments, the screen enables the user to interact with the device by touching areas on the screen and provides information to the user through a display. The touch screen may sense user contact in a number of different ways, such as by a change in an electrical parameter (e.g., resistance or capacitance), acoustic wave variations, infrared radiation proximity detection, light variation detection, and the like. In a resistive touch screen, for example, normally separated conductive and resistive metallic layers in the screen pass an electrical current. When a user touches the screen, the two layers make contact at the touched location, whereby a change in the electrical field is noted and the coordinates of the touched location are calculated. In a capacitive touch screen, a capacitive layer stores electrical charge, which is discharged to the user upon contact with the touch screen, causing a decrease in the charge of the capacitive layer. The decrease is measured and the coordinates of the contacted location are determined. In a surface acoustic wave touch screen, an acoustic wave is transmitted through the screen and is disturbed by the user contact. A receiving transducer detects the user contact instance and determines the coordinates of the contacted location.
The term "display" refers to a portion of one or more screens used to display the output of a computer to a user. A display may be a single-screen display or a multi-screen display, referred to as a composite display. A composite display can encompass the touch-sensitive displays of one or more screens. A single physical screen can include multiple displays that are managed as separate logical displays. Thus, different content can be displayed on the separate displays although they are part of the same physical screen.
The term "displayed image" refers to an image produced on the display. A typical displayed image is a window or a desktop. The displayed image may occupy all or a portion of the display.
The term "display orientation" refers to the way in which a rectangular display is oriented for viewing by a user. The two most common display orientations are portrait and landscape. In landscape mode, the display is oriented such that the width of the display is greater than the height of the display (such as a 4:3 ratio, which is 4 units wide and 3 units tall, or a 16:9 ratio, which is 16 units wide and 9 units tall). Stated differently, in landscape mode the longer dimension of the display is oriented substantially horizontally while the shorter dimension of the display is oriented substantially vertically. In portrait mode, by contrast, the display is oriented such that the width of the display is less than the height of the display. Stated differently, in portrait mode the shorter dimension of the display is oriented substantially horizontally while the longer dimension of the display is oriented substantially vertically.
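As a simple illustration of the portrait/landscape distinction defined above, the following sketch classifies an orientation from pixel dimensions; only the width-versus-height rule comes from the definition, and the class name is an assumption:

    // Sketch: classifying a display orientation from its pixel dimensions.
    public class OrientationDemo {
        enum Orientation { PORTRAIT, LANDSCAPE }

        static Orientation classify(int widthPx, int heightPx) {
            // Landscape: width greater than height (e.g. 4:3 or 16:9); portrait otherwise.
            return widthPx > heightPx ? Orientation.LANDSCAPE : Orientation.PORTRAIT;
        }

        public static void main(String[] args) {
            System.out.println(classify(1280, 720));  // LANDSCAPE (16:9)
            System.out.println(classify(600, 800));   // PORTRAIT
        }
    }
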
The term "composite display" refers to a logical structure that defines a display that can encompass one or more screens. A multi-screen display can be associated with a composite display that encompasses all the screens. The composite display can have different display characteristics based on the various orientations of the device.
The term "gesture" refers to a user action that expresses an intended idea, action, meaning, result, and/or outcome. The user action can include manipulating the device (e.g., opening or closing the device, changing the device orientation, moving a trackball or wheel, etc.), movement of a body part in relation to the device, movement of an implement or tool in relation to the device, audio input, etc. A gesture may be made on the device (such as on the screen) or with the device to interact with the device.
The term "module" as used herein refers to any known or later-developed hardware, software, firmware, artificial intelligence, fuzzy logic, or combination of hardware and software that is capable of performing the functionality associated with that element.
The term "gesture capture" refers to a sensing or other detection of an instance and/or type of user gesture. The gesture capture can occur in one or more areas of the screen; a gesture region can be on the display, where it may be referred to as a touch-sensitive display, or off the display, where it may be referred to as a gesture capture region.
A "multi-screen application" refers to an application that is capable of multiple modes. The multi-screen application modes can include, but are not limited to, a single-screen mode (where the application is displayed on a single screen) or a composite display mode (where the application is displayed on two or more screens). A multi-screen application can have different layouts optimized for each mode. Thus, the multi-screen application can have different layouts for a single screen or for a composite display that can encompass two or more screens. The different layouts may have different screen/display dimensions and/or configurations on which the user interfaces of the multi-screen application can be rendered. The different layouts allow the application to optimize its user interface for the type of display, e.g., a single screen or multiple screens. In single-screen mode, the multi-screen application may present one window pane of information. In composite display mode, the multi-screen application may present multiple window panes of information, or may provide a larger and richer presentation because more space is available for the display contents. The multi-screen application may be designed to adapt dynamically to changes in the device and the mode depending on which display (single or composite) the system assigns to the multi-screen application. In alternative embodiments, the user can use a gesture to request that the application transition to a different mode, and, if a display is available for the requested mode, the device can allow the application to move to that display and transition modes.
A "single-screen application" refers to an application that is capable of a single-screen mode. Thus, the single-screen application can produce only one window and cannot operate in different modes or different display dimensions. A single-screen application is not capable of the several modes discussed for the multi-screen application.
Term " window " typically refers to rectangle, i.e. the image of the demonstration on display at least part of wherein comprises or the content different with the remainder of screen be provided.This window may hide desktop.
Term " is determined ", " calculating (calculate) " and " calculating (compute) " and variant thereof, and is as used herein, can exchange use, and comprise method, process, mathematical operation or the technology of any kind.
Should be understood that, according to 35USC, the 112nd part, the 6th section term used herein " device " should give its most wide in range possible explanation.Therefore, should comprise all structures described in this paper, material or action in conjunction with term " device " claim, and all equivalents.In addition, its structure, material or action and equivalent thereof should comprise the counterpart that all are described in content of the present invention, description of drawings, embodiment, summary and claims.
More than be simplification summary of the present disclosure, so that the understanding of some aspect of the present disclosure to be provided.This summary is neither widely, neither the disclosure and the exhaustive overview of various aspects, embodiment and/or configuration.Its purpose is both uncertain key of the present disclosure or important element, does not also describe the scope of the present disclosure, but presents selected concept of the present disclosure with the form of simplifying, and introduces in greater detail as given below.As being understood that, in the time of alone or in combination, other side of the present disclosure, embodiment and/or configuration may utilize one or more features that set forth above or that be discussed in more detail below.
Brief Description of the Drawings
Fig. 1A includes a first view of an embodiment of a multi-screen user device;
Fig. 1B includes a second view of an embodiment of a multi-screen user device;
Fig. 1C includes a third view of an embodiment of a multi-screen user device;
Fig. 1D includes a fourth view of an embodiment of a multi-screen user device;
Fig. 1E includes a fifth view of an embodiment of a multi-screen user device;
Fig. 1F includes a sixth view of an embodiment of a multi-screen user device;
Fig. 1G includes a seventh view of an embodiment of a multi-screen user device;
Fig. 1H includes an eighth view of an embodiment of a multi-screen user device;
Fig. 1I includes a ninth view of an embodiment of a multi-screen user device;
Fig. 1J includes a tenth view of an embodiment of a multi-screen user device;
Fig. 2 is a block diagram of an embodiment of the hardware of the device;
Fig. 3A is a block diagram of an embodiment of a state model for the device based on the device's orientation and/or configuration;
Fig. 3B is a table of an embodiment of a state model for the device based on the device's orientation and/or configuration;
Fig. 4A is a first representation of an embodiment of a user gesture received at a device;
Fig. 4B is a second representation of an embodiment of a user gesture received at a device;
Fig. 4C is a third representation of an embodiment of a user gesture received at a device;
Fig. 4D is a fourth representation of an embodiment of a user gesture received at a device;
Fig. 4E is a fifth representation of an embodiment of a user gesture received at a device;
Fig. 4F is a sixth representation of an embodiment of a user gesture received at a device;
Fig. 4G is a seventh representation of an embodiment of a user gesture received at a device;
Fig. 4H is an eighth representation of an embodiment of a user gesture received at a device;
Fig. 5A is a block diagram of an embodiment of the device software and/or firmware;
Fig. 5B is a second block diagram of an embodiment of the device software and/or firmware;
Fig. 6A is a first representation of an embodiment of a device configuration generated in response to the device state;
Fig. 6B is a second representation of an embodiment of a device configuration generated in response to the device state;
Fig. 6C is a third representation of an embodiment of a device configuration generated in response to the device state;
Fig. 6D is a fourth representation of an embodiment of a device configuration generated in response to the device state;
Fig. 6E is a fifth representation of an embodiment of a device configuration generated in response to the device state;
Fig. 6F is a sixth representation of an embodiment of a device configuration generated in response to the device state;
Fig. 6G is a seventh representation of an embodiment of a device configuration generated in response to the device state;
Fig. 6H is an eighth representation of an embodiment of a device configuration generated in response to the device state;
Fig. 6I is a ninth representation of an embodiment of a device configuration generated in response to the device state;
Fig. 6J is a tenth representation of an embodiment of a device configuration generated in response to the device state;
Fig. 7A is a representation of an embodiment of a logical window stack;
Fig. 7B is another representation of an embodiment of a logical window stack;
Fig. 7C is another representation of an embodiment of a logical window stack;
Fig. 7D is another representation of an embodiment of a logical window stack;
Fig. 7E is another representation of an embodiment of a logical window stack;
Fig. 7F is another representation of an embodiment of a logical window stack;
Fig. 7G is another representation of an embodiment of a logical window stack;
Fig. 7H is another representation of an embodiment of a logical window stack;
Fig. 7I is another representation of an embodiment of a logical window stack;
Fig. 7J is another representation of an embodiment of a logical window stack;
Fig. 8 is a block diagram of an embodiment of a logical data structure for a window stack;
Fig. 9 is a flow chart of an embodiment of a method for changing the display based on the window stack;
In the appended figures, similar components and/or features may have the same reference label. Further, various components of the same type may be distinguished by a letter following the reference label that distinguishes among the similar components. If only the first reference label is used in the specification, the description applies to any one of the similar components having the same first reference label, irrespective of the second reference label.
Detailed Description
Presented herein are embodiments of a device. The device can be a communications device, such as a cellular telephone, or another smart device. The device can include two screens that are oriented to provide several unique display configurations. Further, the device can receive user input in unique ways. The overall design and functionality of the device provide an enhanced user experience, making the device more useful and more efficient.
Mechanical Features:
Figs. 1A-1J illustrate a device 100 in accordance with embodiments of the present disclosure. As described in greater detail below, the device 100 can be positioned in a number of different ways, each of which provides different functionality to a user. The device 100 is a multi-screen device that includes a primary screen 104 and a secondary screen 108, both of which are touch sensitive. In embodiments, the entire front surface of screens 104 and 108 may be touch sensitive and capable of receiving input from a user touching the front surface of screens 104 and 108. The primary screen 104 includes a touch-sensitive display 110, which, in addition to being touch sensitive, also displays information to the user. The secondary screen 108 includes a touch-sensitive display 114, which also displays information to the user. In other embodiments, screens 104 and 108 may include more than one display area.
The primary screen 104 also includes a configurable area 112 that has been configured for specific inputs when the user touches portions of the configurable area 112. The secondary screen 108 also includes a configurable area 116 that has been configured for specific inputs. Areas 112a and 116a have been configured to receive a "back" input indicating that the user would like to view information previously displayed. Areas 112b and 116b have been configured to receive a "menu" input indicating that the user would like to view options from a menu. Areas 112c and 116c have been configured to receive a "home" input indicating that the user would like to view information associated with a "home" view. In other embodiments, areas 112a-c and 116a-c may be configured, in addition to the configurations described above, for other types of specific inputs, including controlling features of device 100; some non-limiting examples include adjusting overall system power, adjusting the volume, adjusting the brightness, adjusting the vibration, selecting displayed items (on either screen 104 or 108), operating a camera, operating a microphone, and initiating/terminating telephone calls. Also, in some embodiments, areas 112a-c and 116a-c may be configured for specific inputs depending upon the application running on device 100 and/or the information displayed on touch-sensitive displays 110 and/or 114.
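By way of illustration, a minimal sketch of the default mapping of the configurable areas 112a-c and 116a-c to inputs follows; the mapping mirrors the description above, while the code itself (class and key names) is an assumption made for the example:

    // Sketch: mapping the configurable areas 112a-c / 116a-c to device commands.
    import java.util.Map;

    public class ConfigurableAreaDemo {
        static final Map<String, String> DEFAULT_BINDINGS = Map.of(
                "112a", "BACK",  "116a", "BACK",   // return to previously displayed information
                "112b", "MENU",  "116b", "MENU",   // show menu options
                "112c", "HOME",  "116c", "HOME");  // show the "home" view

        public static void main(String[] args) {
            System.out.println("Touch on 116b -> " + DEFAULT_BINDINGS.get("116b")); // MENU
        }
    }
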
In addition to touch sensing, primary screen 104 and secondary screen 108 may also include areas that receive input from a user without requiring the user to touch the display area of the screen. For example, primary screen 104 includes gesture capture region 120, and secondary screen 108 includes gesture capture region 124. These areas are able to receive input by recognizing gestures made by the user without the need for the user to actually touch the surface of the display area. In comparison with touch-sensitive displays 110 and 114, the gesture capture regions 120 and 124 are commonly not capable of rendering a displayed image.
The two screens 104 and 108 are connected together by a hinge 128, shown clearly in Fig. 1C (illustrating a back view of device 100). Hinge 128, in the embodiment shown in Figs. 1A-1J, is a center hinge that connects screens 104 and 108 so that when the hinge is closed, screens 104 and 108 are juxtaposed (i.e., side by side), as shown in Fig. 1B (illustrating a front view of device 100). Hinge 128 can be opened to position the two screens 104 and 108 in different relative positions to each other. As described in greater detail below, the device 100 may have different functionalities depending on the relative positions of screens 104 and 108.
Fig. 1D illustrates the right side of device 100. As shown in Fig. 1D, secondary screen 108 also includes a card slot 132 and a port 136 on its side. Card slot 132, in embodiments, accommodates different types of cards, including a subscriber identity module (SIM). Port 136, in embodiments, is an input/output port (I/O port) that allows device 100 to be connected to other peripheral devices, such as a display, keyboard, or printing device. As can be appreciated, these are merely some examples, and in other embodiments device 100 may include other slots and ports, such as slots and ports for accommodating additional memory devices and/or for connecting other peripheral devices. Also shown in Fig. 1D is an audio jack 140 that accommodates, for example, a tip, ring, sleeve (TRS) connector, allowing the user to utilize headphones or a headset.
Device 100 also includes a number of buttons 158. For example, Fig. 1E illustrates the left side of device 100. As shown in Fig. 1E, the side of primary screen 104 includes three buttons 144, 148, and 152, which can be configured for specific inputs. For example, buttons 144, 148, and 152 may be configured, in combination or alone, to control a number of aspects of device 100. Some non-limiting examples include overall system power, volume, brightness, vibration, selection of displayed items (on either screen 104 or 108), a camera, a microphone, and initiation/termination of telephone calls. In some embodiments, instead of separate buttons, two buttons may be combined into a rocker button. This arrangement is useful in situations where the buttons are configured to control a feature such as volume or brightness. In addition to buttons 144, 148, and 152, device 100 also includes a button 156, shown in Fig. 1F, which illustrates the top of device 100. In one embodiment, button 156 is configured as an on/off button used to control the overall system power of device 100. In other embodiments, button 156 is configured, in addition to or in lieu of controlling system power, to control other aspects of device 100. In some embodiments, one or more of the buttons 144, 148, 152, and 156 are capable of supporting different user commands. By way of example, a normal press has a duration commonly of less than about 1 second and resembles a quick tap. A medium press has a duration commonly of 1 second or more but less than about 12 seconds. A long press has a duration commonly of about 12 seconds or more. The function of the buttons is normally specific to the application that is currently in focus on the respective display 110 or 114. In a telephone application, for instance, and depending on the particular button, a normal, medium, or long press can mean end call, increase call volume, decrease call volume, or toggle microphone mute. In a camera or video application, for instance, and depending on the particular button, a normal, medium, or long press can mean increase zoom, decrease zoom, or take a photograph or record video.
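A minimal sketch of classifying a button press by its duration, using the approximate thresholds mentioned above (about 1 second and about 12 seconds), is shown below; the class and enum names are illustrative assumptions:

    // Sketch: classifying a button press by duration (thresholds approximate, per the text).
    public class ButtonPressDemo {
        enum Press { NORMAL, MEDIUM, LONG }

        static Press classify(long durationMillis) {
            if (durationMillis < 1_000) return Press.NORMAL;   // quick tap
            if (durationMillis < 12_000) return Press.MEDIUM;
            return Press.LONG;
        }

        public static void main(String[] args) {
            System.out.println(classify(300));     // NORMAL
            System.out.println(classify(2_500));   // MEDIUM
            System.out.println(classify(15_000));  // LONG
        }
    }
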
There are also a number of hardware components within device 100. As illustrated in Fig. 1C, device 100 includes a speaker 160 and a microphone 164. Device 100 also includes a camera 168 (Fig. 1B). Additionally, device 100 includes two position sensors 172A and 172B, which are used to determine the relative positions of screens 104 and 108. In one embodiment, position sensors 172A and 172B are Hall effect sensors. However, in other embodiments other sensors can be used in addition to or in lieu of the Hall effect sensors. An accelerometer 176 may also be included as part of device 100 to determine the orientation of device 100 and/or the orientation of screens 104 and 108. Additional internal hardware components that may be included in device 100 are described below with respect to Fig. 2.
The overall design of device 100 allows it to provide additional functionality not available in other communication devices. Some of the functionality is based on the various positions and orientations device 100 can have. As shown in Figs. 1B-1G, device 100 can be operated in an "open" position where screens 104 and 108 are side by side. This position allows a large display area for displaying information to the user. When position sensors 172A and 172B determine that device 100 is in the open position, they can generate a signal that can be used to trigger different events, such as displaying information on both screens 104 and 108. Additional events may be triggered if accelerometer 176 determines that device 100 is in a portrait position (Fig. 1B) as opposed to a landscape position (not shown).
In addition to the open position, device 100 may also have a "closed" position, illustrated in Fig. 1H. Again, position sensors 172A and 172B can generate a signal indicating that device 100 is in the "closed" position. This can trigger an event that results in a change of the information displayed on screen 104 and/or 108. For example, device 100 may be programmed to stop displaying information on one of the screens, e.g., screen 108, since the user can only view one screen at a time when device 100 is in the "closed" position. In other embodiments, the signal generated by position sensors 172A and 172B, indicating that device 100 is in the "closed" position, can trigger device 100 to answer an incoming telephone call. The "closed" position can also be a preferred position for using device 100 as a mobile phone.
As illustrated in Fig. 1I, device 100 can also be used in an "easel" position. In the "easel" position, screens 104 and 108 are angled with respect to each other and face outward, with the edges of screens 104 and 108 substantially horizontal. In this position, device 100 can be configured to display information on both screens 104 and 108 to allow two users to simultaneously interact with device 100. When device 100 is in the "easel" position, sensors 172A and 172B generate a signal indicating that screens 104 and 108 are positioned at an angle to each other, and accelerometer 176 can generate a signal indicating that device 100 has been placed so that the edges of screens 104 and 108 are substantially horizontal. The signals can then be used in combination to generate events that trigger changes in the display of information on screens 104 and 108.
Fig. 1J illustrates device 100 in a "modified easel" position. In the "modified easel" position, one of screens 104 or 108 is used as a base and faces down on the surface of an object such as a table. This position provides a convenient way to display information to a user in landscape orientation. Similar to the easel position, when device 100 is in the "modified easel" position, position sensors 172A and 172B generate a signal indicating that screens 104 and 108 are positioned at an angle to each other. Accelerometer 176 would generate a signal indicating that device 100 has been positioned so that one of screens 104 and 108 is facing downward and is substantially horizontal. The signals can then be used to generate events that trigger changes in the display of information on screens 104 and 108. For example, information may not be displayed on the screen that faces downward, since the user cannot see that screen.
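The following sketch illustrates how signals such as those from position sensors 172A and 172B and accelerometer 176 described above might be combined to infer a physical position; the boolean inputs and classification rules are simplifying assumptions, not the disclosed detection logic:

    // Sketch: inferring a physical position from simplified sensor observations.
    public class PhysicalStateDemo {
        enum PhysicalState { CLOSED, OPEN, EASEL, MODIFIED_EASEL }

        static PhysicalState classify(boolean screensAtAngle, boolean screensCoplanar,
                                      boolean edgesHorizontal, boolean oneScreenFaceDown) {
            if (screensCoplanar) return PhysicalState.OPEN;
            if (screensAtAngle && oneScreenFaceDown) return PhysicalState.MODIFIED_EASEL;
            if (screensAtAngle && edgesHorizontal)   return PhysicalState.EASEL;
            return PhysicalState.CLOSED;
        }

        public static void main(String[] args) {
            System.out.println(classify(true, false, true, false));  // EASEL
            System.out.println(classify(false, true, false, false)); // OPEN
        }
    }
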
Transitional states are also possible. When the position sensors 172A and B and/or the accelerometer indicate that the screens are being closed or folded (from the open position), a closing transitional state is recognized. Conversely, when the position sensors 172A and B indicate that the screens are being opened or unfolded (from the closed position), an opening transitional state is recognized. The closing and opening transitional states are typically time-based, or have a maximum duration from a sensed starting point. Normally, no user input is possible while one of the closing or opening states is in effect. In this manner, incidental user contact with a screen during the closing or opening function is not misinterpreted as user input. In embodiments, another transitional state is possible when device 100 is closed. This additional transitional state allows the display to switch from one screen 104 to the second screen 108 while device 100 is closed, based on some user input, e.g., a double tap on the screen 110, 114.
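A minimal sketch of the time-based suppression of user input during an opening or closing transition described above follows; the maximum duration and the class and method names are assumptions for illustration:

    // Sketch: suppressing accidental touches during a timed opening/closing transition.
    public class TransitionGuardDemo {
        private long transitionStartMillis = -1;
        private static final long MAX_TRANSITION_MILLIS = 1_000; // assumed maximum duration

        void onTransitionStart(long nowMillis) { transitionStartMillis = nowMillis; }

        boolean acceptUserInput(long nowMillis) {
            // Ignore input while a transition is believed to be in progress.
            return transitionStartMillis < 0
                    || nowMillis - transitionStartMillis > MAX_TRANSITION_MILLIS;
        }

        public static void main(String[] args) {
            TransitionGuardDemo guard = new TransitionGuardDemo();
            guard.onTransitionStart(0);
            System.out.println(guard.acceptUserInput(200));   // false: mid-transition touch ignored
            System.out.println(guard.acceptUserInput(1_500)); // true: transition has ended
        }
    }
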
As can be appreciated, the description of device 100 is made for illustrative purposes only, and the embodiments are not limited to the specific mechanical features shown in Figs. 1A-1J and described above. In other embodiments, device 100 may include additional features, including one or more additional buttons, slots, display areas, hinges, and/or locking mechanisms. Additionally, in embodiments, the features described above may be located in different parts of device 100 and still provide similar functionality. Therefore, Figs. 1A-1J and the description provided above are non-limiting.
Hardware Features:
Fig. 2 illustrates components of a device 100 in accordance with embodiments of the present disclosure. In general, the device 100 includes a primary screen 104 and a secondary screen 108. While the primary screen 104 and its components are normally enabled in both the opened and closed positions or states, the secondary screen 108 and its components are normally enabled in the opened state but disabled in the closed state. However, even when in the closed state, a user- or application-triggered interrupt (for example, in response to a phone application or camera application operation) can, by suitable command, flip the active screen, or disable the primary screen 104 and enable the secondary screen 108. Each screen 104, 108 can be touch sensitive and can include different operative areas. For example, a first operative area within each touch-sensitive screen 104 and 108 may comprise a touch-sensitive display 110, 114. In general, the touch-sensitive display 110, 114 may comprise a full-color touch-sensitive display. A second area within each touch-sensitive screen 104 and 108 may comprise a gesture capture region 120, 124. The gesture capture region 120, 124 may comprise an area or region that is outside of the touch-sensitive display 110, 114 area and that is capable of receiving input, for example in the form of gestures provided by a user. However, the gesture capture region 120, 124 does not include pixels that can perform a display function or capability.
A third region of the touch-sensitive screens 104 and 108 may comprise a configurable area 112, 116. The configurable area 112, 116 is capable of receiving input and has display or limited display capabilities. In embodiments, the configurable area 112, 116 may present different input options to the user. For example, the configurable area 112, 116 may display buttons or other relatable items. Moreover, the identity of displayed buttons, or whether any buttons are displayed at all within the configurable area 112, 116 of a touch-sensitive screen 104 or 108, may be determined from the context in which the device 100 is used and/or operated. In an exemplary embodiment, the touch-sensitive screens 104 and 108 comprise liquid crystal display devices extending across at least those regions of the touch-sensitive screens 104 and 108 that are capable of providing visual output to the user, and a capacitive input matrix over those regions of the touch-sensitive screens 104 and 108 that are capable of receiving input from the user.
One or more display controllers 216a, 216b may be provided for controlling the operation of the touch-sensitive screens 104 and 108, including input (touch sensing) and output (display) functions. In the exemplary embodiment illustrated in Fig. 2, a separate touch screen controller 216a or 216b is provided for each touch screen 104 and 108. In accordance with alternate embodiments, a common or shared touch screen controller 216 may be used to control each of the included touch-sensitive screens 104 and 108. In accordance with still other embodiments, the functions of a touch screen controller 216 may be incorporated into other components, such as a processor 204.
The processor 204 may comprise a general-purpose programmable processor or controller for executing application programming or instructions. In accordance with at least some embodiments, the processor 204 may include multiple processor cores and/or implement multiple virtual processors. In accordance with still other embodiments, the processor 204 may include multiple physical processors. As a particular example, the processor 204 may comprise a specially configured application-specific integrated circuit (ASIC) or other integrated circuit, a digital signal processor, a controller, a hardwired electronic or logic circuit, a programmable logic device or gate array, a special-purpose computer, or the like. The processor 204 generally functions to run programming code or instructions implementing various functions of the device 100.
The communication device 100 may also include memory 208 for use in connection with the execution of application programming or instructions by the processor 204, and for the temporary or long-term storage of program instructions and/or data. As examples, the memory 208 may comprise RAM, DRAM, SDRAM, or other solid-state memory. Alternatively or in addition, data storage 212 may be provided. Like the memory 208, the data storage 212 may comprise one or more solid-state memory devices. Alternatively or in addition, the data storage 212 may comprise a hard disk drive or other random-access memory.
In support of communications functions or capabilities, the device 100 can include a cellular telephony module 228. As examples, the cellular telephony module 228 can comprise a GSM, CDMA, FDMA, and/or analog cellular telephony transceiver capable of supporting voice, multimedia, and/or data transfers over a cellular network. Alternatively or in addition, the device 100 can include an additional or other wireless communications module 232. As examples, the other wireless communications module 232 can comprise a Wi-Fi, Bluetooth™, WiMax, infrared, or other wireless communications link. The cellular telephony module 228 and the other wireless communications module 232 can each be associated with a shared or a dedicated antenna 224.
A port interface 252 may be included. The port interface 252 may include proprietary or universal ports to support the interconnection of the device 100 to other devices or components, such as a dock, which may or may not include additional or different capabilities from those integral to the device 100. In addition to supporting an exchange of communication signals between the device 100 and another device or component, the docking port 136 and/or port interface 252 can support the supply of power to or from the device 100. The port interface 252 also comprises an intelligent element that includes a docking module for controlling communications or other interactions between the device 100 and a connected device or component.
An input/output module 248 and associated ports may be included to support communications over wired networks or links, for example with other communication devices, server devices, and/or peripheral devices. Examples of an input/output module 248 include an Ethernet port, a Universal Serial Bus (USB) port, an Institute of Electrical and Electronics Engineers (IEEE) 1394 port, or another interface.
An audio input/output interface/device(s) 244 can be included to provide analog audio to an interconnected speaker or other device, and to receive analog audio input from a connected microphone or other device. As an example, the audio input/output interface/device(s) 244 may comprise an associated amplifier and analog-to-digital converter. Alternatively or in addition, the device 100 can include an integrated audio input/output device 256 and/or an audio jack for interconnecting an external speaker or microphone. For example, an integrated speaker and an integrated microphone can be provided to support near-talk or speakerphone operations.
Hardware buttons 158 can be included, for example, for use in connection with certain control operations. Examples include a master power switch, volume control, etc., as described in conjunction with Figs. 1A through 1J. One or more image capture interfaces/devices 240, such as a camera, can be included for capturing still and/or video images. Alternatively or in addition, an image capture interface/device 240 can include a scanner or code reader. An image capture interface/device 240 can include or be associated with additional elements, such as a flash or other light source.
The device 100 can also include a global positioning system (GPS) receiver 236. In accordance with embodiments of the present invention, the GPS receiver 236 may further comprise a GPS module that is capable of providing absolute location information to other components of the device 100. An accelerometer 176 may also be included. For example, in connection with the display of information to a user and/or other functions, a signal from the accelerometer 176 can be used to determine an orientation and/or format in which to display that information to the user.
Embodiments of the present invention can also include one or more position sensors 172. The position sensors 172 can provide a signal indicating the position of the touch-sensitive screens 104 and 108 relative to one another. This information can be provided as an input, for example to a user-interface application, to determine the operating mode, characteristics of the touch-sensitive displays 110, 114, and/or other device 100 operations. As examples, a screen position sensor 172 can comprise a series of Hall effect sensors, a multiple-position switch, an optical switch, a Wheatstone bridge, a potentiometer, or another arrangement capable of providing a signal indicating the multiple relative positions in which the touch screens are placed.
Communications between the various components of the device 100 can be carried by one or more buses 222. In addition, power can be supplied to the components of the device 100 from a power source and/or power control module 260. The power control module 260 can, for example, include a battery, an AC-to-DC converter, power control logic, and/or ports for interconnecting the device 100 to an external source of power.
Device State:
Figs. 3A and 3B represent illustrative states of device 100. While a number of illustrative states are shown, along with transitions from a first state to a second state, it is to be appreciated that the illustrative state diagram may not encompass all possible states and/or all possible transitions from a first state to a second state. As illustrated in Fig. 3, the various arrows between the states (illustrated by the state represented within a circle) represent a physical change that occurs to device 100, that is detected by one or more of hardware and software, the detection triggering one or more hardware and/or software interrupts that are used to control and/or manage one or more functions of device 100.
As illustrated in Fig. 3A, there are twelve exemplary "physical" states: closed 304, transition 308 (or opening transitional state), easel 312, modified easel 316, open 320, inbound/outbound call or communication 324, image/video capture 328, transition 332 (or closing transitional state), landscape 340, docked 336, docked 344, and landscape 348. Next to each illustrative state is a depiction of the physical state of device 100, with the exception of states 324 and 328, where the state is generally symbolized by the international icon for a telephone and the icon for a camera, respectively.
In state 304, the device is in a closed state, with device 100 generally oriented in the portrait direction and the primary screen 104 and secondary screen 108 back-to-back in different planes (see Fig. 1H). From the closed state, device 100 can enter, for example, docked state 336, where device 100 is coupled with a docking station or docking cable, or is otherwise docked or associated with one or more other devices or peripherals, or the landscape state 340, where device 100 is generally oriented with the primary screen 104 facing the user and the primary screen 104 and secondary screen 108 back-to-back.
In the closed state, the device can also move to a transitional state in which the device remains closed but the display is moved from one screen 104 to another screen 108 based on a user input (for example, a double tap on the screen 110, 114). Another embodiment includes a bilateral state. In the bilateral state, the device remains closed, but a single application displays at least one window on both the first display 110 and the second display 114. The windows shown on the first and second displays 110, 114 may be the same or different based on the application and the state of that application. For example, while acquiring an image with a camera, the device may display the viewfinder on the first display 110 and display a preview of the photograph's subject (full screen and mirrored left-to-right) on the second display 114.
In state 308, the transitional state from the closed state 304 to the semi-open or easel state 312, the device 100 is shown being opened, with the primary screen 104 and the secondary screen 108 rotating about a point on an axis coincident with the hinge. Upon entering the easel state 312, the primary screen 104 and the secondary screen 108 are separated from one another such that, for example, the device 100 can sit on a surface in an easel-like structure.
In state 316, known as the modified easel position, the device 100 has the primary screen 104 and the secondary screen 108 in a relative relationship to one another similar to that of the easel state 312, with the difference being that one of the primary screen 104 or the secondary screen 108 is placed on a surface, as shown.
State 320 is the open state, in which the primary screen 104 and the secondary screen 108 are generally in the same plane. From the open state, the device 100 can transition to the docked state 344 or the open landscape state 348. In the open state 320, the primary screen 104 and the secondary screen 108 are generally in a similar portrait-like orientation, while in the landscape state 348 the primary screen 104 and the secondary screen 108 are generally in a similar landscape-like orientation.
State 324 is illustrative of a communication state, for example when the device 100 is receiving an incoming call or placing an outgoing call, respectively. While not shown for clarity, it should be appreciated that the device 100 can transition to the inbound/outbound call state 324 from any state illustrated in Fig. 3. Similarly, the image/video capture state 328 can be entered from any other state in Fig. 3; the image/video capture state 328 allows the device 100 to take one or more images via a camera and/or capture video with a video capture device 240.
Transition state 332 illustratively shows the primary screen 104 and the secondary screen 108 being closed upon one another to enter, for example, the closed state 304.
With reference to the key, Fig. 3 illustrates the inputs that are received in order to detect a transition from a first state to a second state. In Fig. 3B, various combinations of states are shown with, in general, portrait state 352 and landscape state 356 oriented along one axis of the chart and portrait state 360 and landscape state 364 along the other.
In Fig. 3B, the key indicates that "H" represents an input from one or more Hall effect sensors, "A" represents an input from one or more accelerometers, "T" represents an input from a timer, "P" represents a communications trigger input, and "I" represents an image and/or video capture request input. Thus, in the center portion 376 of the chart, the input or combination of inputs is shown by which the device 100 detects a transition from a first physical state to a second physical state.
As discussed, in the center portion of the chart 376, the inputs received enable detection of a transition from, for example, the portrait open state to the landscape easel state, shown in bold as "HAT." For this exemplary transition from the portrait open to the landscape easel state, a Hall effect sensor input ("H"), an accelerometer input ("A"), and a timer input ("T") may be needed. The timer input can be derived from, for example, a clock associated with the processor.
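Purely for illustration, the following Java sketch (with assumed names that do not appear in the patent) shows how such a combination of Hall effect, accelerometer, and timer inputs might be checked to detect the portrait-open to landscape-easel transition denoted "HAT" in Fig. 3B.

```java
// Hypothetical sketch only: combining sensor inputs to detect one state transition.
import java.util.EnumSet;
import java.util.Set;

enum Input { HALL_EFFECT, ACCELEROMETER, TIMER, COMM_TRIGGER, IMAGE_CAPTURE }

enum PhysicalState { PORTRAIT_OPEN, LANDSCAPE_EASEL /* further states omitted */ }

final class StateTransitionDetector {
    // Inputs required for this particular transition ("HAT" in the key of Fig. 3B).
    private static final Set<Input> PORTRAIT_OPEN_TO_LANDSCAPE_EASEL =
            EnumSet.of(Input.HALL_EFFECT, Input.ACCELEROMETER, Input.TIMER);

    /** Returns the new state if the received inputs match, otherwise the current state. */
    PhysicalState detect(PhysicalState current, Set<Input> received) {
        if (current == PhysicalState.PORTRAIT_OPEN
                && received.containsAll(PORTRAIT_OPEN_TO_LANDSCAPE_EASEL)) {
            return PhysicalState.LANDSCAPE_EASEL;  // an interrupt handler would act on this
        }
        return current;
    }
}
```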
In addition to the portrait and landscape states, a docked state 368 is also shown, triggered based on the receipt of a docking signal 372. As discussed above in relation to Fig. 3, the docking signal can be triggered by the association of the device 100 with one or more other devices 100, accessories, peripherals, smart docks, or the like.
User interactions:
Figs. 4A through 4H depict various graphical representations of gesture inputs that may be recognized by the screens 104, 108. The gestures may be performed not only by a user's body part, such as a digit, but also by other devices, such as a stylus, that may be sensed by the contact-sensing portion(s) of a screen 104, 108. In general, gestures are interpreted differently based on where the gestures are performed (either directly on the display 110, 114 or in the gesture capture region 120, 124). For example, gestures in the display 110, 114 may be directed to a desktop or application, and gestures in the gesture capture region 120, 124 may be interpreted as being intended for the system.
With reference to Figs. 4A-4H, a first type of gesture, a touch gesture 420, is substantially stationary on the screen 104, 108 for a selected length of time. A circle 428 represents a touch or other contact type received at a particular location of a contact-sensing portion of the screen. The circle 428 may include a border 432, the thickness of which indicates the length of time that the contact is held substantially stationary at the contact location. For instance, a tap 420 (or short press) has a thinner border 432a than the border 432b for a long press 424 (or normal press). The long press 424 may involve a contact that remains substantially stationary on the screen for a longer time than that of a tap 420. As will be appreciated, differently defined gestures may be registered depending upon the length of time that the touch remains stationary prior to contact cessation or movement on the screen.
With reference to Fig. 4C, a drag gesture 400 on the screen 104, 108 is an initial contact (represented by circle 428) with contact movement 436 in a selected direction. The initial contact 428 may remain stationary on the screen 104, 108 for a certain amount of time, represented by the border 432. The drag gesture typically requires the user to contact an icon, window, or other displayed image at a first location, followed by movement of the contact in a drag direction to a new, desired second location for the selected displayed image. The contact movement need not be in a straight line but may follow any path of movement, so long as the contact is substantially continuous from the first to the second location.
With reference to Fig. 4D, a flick gesture 404 on the screen 104, 108 is an initial contact (represented by circle 428) with truncated contact movement 436 (relative to a drag gesture) in a selected direction. In embodiments, a flick has a higher exit velocity for the last movement in the gesture compared to the drag gesture. The flick gesture can, for instance, be a finger snap following the initial contact. Compared to a drag gesture, a flick gesture generally does not require continual contact with the screen 104, 108 from the first location of the displayed image to a predetermined second location. The contacted displayed image is moved by the flick gesture in the direction of the flick to the predetermined second location. Although both gestures can commonly move a displayed image from a first location to a second location, the temporal duration and distance of travel of the contact on the screen are generally less for a flick than for a drag gesture.
With reference to Fig. 4E, a pinch gesture 408 on the screen 104, 108 is depicted. The pinch gesture 408 may be initiated by a first contact 428a to the screen 104, 108 (for example, by a first digit) and a second contact 428b to the screen 104, 108 (for example, by a second digit). The first and second contacts 428a, b may be detected by a common contact-sensing portion of a common screen 104, 108, by different contact-sensing portions of a common screen 104 or 108, or by different contact-sensing portions of different screens. The first contact 428a is held for a first amount of time, as represented by the border 432a, and the second contact 428b is held for a second amount of time, as represented by the border 432b. The first and second amounts of time are generally substantially the same, and the first and second contacts 428a, b generally occur substantially simultaneously. The first and second contacts 428a, b generally also include corresponding first and second contact movements 436a, b, respectively. The first and second contact movements 436a, b are generally in opposing directions. Stated another way, the first contact movement 436a is toward the second contact 436b, and the second contact movement 436b is toward the first contact 436a. More simply stated, the pinch gesture 408 may be accomplished by a user's digits touching the screen 104, 108 in a pinching motion.
With reference to Fig. 4F, a spread gesture 410 on the screen 104, 108 is depicted. The spread gesture 410 may be initiated by a first contact 428a to the screen 104, 108 (for example, by a first digit) and a second contact 428b to the screen 104, 108 (for example, by a second digit). The first and second contacts 428a, b may be detected by a common contact-sensing portion of a common screen 104, 108, by different contact-sensing portions of a common screen 104, 108, or by different contact-sensing portions of different screens. The first contact 428a is held for a first amount of time, as represented by the border 432a, and the second contact 428b is held for a second amount of time, as represented by the border 432b. The first and second amounts of time are generally substantially the same, and the first and second contacts 428a, b generally occur substantially simultaneously. The first and second contacts 428a, b generally also include corresponding first and second contact movements 436a, b, respectively. The first and second contact movements 436a, b are generally in a common direction. Stated another way, the first and second contact movements 436a, b move away from the first and second contacts 428a, b. More simply stated, the spread gesture 410 may be accomplished by a user's digits touching the screen 104, 108 in a spreading motion.
The above gestures may be combined in any manner, such as those shown in Figs. 4G and 4H, to produce a determined functional result. For example, in Fig. 4G a tap gesture 420 is combined with a drag or flick gesture 412 in a direction away from the tap gesture 420. In Fig. 4H, a tap gesture 420 is combined with a drag or flick gesture 412 in a direction toward the tap gesture 420.
The functional result of receiving a gesture can vary depending on a number of factors, including a state of the device 100, the display 110, 114, or the screen 104, 108, a context associated with the gesture, or the sensed location of the gesture. The state of the device commonly refers to one or more of the configuration of the device 100, the display orientation, and user and other inputs received by the device 100. Context commonly refers to one or more of the particular application(s) selected by the gesture and the portion(s) of the application currently executing, whether the application is a single- or multi-screen application, and whether the application is a multi-screen application displaying one or more windows in one or more stacks or on one or more screens. The sensed location of the gesture commonly refers to whether the sensed set(s) of gesture location coordinates are on a touch-sensitive display 110, 114 or a gesture capture region 120, 124, whether the sensed set(s) of gesture location coordinates are associated with a common or different display or screen 104, 108, and/or what portion of the gesture capture region contains the sensed set(s) of gesture location coordinates.
A tap, when received by a touch-sensitive display 110, 114, can be used, for instance, to select an icon to initiate or terminate execution of a corresponding application, to maximize or minimize a window, to reorder windows in a stack, and to provide user input such as by a keyboard display or other displayed image. A drag, when received by a touch-sensitive display 110, 114, can be used, for instance, to relocate an icon or window to a desired location within a display, to reorder a stack on a display, or to span both displays (such that the selected window occupies a portion of each display simultaneously). A flick, when received by a touch-sensitive display 110, 114 or a gesture capture region 120, 124, can be used to relocate a window from a first display to a second display or to span both displays (such that the selected window occupies a portion of each display simultaneously). Unlike the drag gesture, however, the flick gesture is generally not used to move the displayed image to a specific user-selected location but to a default location that is not configurable by the user.
The pinch gesture, when received by a touch-sensitive display 110, 114 or a gesture capture region 120, 124, can be used to minimize or increase the displayed area or size of a window (typically when received entirely by a common display), to switch the window displayed at the top of the stack on each display to the top of the stack of the other display (typically when received by different displays or screens), or to display an application manager (a "pop-up window" displaying the windows in the stack). The spread gesture, when received by a touch-sensitive display 110, 114 or a gesture capture region 120, 124, can be used to maximize or reduce the displayed area or size of a window, to switch the window displayed at the top of the stack on each display to the top of the stack of the other display (typically when received by different displays or screens), or to display an application manager (typically when received by an off-screen gesture capture region on the same or different screens).
The combined gestures of Fig. 4G, when received by a common display capture region in a common display or screen 104, 108, can be used to hold a first window stack location in a first stack constant for the display receiving the gesture, while reordering a second window stack location in a second window stack to include a window in the display receiving the gesture. The combined gestures of Fig. 4H, when received by different display capture regions in a common display or screen 104, 108 or in different displays or screens, can be used to hold a first window stack location in a first window stack constant for the display receiving the tap part of the gesture, while reordering a second window stack location in a second window stack to include a window in the display receiving the flick or drag gesture. Although specific gestures and gesture capture regions in the preceding examples have been associated with corresponding sets of functional results, it is to be appreciated that these associations can be redefined in any manner to produce differing associations between gestures and/or gesture capture regions and/or functional results.
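As a non-authoritative illustration of the mappings described above, the following Java sketch (all names are assumptions) routes a recognized gesture to one of the functional results; the actual associations, as noted, can be redefined in any manner.

```java
// Hypothetical sketch of gesture-to-functional-result routing.
enum Gesture { TAP, DRAG, FLICK, PINCH, SPREAD }

enum FunctionalResult { SELECT_ICON, RELOCATE_WINDOW, MOVE_TO_DEFAULT_LOCATION, RESIZE_WINDOW, SWAP_STACK_TOPS }

final class GestureRouter {
    FunctionalResult route(Gesture gesture, boolean receivedByDifferentScreens) {
        switch (gesture) {
            case TAP:
                return FunctionalResult.SELECT_ICON;              // could also maximize/minimize or reorder a stack
            case DRAG:
                return FunctionalResult.RELOCATE_WINDOW;          // user-chosen position, may span both displays
            case FLICK:
                return FunctionalResult.MOVE_TO_DEFAULT_LOCATION; // default location, not user-configurable
            case PINCH:
            case SPREAD:
                return receivedByDifferentScreens
                        ? FunctionalResult.SWAP_STACK_TOPS        // exchange the tops of the two stacks
                        : FunctionalResult.RESIZE_WINDOW;         // resize within a common display
            default:
                throw new IllegalArgumentException("unknown gesture");
        }
    }
}
```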
Firmware and software:
The memory 508 may store, and the processor 504 may execute, one or more software components. These components can include at least one operating system (OS) 516, an application manager 562, a desktop 566, and/or one or more applications 564a and/or 564b from an application store 560. The OS 516 can include a framework 520, one or more frame buffers 548, one or more drivers 512 (previously described in conjunction with Fig. 2), and/or a kernel 518. The OS 516 can be any software, consisting of programs and data, that manages computer hardware resources and provides common services for the execution of various applications 564. The OS 516 can be any operating system and, at least in some embodiments, one dedicated to mobile devices, including, but not limited to, Linux, ANDROID TM, iPhone OS (IOS TM), WINDOWS PHONE 7 TM, etc. As described herein, the OS 516 is operable to provide functionality to the phone by executing one or more operations.
The applications 564 can be any higher level software that executes particular functionality for the user. The applications 564 can include programs such as email clients, web browsers, texting applications, games, media players, office suites, etc. The applications 564 can be stored in an application store 560, which may represent any memory or data storage, and the management software associated therewith, for storing the applications 564. Once executed, the applications 564 may run in a different area of the memory 508.
The framework 520 may be any software or data that allows the multiple tasks running on the device to interact. In embodiments, at least portions of the framework 520 and the discrete components described hereinafter may be considered part of the OS 516 or of an application 564. However, these portions will be described as part of the framework 520, although those components are not so limited. The framework 520 can include, but is not limited to, a Multi-Display Management (MDM) module 524, a Surface Cache module 528, a Window Management module 532, an Input Management module 536, a Task Management module 540, an Application Model Manager 542, a Display Controller, one or more frame buffers 548, a task stack 552, one or more window stacks 550 (which are logical arrangements of windows and/or desktops in a display area), and/or an event buffer 556.
The MDM module 524 includes one or more modules operable to manage the display of applications or other data on the screens of the device. An embodiment of the MDM module 524 is described in conjunction with Fig. 5B. In embodiments, the MDM module 524 continually receives inputs from the other OS 516 components, such as the drivers 512, and from the applications 564 to determine the state of the device 100. These inputs assist the MDM module 524 in determining how to configure and allocate the displays according to the application's preferences and requirements and the user's actions. Once a determination of the display configuration is made, the MDM module 524 can bind the applications 564 to a display. The configuration may then be provided to one or more other components to generate a window on a display.
The Surface Cache module 528 includes any memory or storage, and the software associated therewith, to store or cache one or more images of windows. A series of active and/or non-active windows (or other display objects, such as a desktop display) can be associated with each display. An active window (or other display object) is currently displayed. A non-active window (or other display object) was opened and displayed at some time but is currently not displayed. To enhance the user experience, before a window transitions from an active state to an inactive state, a "screen shot" of the last generated image of the window (or other display object) can be stored. The Surface Cache module 528 may be operable to store a bitmap of the last active image of a window (or other display object) not currently displayed. Thus, the Surface Cache module 528 stores the images of non-active windows (or other display objects) in a data store.
In embodiments, the Window Management module 532 is operable to manage the active and non-active windows (or other display objects) on each of the displays. Based on information from the MDM module 524, the OS 516, or other components, the Window Management module 532 determines when a window (or other display object) is visible or not active. The Window Management module 532 may then put a non-visible window (or other display object) in a "not active state" and, in conjunction with the Task Management module 540, suspend the application's operation. Further, the Window Management module 532 may, through collaborative interaction with the MDM module 524, assign a display identifier to the window (or other display object) or manage one or more other items of data associated with the window (or other display object). The Window Management module 532 may also provide the stored information to the application 564, the Task Management module 540, or other components interacting with or associated with the window (or other display object). The Window Management module 532 can also associate an input task with a window based on window focus and display coordinates within the motion space.
The Input Management module 536 is operable to manage events that occur with the device. An event is any input into the window environment, for example, a user interaction with the user interface. The Input Management module 536 receives the events and logically stores the events in an event buffer 556. Events can include such user interface interactions as a "down event," which occurs when a screen 104, 108 receives a touch signal from a user, a "move event," which occurs when the screen 104, 108 determines that a user's finger is moving across the screen(s), an "up event," which occurs when the screen 104, 108 determines that the user has stopped touching the screen 104, 108, etc. These events are received, stored, and forwarded to other modules by the Input Management module 536. The Input Management module 536 may also map screen inputs to a motion space, which is the culmination of all physical and virtual displays available on the device.
The motion space is a virtualized space that includes all touch-sensitive displays 110, 114 "tiled" together, mimicking the physical dimensions of the device 100. For example, when the device 100 is unfolded, the size of the motion space may be 960 x 800, which may be the number of pixels in the combined display area of both touch-sensitive displays 110, 114. If a user touches the first touch-sensitive display 110 at position (40, 40), a full-screen window can receive a touch event with location (40, 40). If a user touches the second touch-sensitive display 114 at position (40, 40), the full-screen window can receive a touch event with location (520, 40), because the second touch-sensitive display 114 is to the right of the first touch-sensitive display 110, so the device 100 can offset the touch by the width of the first touch-sensitive display 110, which is 480 pixels. When a hardware event occurs with location information from a driver 512, the framework 520 can up-scale the physical location to the motion space, because the location of the event may differ based on the orientation and state of the device. The motion space may be as described in U.S. Patent Application No. 13/187,026, filed July 20, 2011, entitled "Systems and Methods for Receiving Gesture Inputs Spanning Multiple Input Devices," the entire contents of which are hereby incorporated by reference for all that they teach and for all purposes.
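The coordinate offset described in this example can be illustrated with a short sketch; the class and method names below are assumptions, not part of the referenced application.

```java
// Minimal illustrative sketch: mapping a raw touch on either display into the
// tiled "motion space" described above, assuming two 480-pixel-wide displays.
final class MotionSpaceMapper {
    /** A point in the combined 960 x 800 motion space. */
    record Point(int x, int y) { }

    private static final int DISPLAY_WIDTH = 480;  // width of each touch-sensitive display, in pixels

    /**
     * Maps a raw touch reported by one display's driver into motion-space coordinates.
     *
     * @param displayIndex 0 for the first display 110, 1 for the second display 114
     */
    Point toMotionSpace(int displayIndex, int x, int y) {
        // The second display is tiled to the right of the first, so its x
        // coordinates are offset by the first display's width (480 pixels).
        return new Point(x + displayIndex * DISPLAY_WIDTH, y);
    }
}
// Example: a touch at (40, 40) on the second display maps to (520, 40).
```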
A task can be an application, and a sub-task can be an application component that provides a window with which the user can interact to do something, such as dial the phone, take a photo, send an email, or view a map. Each task may be given a window in which to draw a user interface. The window typically fills a display (for example, touch-sensitive display 110, 114), but may be smaller than the display 110, 114 and float on top of other windows. An application usually consists of multiple sub-tasks that are loosely bound to each other. Typically, one task in an application is specified as the "main" task, which is presented to the user when the application is launched for the first time. Each task can then start another task or sub-task to perform different actions.
The Task Management module 540 is operable to manage the operation of the one or more applications 564 that may be executed by the device. Thus, the Task Management module 540 can receive signals to launch, suspend, terminate, etc. an application or application sub-task stored in the application store 560. The Task Management module 540 may then instantiate one or more tasks or sub-tasks of the application 564 to begin operation of the application 564. Further, the Task Management module 540 may launch, suspend, or terminate a task or sub-task as a result of user input or as a result of a signal from a collaborating framework 520 component. The Task Management module 540 is responsible for managing the lifecycle of applications (tasks and sub-tasks), from when the application is launched to when the application is terminated.
The processing of the Task Management module 540 is facilitated by a task stack 552, which is a logical structure associated with the Task Management module 540. The task stack 552 maintains the state of all tasks and sub-tasks on the device 100. When some component of the operating system 516 requires a task or sub-task to transition in its lifecycle, the OS 516 component can notify the Task Management module 540. The Task Management module 540 may then use identification information to locate the task or sub-task in the task stack 552 and send a signal to the task or sub-task indicating what kind of lifecycle transition the task needs to execute. Informing the task or sub-task of the transition allows the task or sub-task to prepare for the lifecycle state transition. The Task Management module 540 can then execute the state transition for the task or sub-task. In embodiments, the state transition may entail triggering the OS kernel 518 to terminate the task when termination is required.
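A minimal sketch of this lookup-and-transition flow, under assumed names and a deliberately simplified lifecycle, might look as follows.

```java
// Hypothetical sketch: locating a task in the task stack by identifier and
// signaling a lifecycle transition, along the lines described above.
import java.util.ArrayDeque;
import java.util.Deque;

enum Lifecycle { LAUNCHED, ACTIVE, PAUSED, TERMINATED }

final class Task {
    final String id;
    Lifecycle state = Lifecycle.LAUNCHED;

    Task(String id) { this.id = id; }

    void prepareFor(Lifecycle next) { /* e.g. save state before pausing or terminating */ }
}

final class TaskManager {
    private final Deque<Task> taskStack = new ArrayDeque<>();  // top of the stack first

    void push(Task task) { taskStack.push(task); }

    /** Notify the identified task of a lifecycle transition, then execute it. */
    void transition(String taskId, Lifecycle next) {
        for (Task task : taskStack) {
            if (task.id.equals(taskId)) {
                task.prepareFor(next);  // let the task get ready for the transition
                task.state = next;      // execute the state transition
                // a TERMINATED transition could additionally signal the kernel
                return;
            }
        }
    }
}
```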
Further, the Task Management module 540 may suspend the application 564 based on information from the Window Management module 532. Suspending the application 564 may maintain application data in memory but may limit or stop the application 564 from rendering a window or user interface. Once the application becomes active again, the Task Management module 540 can again trigger the application to render its user interface. In embodiments, if a task is suspended, the task may save the task's state in case the task is terminated. In the suspended state, the application task may not receive input because the application window is not visible to the user.
The frame buffer 548 is a logical structure used to render the user interface. The frame buffer 548 can be created and destroyed by the OS kernel 518. However, the Display Controller 544 can write the image data, for the visible windows, into the frame buffer 548. A frame buffer 548 can be associated with one or more screens. The association of a frame buffer 548 with a screen can be controlled dynamically by interaction with the OS kernel 518. A composite display may be created by associating multiple screens with a single frame buffer 548. The graphical data used to render an application's window user interface can then be written to the single frame buffer 548 for the composite display, which is output to the multiple screens 104, 108. The Display Controller 544 can direct an application's user interface to a portion of the frame buffer 548 that is mapped to a particular display 110, 114, thus displaying the user interface on only one screen 104 or 108. The Display Controller 544 can extend this control over user interfaces to multiple applications, controlling the user interfaces for as many displays as are associated with a frame buffer 548 or a portion thereof. This approach compensates for the multiple physical screens 104, 108 in use by the software components above the Display Controller 544.
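The association of multiple screens with a single frame buffer can be sketched as follows; the types, names, and sizes are illustrative assumptions only, not the patent's implementation.

```java
// Hypothetical sketch: a composite frame buffer whose regions are mapped to screens.
import java.util.ArrayList;
import java.util.List;

final class CompositeFrameBuffer {
    record Rect(int x, int y, int w, int h) { }

    private final int width;
    private final int[] pixels;                                // simplified pixel store
    private final List<Rect> screenRegions = new ArrayList<>();

    CompositeFrameBuffer(int width, int height) {
        this.width = width;
        this.pixels = new int[width * height];
    }

    /** Associate another screen with this buffer, e.g. two 480 x 800 regions side by side. */
    void attachScreen(Rect region) {
        screenRegions.add(region);
    }

    /** Write a window's image into the portion of the buffer mapped to one screen. */
    void writeWindow(int screenIndex, int[] windowPixels) {
        Rect r = screenRegions.get(screenIndex);
        for (int row = 0; row < r.h(); row++) {
            System.arraycopy(windowPixels, row * r.w(),
                    pixels, (r.y() + row) * width + r.x(), r.w());
        }
    }
}
```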
The Application Manager 562 is an application that provides a presentation layer for the window environment. Thus, the Application Manager 562 provides the graphical model for rendering by the Task Management module 540. Likewise, the Desktop 566 provides the presentation layer for the application store 560. Thus, the desktop provides a graphical model of a surface having selectable application icons for the applications 564 in the application store 560, which can be provided to the window manager 556 for rendering.
Further, the framework can include an Application Model Manager (AMM) 542. The Application Manager 562 may interface with the AMM 542. In embodiments, the AMM 542 receives state change information from the device 100 regarding the state of applications (running or suspended). The AMM 542 can associate bitmap images from the Surface Cache module 528 with the active (running or suspended) tasks. Further, the AMM 542 can convert the logical window stack maintained in the Task Management module 540 into a linear ("film strip" or "deck of cards") organization that the user perceives when performing window sorting using the off-screen gesture capture region 120. Further, the AMM 542 may provide a list of executing applications to the Application Manager 562.
An embodiment of the MDM module 524 is shown in Fig. 5B. The MDM module 524 is operable to determine the state of the environment for the device, including, but not limited to, the orientation of the device, whether the device 100 is opened or closed, what applications 564 are executing, how the applications 564 are to be displayed, what actions the user is conducting, the tasks being displayed, etc. To configure the display, the MDM module 524 interprets these environmental factors and determines a display configuration, as described in conjunction with Figs. 6A-6J. Then, the MDM module 524 can bind the applications 564 or other device components to the displays. The configuration may then be sent to the Display Controller 544 and/or the other components within the OS 516 to generate the display. The MDM module 524 can include one or more of, but is not limited to, a Display Configuration module 568, a Preferences module 572, a Device State module 574, a Gesture module 576, a Requirements module 580, an Event module 584, and/or a Binding module 588.
The Display Configuration module 568 determines the layout for the display. In embodiments, the Display Configuration module 568 can determine the environmental factors. The environmental factors may be received from one or more other MDM modules 524 or from other sources. The Display Configuration module 568 can then determine, from the list of factors, the best configuration for the display. Some embodiments of the possible configurations and the factors associated therewith are described in conjunction with Figs. 6A-6F.
The Preferences module 572 is operable to determine display preferences for an application 564 or other component. For example, an application can have a preference for single or dual displays. The Preferences module 572 can determine the application's display preference (for example, by inspecting the application's preference settings) and may allow the application 564 to change to a mode (for example, single screen, dual screen, max, etc.) if the device 100 is in a state that can accommodate the preferred mode. However, some user interface policies may disallow a mode even though the mode is available. As the configuration of the device changes, the preferences may be reviewed to determine whether a better display configuration can be achieved for the application 564.
The Device State module 574 is operable to determine or receive the state of the device. The state of the device can be as described in conjunction with Figs. 3A and 3B. The state of the device can be used by the Display Configuration module 568 to determine the configuration for the display. As such, the Device State module 574 may receive inputs and interpret the state of the device. The state information is then provided to the Display Configuration module 568.
The Gesture module 576 is shown as part of the MDM module 524, but, in embodiments, the Gesture module 576 may be a separate framework 520 component, separate from the MDM module 524. In embodiments, the Gesture module 576 is operable to determine whether the user is conducting any actions on any part of the user interface. In alternative embodiments, the Gesture module 576 receives user interface actions from the configurable areas 112, 116 only. The Gesture module 576 can receive, by way of the Input Management module 536, touch events that occur on the configurable areas 112, 116 (or possibly other user interface areas) and may interpret the touch events (using direction, speed, distance, duration, and various other parameters) to determine what kind of gesture the user is performing. When a gesture is interpreted, the Gesture module 576 can initiate the processing of the gesture and, by cooperating with other framework 520 components, can manage the required window animation. The Gesture module 576 collaborates with the Application Model Manager 542 to collect state information as to which applications are running (active or paused) and the order in which applications must appear when a user gesture is performed. The Gesture module 576 may also receive references to bitmaps (from the Surface Cache module 528) and to live windows so that, when a gesture occurs, it can instruct the Display Controller 544 how to move the window(s) across the display 110, 114. Thus, suspended applications may appear to be running when their windows are moved across the display 110, 114.
Further, the Gesture module 576 can receive task information either from the Task Management module 540 or from the Input Management module 536. The gestures may be as defined in conjunction with Figs. 4A through 4H. For example, moving a window causes the display to render a series of display frames that illustrate the window moving. The gesture associated with such a user interface interaction can be received and interpreted by the Gesture module 576. The information about the user gesture is then sent to the Task Management module 540 to modify the display binding of the task.
The Requirements module 580, similar to the Preferences module 572, is operable to determine display requirements for an application 564 or other component. An application can have a set of display requirements that must be observed. Some applications require a particular display orientation. For example, the application "Angry Birds" can only be displayed in landscape orientation. This type of display requirement can be determined or received by the Requirements module 580. As the orientation of the device changes, the Requirements module 580 can again determine the display requirements for the application 564. The Display Configuration module 568 can generate a display configuration that is in accordance with the application display requirements, as provided by the Requirements module 580.
The Event module 584, similar to the Gesture module 576, is operable to determine one or more events occurring with an application or other component that can affect the user interface. Thus, the Event module 584 can receive event information either from the event buffer 556 or from the Task Management module 540. These events can change how the tasks are bound to the displays. The Event module 584 can collect state change information from other framework 520 components and act upon that information. In an example, when the phone is opened or closed, or when an orientation change occurs, a new message may be rendered on a secondary screen. The state change based on the event can be received and interpreted by the Event module 584. The information about the event may then be sent to the Display Configuration module 568 to modify the configuration of the display.
The Binding module 588 is operable to bind the applications 564 or other components to the configuration determined by the Display Configuration module 568. A binding associates, in memory, the display configuration for each application with the display and mode of the application. Thus, the Binding module 588 can associate an application with a display configuration for the application (for example, landscape, portrait, multi-screen, etc.). Then, the Binding module 588 may assign a display identifier to the display. The display identifier associates the application with a particular display of the device 100. This binding is then stored and provided to the Display Controller 544, the other components of the OS 516, or other components to properly render the display. The binding is dynamic and can change or be updated based on configuration changes associated with events, gestures, state changes, application preferences or requirements, etc.
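A minimal sketch of such a binding, using assumed names and only the associations described above, might be:

```java
// Hypothetical sketch: associating an application with a display configuration
// and a display identifier, as described for the Binding module 588.
import java.util.HashMap;
import java.util.Map;

enum DisplayConfig { PORTRAIT, LANDSCAPE, MULTI_SCREEN }

/** One binding: an application, its display configuration, and a display identifier. */
record Binding(String appId, DisplayConfig config, int displayId) { }

final class BindingModule {
    private final Map<String, Binding> bindings = new HashMap<>();

    /**
     * Bind (or, since bindings are dynamic, rebind) an application to a display
     * configuration and a display identifier; the stored binding would then be
     * provided to the display controller for rendering.
     */
    Binding bind(String appId, DisplayConfig config, int displayId) {
        Binding binding = new Binding(appId, config, displayId);
        bindings.put(appId, binding);
        return binding;
    }
}
```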
User interface configurations:
With reference now to Figs. 6A-J, various types of output configurations made possible by the device 100 will be described hereinafter.
Figs. 6A and 6B depict two different output configurations of the device 100 in a first state. Specifically, Fig. 6A depicts the device 100 in a closed portrait state 304, where the data is displayed on the primary screen 104. In this example, the device 100 displays data via the touch-sensitive display 110 in a first portrait configuration 604. As can be appreciated, the first portrait configuration 604 may display only a desktop or an operating system home screen. Alternatively, one or more windows may be presented in a portrait orientation while the device 100 is displaying data in the first portrait configuration 604.
Fig. 6B depicts the device 100 still in the closed portrait state 304, but with data displayed on the secondary screen 108 instead. In this example, the device 100 displays data via the touch-sensitive display 114 in a second portrait configuration 608.
Similar or different data may be displayed in either the first or second portrait configuration 604, 608. It may also be possible to transition between the first portrait configuration 604 and the second portrait configuration 608 by providing the device 100 with a user gesture (for example, a double tap gesture), a menu selection, or other means. Other suitable gestures may also be employed to transition between configurations. Furthermore, it may also be possible to transition the device 100 from the first or second portrait configuration 604, 608 to any other configuration described herein, depending upon which state the device 100 is moved to.
An alternative output configuration may be accommodated by the device 100 being in a second state. Specifically, Fig. 6C depicts a third portrait configuration, where data is displayed simultaneously on both the primary screen 104 and the secondary screen 108. The third portrait configuration may be referred to as a Dual-Portrait (PD) output configuration. In the PD output configuration, the touch-sensitive display 110 of the primary screen 104 depicts data in the first portrait configuration 604 while the touch-sensitive display 114 of the secondary screen 108 depicts data in the second portrait configuration 608. The simultaneous presentation of the first portrait configuration 604 and the second portrait configuration 608 may occur when the device 100 is in the open portrait state 320. In this configuration, the device 100 may display one application window in one display 110 or 114, two application windows (one in each display 110 and 114), one application window and one desktop, or one desktop. Other configurations are possible. It should be appreciated that it may also be possible to transition the device 100 from the simultaneous display of configurations 604, 608 to any other configuration described herein, depending upon which state the device 100 is moved to. Furthermore, while in this state, an application's display preference may place the device into a bilateral mode, in which both displays are active to display different windows of the same application. For example, a camera application may display a viewfinder and controls on one side, while the other side displays a mirrored preview that can be seen by the photo subjects. Games involving simultaneous play by two players may also take advantage of the bilateral mode.
Figs. 6D and 6E depict two further output configurations of the device 100 in a third state. Specifically, Fig. 6D depicts the device 100 in a closed landscape state 340, where the data is displayed on the primary screen 104. In this example, the device 100 displays data via the touch-sensitive display 110 in a first landscape configuration 612. Much like the other configurations described herein, the first landscape configuration 612 may display a desktop, a home screen, one or more windows displaying application data, or the like.
Fig. 6E depicts the device 100 still in the closed landscape state 340, but with data displayed on the secondary screen 108 instead. In this example, the device 100 displays data via the touch-sensitive display 114 in a second landscape configuration 616. Similar or different data may be displayed in either the first or second landscape configuration 612, 616. It may also be possible to transition between the first landscape configuration 612 and the second landscape configuration 616 by providing the device 100 with one or both of a twist and tap gesture or a flick and slide gesture. Other suitable gestures may also be employed to transition between configurations. Furthermore, it may also be possible to transition the device 100 from the first or second landscape configuration 612, 616 to any other configuration described herein, depending upon which state the device 100 is moved to.
Fig. 6F depicts a third landscape configuration, where data is displayed simultaneously on both the primary screen 104 and the secondary screen 108. The third landscape configuration may be referred to as a Dual-Landscape (LD) output configuration. In the LD output configuration, the touch-sensitive display 110 of the primary screen 104 depicts data in the first landscape configuration 612, and the touch-sensitive display 114 of the secondary screen 108 depicts data in the second landscape configuration 616. The simultaneous presentation of the first landscape configuration 612 and the second landscape configuration 616 may occur when the device 100 is in the open landscape state 348. It should be appreciated that it may also be possible to transition the device 100 from the simultaneous display of configurations 612, 616 to any other configuration described herein, depending upon which state the device 100 is moved to.
Figs. 6G and 6H depict two views of the device 100 in yet another state. Specifically, the device 100 is depicted as being in an easel state 312. Fig. 6G shows that a first easel output configuration 618 may be displayed on the touch-sensitive display 110. Fig. 6H shows that a second easel output configuration 620 may be displayed on the touch-sensitive display 114. The device 100 may be configured to depict either the first easel output configuration 618 or the second easel output configuration 620 individually. Alternatively, both easel output configurations 618, 620 may be presented simultaneously. In some embodiments, the easel output configurations 618, 620 may be similar or identical to the landscape output configurations 612, 616. The device 100 may also be configured to display one or both of the easel output configurations 618, 620 while in the modified easel state 316. It should be appreciated that simultaneous utilization of the easel output configurations 618, 620 may facilitate two-person games (for example, chess, Chinese checkers, etc.), multi-user conferences where two or more users share the same device 100, and other applications. As can be appreciated, it may also be possible to transition the device 100 from the display of one or both configurations 618, 620 to any other configuration described herein, depending upon which state the device 100 is moved to.
Fig. 6I depicts yet another output configuration that may be accommodated while the device 100 is in the open portrait state 320. Specifically, the device 100 may be configured to present a single continuous image across both touch-sensitive displays 110, 114 in a portrait configuration referred to herein as a Portrait-Max (PMax) configuration 624. In this configuration, data (for example, a single image, application, window, icon, video, etc.) may be split and displayed partially on one of the touch-sensitive displays while the other portion of the data is displayed on the other touch-sensitive display. The PMax configuration 624 may facilitate a larger display and/or better resolution for displaying a particular image on the device 100. Similar to the other output configurations, it may be possible to transition the device 100 from the PMax configuration 624 to any other output configuration described herein, depending upon which state the device 100 is moved to.
Fig. 6J depicts still another output configuration that may be accommodated while the device 100 is in the open landscape state 348. Specifically, the device 100 may be configured to present a single continuous image across both touch-sensitive displays 110, 114 in a landscape configuration referred to herein as a Landscape-Max (LMax) configuration 628. In this configuration, data (for example, a single image, application, window, icon, video, etc.) may be split and displayed partially on one of the touch-sensitive displays while the other portion of the data is displayed on the other touch-sensitive display. The LMax configuration 628 may facilitate a larger display and/or better resolution for displaying a particular image on the device 100. Similar to the other output configurations, it may be possible to transition the device 100 from the LMax configuration 628 to any other output configuration described herein, depending upon which state the device 100 is moved to.
As shown in Figs. 7A and 7B, the device 100 manages desktops and/or windows with at least one window stack 700. The window stack 700 is a logical arrangement of active and/or inactive windows or display objects for a multi-screen device. For example, as shown in Figs. 7A and 7B, the window stack 700 may be logically similar to a deck of cards or a stack of bricks, where one or more windows or display objects (for example, desktops) are arranged in order. An active window is a window currently being displayed on at least one of the touch-sensitive displays 110, 114. For example, window 1 708 is an active window and is displayed on at least one of the touch-sensitive displays 110, 114. In the embodiment shown in Fig. 7A, the device 100 is in the closed state 304 and displays window 1 708 in configuration 608. An inactive window is a window that was opened and displayed but is now "behind" an active window and is not being displayed. In embodiments, an inactive window may belong to an application that is suspended, and thus the window is not displaying active content. For example, window 2 712 and window 3 716 are inactive windows.
A window stack 700 may have various arrangements or organizational structures. In the embodiment shown in Fig. 7A, with the device 100 in the closed state 304 in the second portrait configuration, the device 100 includes a first stack 704 associated with the first touch-sensitive display 114. Thus, each touch-sensitive display 114 can have a window stack 704 associated with it. In other embodiments, there is a single window stack that encompasses the whole composite display. The composite display is a logical structure defining the entire display space that encompasses both touch-sensitive displays 110, 114. The device 100 can have a single stack for the composite display, in which windows or display objects are sized to occupy some or all of the composite display. Thus, the stack 704 may represent a portion of a larger composite window stack, part of which is not displayed because the device 100 is in the closed state 304.
In Fig. 7B, the two window stacks (or the two portions of a window stack) 704, 724 can have different numbers of windows or display objects arranged in each stack 704, 724. Further, the two window stacks 704, 724 can also be identified differently and managed separately. As shown in Fig. 7A, the first window stack 704 can be arranged in order from a first window 708 to a next window 712 to a last window 716 and finally to a desktop 720, which, in embodiments, is at the "bottom" of the window stack 704. In embodiments, the desktop 720 is not always at the "bottom," because application windows can be arranged in the window stack below the desktop 720, and the desktop 720 can be brought to the "top" of the stack, above the other windows, during a desktop reveal or other change. For example, as shown in Fig. 7B, the device 100 transitions to the open state 320 and changes to the display configuration shown in Fig. 6C. The touch-sensitive display 110 then has no window associated with it; thus, the desktop 720 is displayed for the touch-sensitive display 110. Accordingly, the second stack 724 can include the desktop 720, which, in embodiments, is a single desktop area that sits under all the windows in window stack 704 and window stack 724. A logical data structure for managing the two window stacks 704, 724, or a single window stack having two portions 704, 724, may be as described in conjunction with Fig. 8.
The arrangement of the window stack 704 after the device is opened is shown in Figs. 7C through 7E. The window stack 704/724 is shown in three "elevation" views. In Fig. 7C, the top of the window stack 704/724 is shown. Two adjacent sides of the window stack 704/724 are shown in Figs. 7D and 7E. In this embodiment, the window stack 704/724 resembles a stack of bricks: the windows are stacked on each other. Viewed from the top of the window stack 704/724 in Fig. 7C, only the windows at the top of the window stack 704/724 are seen in the different portions of the composite display 728. A desktop 720 or a window can occupy part or all of the composite display 728. Fig. 7C shows the composite display 728, which contains or encompasses the entire display area of both touch-sensitive displays 110, 114. The size of the composite display 728 can change based on the orientation of the device 100. For example, as shown in Fig. 7A, when the device is in the closed state, the composite display 728 of the device 100 may include only the area of one of the touch-sensitive displays 110 or 114. When the device 100 is opened, as shown in Figs. 7B and 7C, the composite display 728 expands to include both touch-sensitive displays 110, 114. When the composite display 728 is resized, some windows or display objects associated with the composite display 728 may also be resized. In embodiments, one such display object can be the desktop 720, which can be resized when the device is opened so as to fill the composite display 728.
In the illustrated embodiment, the desktop 720 is the lowest display object, window, or "brick" in the window stack 704/724. On top of it, window 1 708, window 2 712, and window 3 716 are stacked. Window 1 708, window 2 712, and window 3 716 occupy only a portion of the composite display 728. Thus, another portion of the stack 724 includes only the desktop 718. In any portion of the composite display 728, only the topmost window or display object is actually rendered and displayed. Thus, as shown in the top view in Fig. 7C, window 1 708 and the desktop 718 are displayed at the top of different portions of the window stack 704/724. A window can be resized to occupy only a portion of the composite display 728 in order to "reveal" windows lower in the window stack 704. For example, the desktop 718 is lower in the stack than window 1 708, window 2 712, and window 3 716, but is still displayed. This arrangement of windows and desktop occurs when the device, which is displaying the top window in window stack 704, is opened.
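The rule that only the topmost window or display object is rendered for any given portion of the composite display can be sketched as follows (assumed names and simplified geometry, not the patent's implementation).

```java
// Hypothetical sketch: resolving which stack entry is visible at a point of the
// composite display; the first entry from the top whose bounds cover the point wins.
import java.util.List;

record Bounds(int x, int y, int w, int h) {
    boolean contains(int px, int py) {
        return px >= x && px < x + w && py >= y && py < y + h;
    }
}

record StackEntry(String name, Bounds bounds) { }

final class WindowStack {
    // Ordered from the top of the stack to the bottom (desktop last).
    private final List<StackEntry> entries;

    WindowStack(List<StackEntry> entries) { this.entries = entries; }

    /** Returns the entry that should be displayed at a point of the composite display. */
    StackEntry topmostAt(int x, int y) {
        for (StackEntry e : entries) {
            if (e.bounds().contains(x, y)) {
                return e;   // topmost covering entry is the one rendered
            }
        }
        return null;        // no window or desktop covers this point
    }
}
```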
Where and how the desktop 718 is positioned in the stack can be a function of the orientation of the device 100, the context of what programs, functions, software, etc. are being executed on the device 100, how the stack is positioned when the device 100 is opened, etc. When the device 100 is opened, the logical data structures associated with the desktop 718 or the other windows may not change, but the logical data structures can determine how the windows and desktop are displayed. When a user interface or other event or task changes the arrangement of the stack, the logical structure of the windows or desktop can be changed to reflect the change in arrangement.
Figs. 7F through 7J show another embodiment of a window stack arrangement 724 that, upon the device being opened, changes from an arrangement different from that of Figs. 7A through 7C. In the embodiment shown in Fig. 7F, the device 100 is in the closed state 304 and displays window 1 708 in configuration 604. Window 2 712 and window 3 716 are inactive windows. In the embodiment shown in Fig. 7F, with the device 100 in the closed state 304 in the first portrait configuration 604, the device 100 includes a second stack 724 associated with the second touch-sensitive display 110. Thus, the touch-sensitive display 110 can have a window stack 724 associated with it. In other embodiments, there is a single window stack that encompasses the whole composite display. The device 100 can have a single stack for the composite display, in which windows or display objects are sized to occupy some or all of the composite display. Thus, the stack 724 may represent a portion of a larger composite window stack, part of which is not displayed because the device 100 is in the closed state 304.
In Fig. 7G, the two window stacks (or the two portions of a window stack) 704, 724 can have different numbers of windows or display objects arranged in each stack 704, 724. Further, the two window stacks 704, 724 can also be identified differently and managed separately. As shown in Fig. 7F, the second window stack 724 can be arranged in order from a first window 708 to a next window 712 to a last window 716 and finally to a desktop 720, which, in embodiments, is at the "bottom" of the window stack 724. In embodiments, the desktop 720 is not always at the "bottom," because application windows can be arranged in the window stack below the desktop 720, and the desktop 720 can be brought to the "top" of the stack, above the other windows, during a desktop reveal or other change. For example, as shown in Fig. 7G, the device 100 transitions to the open state 320 and changes to the display configuration shown in Fig. 6C. The touch-sensitive display 114 then has no window associated with it; thus, the desktop 720 is displayed for the touch-sensitive display 114. Accordingly, the other stack 704 can include the desktop 720, which, in embodiments, is a single desktop area that sits under all the windows in window stack 704 and window stack 724. A logical data structure for managing the two window stacks 704, 724, or a single window stack having two portions 704, 724, may be as described in conjunction with Fig. 8.
The arrangement of the window stack 724 after the device is opened is shown in Figs. 7H through 7J. The window stack 704/724 is shown in three "elevation" views. In Fig. 7H, the top of the window stack 704/724 is shown. Two adjacent "sides" of the window stack 704/724 are shown in Figs. 7I and 7J. In this embodiment, the window stack 704/724 resembles a stack of bricks: the windows are stacked on each other. Viewed from the top of the window stack 704/724 in Fig. 7H, only the windows or display objects at the top of the window stack 704/724 are seen in the different portions of the composite display 728. A desktop 720 or a window can occupy part or all of the composite display 728.
In the illustrated embodiment, the desktop 720 is the lowest display object, window, or "brick" in the window stack 704/724. On top of it, window 1 708, window 2 712, and window 3 716 are stacked. Window 1 708, window 2 712, and window 3 716 occupy only a portion of the composite display 728. Thus, another portion of the stack 724 includes only the desktop 718. In any portion of the composite display 728, only the topmost window or display object is actually rendered and displayed. Thus, as shown in the top view in Fig. 7H, window 1 708 and the desktop 720 are displayed at the top of different portions of the window stack 704/724. A window can be resized to occupy only a portion of the composite display 728 in order to "reveal" windows lower in the window stack 704. For example, the desktop 718 is lower in the stack than window 1 708, window 2 712, and window 3 716, but is still displayed. This arrangement of windows and desktop occurs when the device, which is displaying the top window in window stack 704, is opened.
As before, where and how the desktop 718 is positioned in the stack can be a function of the orientation of the device 100, the context of what programs, functions, software, etc. are being executed on the device 100, how the stack is positioned when the device 100 is opened, etc. When the device 100 is opened, the logical data structures associated with the desktop 718 or the other windows may not change, but the logical data structures can determine how the windows and desktop are displayed. When a user interface or other event or task changes the arrangement of the stack, the logical structure of the windows or desktop can be changed to reflect the change in arrangement.
A logical data structure 800 for managing the arrangement of windows or desktops in a window stack is shown in Fig. 8. The logical data structure 800 can be any data structure used to store data, whether an object, a record, a file, etc. The logical data structure 800 can be stored in any type of database or data storage system, regardless of protocol or standard. In embodiments, the logical data structure 800 includes one or more portions, fields, attributes, etc. that store data in a logical arrangement that allows easy storage and retrieval of the information. Hereinafter, these one or more portions, fields, attributes, etc. shall be described simply as fields. The fields can store a window identifier 804, dimensions 808, a window stack position identifier 812, a display identifier 816, and/or an active indicator 820. Each window in a window stack can have an associated logical data structure 800. While only a single logical data structure 800 is shown in Fig. 8, there may be more or fewer logical data structures 800 used with the window user interface 700 (based on the number of windows or desktops in the stack), as represented by ellipses 828. Further, there may be more or fewer fields than those shown in Fig. 8, as also represented by ellipses 828.
The window identifier 804 can include any identifier (ID) that uniquely identifies the associated window or display object in relation to other windows or display objects in the window stack. The window identifier 804 can be a globally unique identifier (GUID), a numeric ID, an alphanumeric ID, or another type of identifier. In embodiments, the window identifier 804 can be one, two, or any number of digits based on the number of windows or display objects that can be opened. In alternative embodiments, the size of the window identifier 804 may change based on the number of windows or display objects that are open. While a window or display object is open, the window identifier 804 can be static and remain unchanged.
The dimensions 808 can include the dimensions of the window or display object in the composite display 704. For example, the dimensions 808 can include the coordinates of two or more corners of the window or display object, or can include one coordinate together with the width and height of the window or display object. These dimensions 808 can delineate what portion of the composite display 704 the window or display object may occupy, which may be the entire composite display 704 or only a portion of it. For example, as shown in Figs. 7C and 7H, window 1 708 can have dimensions 808 indicating that the window 1 708 will occupy only a portion of the display area of the composite display 728. As windows or display objects are moved within or inserted into the window stack, the dimensions 808 can change.
The stack position identifier 812 can be any identifier that can identify the position of the window or display object in the stack, or the stack position identifier 812 may be inferred from the control record for the window (such as a list or a stack) within the data structure. The stack position identifier 812 can be a GUID, a numeric ID, an alphanumeric ID, or another type of identifier. Each window or display object can include a stack position identifier 812. For example, as shown in Fig. 7A, window 1 708 in stack 1 704 can have a stack position identifier 812 identifying the window 708 as the first window in the stack 704 and as the active window. Similarly, window 2 712 can have a stack position identifier 812 indicating that the window 2 712 is the second window in the stack 704. Thus, depending on the type of stack, the stack position identifier 812 can represent the position of a window or display object in the stack.
The display identifier 816 can identify that the window or display object is associated with a particular display (such as the first display 110 or the second display 114) or with the composite display 728 composed of both displays. While a multi-stack system, as shown in Fig. 7A, may not require a display identifier 816, the display identifier 816 can indicate whether a window or display object in the serial stack of Fig. 7 is displayed on a particular display. Thus, the desktop 720 can have two portions, as shown in Fig. 7C. The first portion can have a display identifier for the first display 110, while the second portion can have a display identifier 816 for the second display 114. In alternative embodiments, however, the desktop 720 can have a single display identifier 816 identifying the composite display 728.
Similar to the display identifier 816, an active indicator 820 may not be needed in the dual-stack system of Fig. 7A, since the window or display object in stack position 1 is active and displayed. In alternative embodiments, the active indicator 820 can indicate which window(s) in the stack are being displayed. Thus, window 1 708 can be displayed and have an active indicator 820. The active indicator 820 can be a simple flag or bit that represents whether the window or display object is active or displayed.
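To make the arrangement of these fields concrete, the following Java listing is a minimal, purely illustrative sketch of the logical data structure 800; the class name, field types, and constructor are assumptions introduced only for illustration and are not part of the disclosure.

    import java.util.UUID;

    // Illustrative sketch (assumed names/types) of logical data structure 800:
    // one record per window or display object in a window stack.
    public class WindowRecord {
        final UUID windowId;         // window identifier 804: unique ID for the window/display object
        int x, y, width, height;     // dimensions 808: one corner coordinate plus width and height
        int stackPosition;           // stack position identifier 812: position within the stack
        int displayId;               // display identifier 816: display (or composite display) association
        boolean active;              // active indicator 820: flag marking whether the entry is displayed

        public WindowRecord(UUID windowId) {
            this.windowId = windowId;
        }
    }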
An embodiment of a method 900 for changing a window stack is shown in Fig. 9. While a general order for the steps of the method 900 is shown in Fig. 9, the method 900 generally starts with a start operation 904 and ends with an end operation 924. The method 900 can include more or fewer steps, or the order of the steps can be arranged differently than shown in Fig. 9. The method 900 can be executed as a set of computer-executable instructions executed by a computer system and encoded or stored on a computer-readable medium. Hereinafter, the method 900 shall be explained with reference to the systems, components, modules, software, data structures, and user interfaces described in conjunction with Figs. 1-8.
The multi-screen device 100 can receive an orientation change, as described in Figs. 3A-3B, that transitions the device 100 from the closed state 304 to the open state 320, in step 908. The orientation change can be detected and received as a signal from a hardware input, such as a Hall effect sensor, a timer, etc. The orientation change can be received by the task management module 540 and sent to the multi-display management module 524. The multi-display management module 524 can interpret this change to transition the configuration of the displays from the closed portrait displays 604, 608 to the open portrait display (as shown in Fig. 6C), or from the closed landscape displays 612, 616 to the open landscape configuration (as shown in Fig. 6F, as described in conjunction with Figs. 6A to 6F). In embodiments, the task management module 540 places the user interface interaction in the task stack 552 to be acted upon by the multi-display management module 524. Further, the task management module 540 waits for information from the multi-display management module 524 to send an instruction to the window management module 532 to create the window in the window stack 704.
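As a rough, assumed illustration of how the signal in step 908 could be routed between the modules just described, the Java fragment below forwards a detected state change from a task manager to a multi-display manager; the interface and method names are hypothetical and chosen only for this sketch.

    // Illustrative sketch (assumed names): an orientation change detected by a
    // hardware input (e.g., a Hall effect sensor) is received by the task
    // manager and forwarded to the multi-display manager for interpretation.
    enum DeviceState { CLOSED, OPEN }

    interface MultiDisplayManager {
        void onDeviceStateChanged(DeviceState newState);
    }

    class TaskManager {
        private final MultiDisplayManager multiDisplayManager;

        TaskManager(MultiDisplayManager multiDisplayManager) {
            this.multiDisplayManager = multiDisplayManager;
        }

        // Called when the hinge/state sensor reports that the device was opened.
        void onOrientationChange(DeviceState newState) {
            // Forward the change so the display configuration can be
            // re-evaluated (e.g., closed portrait to open portrait).
            multiDisplayManager.onDeviceStateChanged(newState);
        }
    }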
Upon receiving the instruction from the task management module 540, the multi-display management module 524 determines whether the desktop 786 should be revealed, in step 912. In embodiments, before the device 100 is opened, the desktop 786 may be at the bottom of the window stack 704/724. However, the device 100 may be displaying the last window in the stack 704/724. In other words, the top window in the stack 704/724 is displayed, and no window fills the position of the display newly revealed in the open state. For example, in Fig. 7A, there is no window to the "left" of window 1. Thus, as shown in Fig. 7B, the multi-display management module 524 needs to present the desktop 720 on the display 110. Because the desktop 720 is generally expanded across the composite display 728, the desktop 720 is always shown on the opened device 100 unless another window covers the desktop 720. As shown in Figs. 7B and 7F, since no window covers the desktop 720, the multi-display management module 524 determines to display the desktop 720 on the newly opened display.
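The check in step 912 can be sketched as follows: if no window in the stack already occupies the display revealed by opening the device, the desktop should be shown there. This Java fragment is a minimal sketch under that assumption and reuses the hypothetical WindowRecord fields introduced above; the class and method names are illustrative only.

    import java.util.List;

    // Illustrative sketch (assumed names) of step 912: decide whether the
    // desktop should be revealed on the display that becomes active on open.
    final class DesktopRevealCheck {
        static boolean shouldRevealDesktop(List<WindowRecord> stack, int newlyRevealedDisplayId) {
            for (WindowRecord w : stack) {
                // If an active window already occupies the newly revealed
                // display, the desktop stays covered there.
                if (w.active && w.displayId == newlyRevealedDisplayId) {
                    return false;
                }
            }
            // No window fills the newly revealed display, so show the desktop.
            return true;
        }
    }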
In embodiments, the device state module 574 of the multi-display management module 524 can determine how the device is oriented or in what state the device is, for example, open, closed, portrait, etc. Further, the preferences module 572 and/or the requirements module 580 can determine how the desktop 786 is to be displayed based on preferences for the desktop 786. The display configuration module 568 can then use the input from the device state module 574, the preferences module 572, and/or other framework components 520 to evaluate the current window stack 704/724. Presenting the desktop 720 in the newly revealed display generally does not affect the other windows in the window stack 704/724, because no other window is moved. However, the dimensions 808 of the desktop 720 can change as the desktop 720 is modified to fill the composite display 728, the desktop 720 being expanded upon opening the device 100.
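A very small sketch, under assumed names, of how the device-state and preference inputs could feed the display-configuration decision; none of these types correspond to actual code disclosed in the figures.

    // Illustrative sketch (assumed names): device state and desktop preferences
    // feed the display-configuration decision for the desktop.
    enum DesktopPlacement { FILL_COMPOSITE_DISPLAY, SINGLE_DISPLAY }

    class DisplayConfigurationModule {
        DesktopPlacement evaluateDesktop(boolean deviceOpen, boolean desktopSpansCompositeByPreference) {
            if (deviceOpen && desktopSpansCompositeByPreference) {
                // Expanding the desktop to the composite display does not move
                // the other windows in the stack; only the desktop's dimensions change.
                return DesktopPlacement.FILL_COMPOSITE_DISPLAY;
            }
            return DesktopPlacement.SINGLE_DISPLAY;
        }
    }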
In embodiments, a visibility algorithm determines which windows/display objects are at the top of the stack 704/724 for all portions of the composite display 728. For example, as shown in Fig. 7B or 7F, the visibility algorithm determines that, after the device 100 is opened, the desktop 786 is revealed in one portion of the stack 704. Further, as shown in Fig. 7B or 7F, window 1 708 is shown in the other portion of the stack 704. Having determined where to reveal the desktop 720, the display configuration module 568 can change the dimensions 808, the display identifier 816, and/or the stack position identifier 812 of the desktop 720. The multi-display management module 524 can then send the dimensions 808, the display identifier 816, and/or the stack position identifier 812 back to the task management module 540.
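A minimal sketch of such a visibility pass, assuming the stack is kept ordered from top to bottom and each portion of the composite display is keyed by a display identifier; the method name and signature are assumptions, and the WindowRecord fields are those from the earlier sketch.

    import java.util.HashMap;
    import java.util.List;
    import java.util.Map;
    import java.util.Set;

    // Illustrative sketch (assumed names) of the visibility algorithm: for each
    // portion of the composite display, find the entry nearest the top of the
    // stack that is associated with that portion.
    final class VisibilityPass {
        static Map<Integer, WindowRecord> topOfStackPerDisplay(
                List<WindowRecord> stackTopToBottom, Set<Integer> displayIds) {
            Map<Integer, WindowRecord> visible = new HashMap<>();
            for (int displayId : displayIds) {
                for (WindowRecord w : stackTopToBottom) {
                    if (w.displayId == displayId) {
                        visible.put(displayId, w);   // first hit from the top is the visible one
                        break;
                    }
                }
            }
            return visible;
        }
    }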
In embodiments, the task management module 540 sends the dimensions 808, the display identifier 816, and/or the stack position identifier 812, and/or other information, together with an instruction to render the desktop 786, to the window management module 532. The window management module 532 and the task management module 540 can change the logical data structure 800, in step 920. The task management module 540 and the window management module 532 can each manage a copy of the window stack 704/724. These copies of the window stack 704/724 can be synchronized, or kept similar, through communications between the window management module 532 and the task management module 540. Thus, based on the information determined by the multi-display management module 524, the window management module 532 and the task management module 540 can change the dimensions 808, the display identifier 816, and/or the stack position identifier 812 of the desktop 720 and of one or more windows. The logical data structure 800 can then be stored by both the window management module 532 and the task management module 540. The window management module 532 and the task management module 540 can thereafter manage the window stack 704/724 and the logical data structure 800.
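As a hedged illustration of step 920, the fragment below writes the values determined by the multi-display management module into the desktop's record and mirrors the change into a second copy of the stack; matching by window identifier is an assumption made for this sketch, not a mechanism stated in the disclosure.

    import java.util.List;

    // Illustrative sketch (assumed names) of step 920: apply the revised
    // dimensions, display identifier, and stack position to the desktop's
    // record, then keep a second copy of the stack consistent.
    final class StackUpdate {
        static void applyDesktopUpdate(WindowRecord desktop, int x, int y, int width, int height,
                                       int displayId, int stackPosition, List<WindowRecord> otherCopy) {
            desktop.x = x;
            desktop.y = y;
            desktop.width = width;
            desktop.height = height;
            desktop.displayId = displayId;
            desktop.stackPosition = stackPosition;

            // Mirror the update into the other module's copy by matching on the
            // window identifier (assumed shared between the two copies).
            for (WindowRecord w : otherCopy) {
                if (w.windowId.equals(desktop.windowId)) {
                    w.x = x;
                    w.y = y;
                    w.width = width;
                    w.height = height;
                    w.displayId = displayId;
                    w.stackPosition = stackPosition;
                }
            }
        }
    }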
Exemplary systems and methods of this disclosure have been described in relation to window stacks associated with a multi-screen device. However, to avoid unnecessarily obscuring the present disclosure, the preceding description omits a number of known structures and devices. This omission is not to be construed as a limitation of the scope of the claims. Specific details are set forth to provide an understanding of the present disclosure. It should, however, be appreciated that the present disclosure may be practiced in a variety of ways beyond the specific details set forth herein.
Furthermore, while the exemplary aspects, embodiments, and/or configurations illustrated herein show the various components of the system collocated, certain components of the system can be located remotely, at distant portions of a distributed network (such as a LAN and/or the Internet), or within a dedicated system. Thus, it should be appreciated that the components of the system can be combined into one or more devices, such as a computer, laptop, netbook, tablet, smart phone, mobile device, etc., or collocated on a particular node of a distributed network, such as an analog and/or digital telecommunications network, a packet-switched network, or a circuit-switched network. It will be appreciated from the preceding description, and for reasons of computational efficiency, that the components of the system can be arranged at any location within a distributed network of components without affecting the operation of the system. For example, the various components can be located in a switch such as a PBX and media server, a gateway, in one or more communications devices, at one or more users' premises, or some combination thereof. Similarly, one or more functional portions of the system could be distributed between a telecommunications device and an associated computing device.
Furthermore, it should be appreciated that the various links connecting the elements can be wired or wireless links, or any combination thereof, or any other known or later-developed element(s) capable of supplying and/or communicating data to and from the connected elements. These wired or wireless links can also be secure links and may be capable of communicating encrypted information. Transmission media used as links can be any suitable carrier for electrical signals, including coaxial cables, copper wire, and fiber optics, and may take the form of acoustic or light waves, such as those generated during radio-wave and infra-red data communications.
Also, while the flowcharts have been discussed and illustrated in relation to a particular sequence of events, it should be appreciated that changes, additions, and omissions to this sequence can occur without materially affecting the operation of the disclosed embodiments, configurations, and aspects.
A number of variations and modifications of the disclosure can be used. It would be possible to provide for some features of the disclosure without providing others. For example, in one alternative embodiment, the window stack resembles a carousel or rolodex rather than a deck of cards. Thus, windows can circulate from one touch-sensitive display 110 to another touch-sensitive display 114. For example, a window can be pushed to the right and end up at the end of the stack behind another window. If the stack continues to move to the right, the window will eventually appear on the second touch-sensitive display even though it was opened on the first touch-sensitive display 110. These movements and changes in the stack can be managed using the methods and logical data structures discussed above, as illustrated in the sketch below. In other alternative embodiments, there can be other arrangements of the window stack.
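A brief sketch of the carousel-style alternative, assuming the stack is held in a double-ended queue that is rotated as windows are pushed across the displays; the class and method names are illustrative assumptions, not an arrangement shown in the figures.

    import java.util.ArrayDeque;
    import java.util.Deque;

    // Illustrative sketch (assumed names) of a carousel-style window stack:
    // pushing the stack to the right wraps the last entry around to the front,
    // so a window opened on one display eventually appears on the other.
    final class CarouselStack<T> {
        private final Deque<T> windows = new ArrayDeque<>();

        void add(T window) {
            windows.addLast(window);
        }

        // Rotate one step to the right: the tail wraps around to the head.
        void rotateRight() {
            if (!windows.isEmpty()) {
                windows.addFirst(windows.removeLast());
            }
        }
    }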
In yet another embodiment, the systems and methods of this disclosure can be implemented in conjunction with a special purpose computer, a programmed microprocessor or microcontroller and peripheral integrated circuit element(s), an ASIC or other integrated circuit, a digital signal processor, a hard-wired electronic or logic circuit such as a discrete element circuit, a programmable logic device or gate array (such as a PLD, PLA, FPGA, or PAL), a special purpose computer, any comparable means, or the like. In general, any device(s) capable of implementing the methodology illustrated herein can be used to implement the various aspects of this disclosure. Exemplary hardware that can be used for the disclosed embodiments, configurations, and aspects includes computers, handheld devices, telephones (e.g., cellular, Internet-enabled, digital, analog, hybrids, and others), and other hardware known in the art. Some of these devices include processors (e.g., a single or multiple microprocessors), memory, nonvolatile storage, input devices, and output devices. Furthermore, alternative software implementations including, but not limited to, distributed processing or component/object distributed processing, parallel processing, or virtual machine processing can also be constructed to implement the methods described herein.
In yet another embodiment, the disclosed methods may be readily implemented in conjunction with software using object or object-oriented software development environments that provide portable source code that can be used on a variety of computer or workstation platforms. Alternatively, the disclosed system may be implemented partially or fully in hardware using standard logic circuits or VLSI design. Whether software or hardware is used to implement the systems in accordance with this disclosure depends on the speed and/or efficiency requirements of the system, the particular function, and the particular software or hardware systems or microprocessor or microcomputer systems being utilized.
In yet another embodiment, the disclosed methods may be partially implemented in software that can be stored on a storage medium and executed on a programmed general-purpose computer with the cooperation of a controller and memory, a special purpose computer, a microprocessor, or the like. In these instances, the systems and methods of this disclosure can be implemented as a program embedded on a personal computer, such as an applet, JAVA® or CGI script, as a resource residing on a server or computer workstation, as a routine embedded in a dedicated measurement system, as a system component, or the like. The system can also be implemented by physically incorporating the system and/or method into a software and/or hardware system.
Although the present disclosure describes components and functions implemented in the aspects, embodiments, and/or configurations with reference to particular standards and protocols, the aspects, embodiments, and/or configurations are not limited to such standards and protocols. Other similar standards and protocols not mentioned herein are in existence and are considered to be included in the present disclosure. Moreover, the standards and protocols mentioned herein, and other similar standards and protocols not mentioned herein, are periodically superseded by faster or more effective equivalents having essentially the same functions. Such replacement standards and protocols having the same functions are considered equivalents included in the present disclosure.
The present disclosure, in various aspects, embodiments, and/or configurations, includes components, methods, processes, systems, and/or apparatus substantially as depicted and described herein, including various aspects, embodiments, configurations, sub-combinations, and/or subsets thereof. Those of skill in the art will understand how to make and use the disclosed aspects, embodiments, and/or configurations after understanding the present disclosure. The present disclosure, in various aspects, embodiments, and/or configurations, includes providing devices and processes in the absence of items not depicted and/or described herein, or in various aspects, embodiments, and/or configurations hereof, including in the absence of such items as may have been used in previous devices or processes, e.g., for improving performance, achieving ease of implementation, and/or reducing cost of implementation.
The foregoing discussion has been presented for purposes of illustration and description. The foregoing is not intended to limit the disclosure to the form or forms disclosed herein. In the foregoing Detailed Description, for example, various features of the disclosure are grouped together in one or more aspects, embodiments, and/or configurations for the purpose of streamlining the disclosure. The features of the aspects, embodiments, and/or configurations of the disclosure may be combined in alternative aspects, embodiments, and/or configurations other than those discussed above. This method of disclosure is not to be interpreted as reflecting an intention that the claims require more features than are expressly recited in each claim. Rather, as the following claims reflect, inventive aspects lie in less than all features of a single foregoing disclosed aspect, embodiment, and/or configuration. Thus, the following claims are hereby incorporated into this Detailed Description, with each claim standing on its own as a separate preferred embodiment of the disclosure.
Moreover, though the description has included description of one or more aspects, embodiments, and/or configurations and certain variations and modifications, other variations, combinations, and modifications are within the scope of the disclosure, e.g., as may be within the skill and knowledge of those in the art after understanding the present disclosure. It is intended to obtain rights which include alternative aspects, embodiments, and/or configurations to the extent permitted, including alternate, interchangeable, and/or equivalent structures, functions, ranges, or steps to those claimed, whether or not such alternate, interchangeable, and/or equivalent structures, functions, ranges, or steps are disclosed herein, and without intending to publicly dedicate any patentable subject matter.

Claims (20)

1. A computer-readable medium having stored thereon computer-executable instructions executable by a processor, the computer-executable instructions causing the processor to perform a method of managing a window stack for a device, the computer-executable instructions comprising:
instructions to receive an orientation change, wherein the orientation change causes the device to transition from a closed state to an open state, wherein, in the closed state, a first display is active, and, in the open state, both the first display and a second display are active;
instructions to determine whether a desktop should be displayed on the second display; and
instructions to, after determining that the desktop should be displayed on the second display, display the desktop on the second display and display a first window on the first display.
2. The computer-readable medium of claim 1, wherein the window stack is a logical arrangement of active and inactive windows for a multi-screen device.
3. The computer-readable medium of claim 2, wherein the window stack provides a representation of the windows or display objects to a user.
4. The computer-readable medium of claim 3, wherein the desktop is a display object.
5. The computer-readable medium of claim 4, wherein the desktop is displayed across a composite display, wherein the composite display occupies substantially all of at least two or more touch-sensitive displays, and wherein the first display and the second display are parts of the composite display.
6. The computer-readable medium of claim 5, further comprising instructions to modify a logical data structure associated with the desktop.
7. The computer-readable medium of claim 6, wherein the logical data structure comprises one or more of:
a window identifier adapted to identify the desktop in relation to other windows in the window stack;
a window stack position identifier adapted to identify a position of the desktop in the window stack; and
a display identifier adapted to identify a composite display of the multi-screen device associated with the desktop.
8. The computer-readable medium of claim 6, wherein a first display of the multi-screen device is associated with a first portion of the window stack, and a second display of the multi-screen device is associated with a second portion of the window stack.
9. The computer-readable medium of claim 8, wherein the logical data structure further comprises a window stack identifier adapted to identify the first portion and the second portion associated with the desktop.
10. A device, comprising:
at least two displays;
a memory;
a processor in communication with the memory and each of the at least two displays, the processor operable to:
provide a composite display, wherein the composite display comprises one touch-sensitive display while in a closed state;
receive an orientation change of the multi-screen device, wherein the orientation change causes the multi-screen device to transition from the closed state to an open state, wherein, in the open state, the composite display comprises a first portion associated with a first touch-sensitive display and a second portion associated with a second touch-sensitive display;
expand a desktop to cover the composite display;
determine that a first window is displayed on the first portion of the composite display and that the desktop is displayed on the second portion of the composite display; and
display the desktop on the second touch-sensitive display and display the first window on the first touch-sensitive display.
11. The device of claim 10, wherein the desktop is modified when transitioning to the open state.
12. The device of claim 11, further comprising modifying a logical data structure associated with the desktop.
13. The device of claim 12, wherein the desktop is associated with both touch-sensitive displays.
14. The device of claim 13, wherein the logical data structure comprises one or more of:
a window identifier adapted to identify the desktop in relation to other windows in the window stack;
a window stack position identifier adapted to identify a top position of the desktop in the window stack for at least the second portion of the composite display; and
at least two display identifiers adapted to identify each of the at least two touch-sensitive displays of the multi-screen device associated with the desktop.
15. The device of claim 14, wherein the desktop is opened at a bottom of the window stack, and wherein the desktop is displayed because no window covers the desktop in at least a portion of the composite display.
16. The device of claim 15, wherein the first window covers at least a portion of the desktop.
17. A method for presenting a display for a multi-screen device, the method comprising:
while the multi-screen device is in a closed state, providing a composite display that spans at least a portion of a first touch-sensitive display;
while the device is in the closed state, displaying a first window at a top of a window stack;
receiving an orientation change of the multi-screen device, wherein the orientation change is a transition from the closed state to an open state;
changing the composite display to span at least a portion of the first touch-sensitive display and a second touch-sensitive display of the multi-screen device, wherein a first portion of the composite display is associated with the first touch-sensitive display and a second portion of the composite display is associated with the second touch-sensitive display;
determining that a desktop is associated with the composite display;
modifying the desktop to expand over the composite display;
determining that the first window is at the top of the window stack in the first portion of the composite display;
determining that the desktop is at the top of the window stack in the second portion of the composite display;
displaying the first window on the first touch-sensitive display; and
displaying the desktop on the second touch-sensitive display.
18. The method of claim 17, wherein the desktop is at a bottom of the window stack.
19. The method of claim 18, wherein the first window is a last window in a window stack comprising two or more windows.
20. The method of claim 19, wherein the first touch-sensitive display is part of a primary screen.
CN201180058017.7A 2010-10-01 2011-09-29 Show desktop when the device is opened Active CN103282955B (en)

Applications Claiming Priority (11)

Application Number Priority Date Filing Date Title
US38900010P 2010-10-01 2010-10-01
US38908710P 2010-10-01 2010-10-01
US38911710P 2010-10-01 2010-10-01
US61/389,087 2010-10-01
US61/389,117 2010-10-01
US61/389,000 2010-10-01
US201161539884P 2011-09-27 2011-09-27
US61/539,884 2011-09-27
PCT/US2011/053953 WO2012044806A2 (en) 2010-10-01 2011-09-29 Displaying the desktop upon device open
US13/248,138 US20120081318A1 (en) 2010-10-01 2011-09-29 Displaying the desktop upon device open
US13/248,138 2011-09-29

Publications (2)

Publication Number Publication Date
CN103282955A true CN103282955A (en) 2013-09-04
CN103282955B CN103282955B (en) 2016-11-30

Family


Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6331840B1 (en) * 1998-03-27 2001-12-18 Kevin W. Nielson Object-drag continuity between discontinuous touch screens of a single virtual desktop
US20030179541A1 (en) * 2002-03-21 2003-09-25 Peter Sullivan Double screen portable computer
CN1757052A (en) * 2003-02-28 2006-04-05 株式会社半导体能源研究所 Display and folding mobile terminal
US20100146464A1 (en) * 2003-03-25 2010-06-10 Microsoft Corporation Architecture For Controlling A Computer Using Hand Gestures
US20090322714A1 (en) * 2006-12-18 2009-12-31 Samsung Electronics Co., Ltd. Method and apparatus for multiscreen management for multiple screen configuration
WO2010028394A1 (en) * 2008-09-08 2010-03-11 Qualcomm Incorporated Multi-fold mobile device with configurable interface
US20100182265A1 (en) * 2009-01-09 2010-07-22 Samsung Electronics Co., Ltd. Mobile terminal having foldable display and operation method for the same
US20100188371A1 (en) * 2009-01-27 2010-07-29 Research In Motion Limited Handheld electronic device having a touchscreen and a method of using a touchscreen of a handheld electronic device

Cited By (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107113468B (en) * 2015-01-07 2019-12-31 微软技术许可有限责任公司 Mobile computing equipment, implementation method and computer storage medium
CN107113468A (en) * 2015-01-07 2017-08-29 微软技术许可有限责任公司 Automatic main screen based on display device is determined
US10956008B2 (en) 2015-01-07 2021-03-23 Microsoft Technology Licensing, Llc Automatic home screen determination based on display device
CN106409249A (en) * 2015-07-31 2017-02-15 乐金显示有限公司 Display panel and multi display device using the same
CN106409249B (en) * 2015-07-31 2020-04-24 乐金显示有限公司 Display panel and multi-screen display device using same
US10768883B2 (en) 2015-07-31 2020-09-08 Lg Display Co., Ltd. Display panel and multi display device having at least display panels including a transparent region in a non-display region
CN105739758A (en) * 2016-01-21 2016-07-06 广州市莱麦互联网科技有限公司 Display control method and device of Android device
CN106250017A (en) * 2016-07-26 2016-12-21 努比亚技术有限公司 A kind of mobile terminal and multitask management process
CN108427589A (en) * 2018-02-01 2018-08-21 联想(北京)有限公司 A kind of data processing method and electronic equipment
CN108427589B (en) * 2018-02-01 2023-07-21 联想(北京)有限公司 Data processing method and electronic equipment
CN110058828A (en) * 2019-04-01 2019-07-26 Oppo广东移动通信有限公司 Application program display methods, device, electronic equipment and storage medium
CN110058828B (en) * 2019-04-01 2022-06-21 Oppo广东移动通信有限公司 Application program display method and device, electronic equipment and storage medium
TWI800204B (en) * 2021-04-15 2023-04-21 瑞鼎科技股份有限公司 Dual-screen device and dual-screen picture alignment method

Also Published As

Publication number Publication date
EP2622597A4 (en) 2015-03-18
JP6073793B2 (en) 2017-02-01
EP2622597A2 (en) 2013-08-07
JP2013546049A (en) 2013-12-26

Similar Documents

Publication Publication Date Title
CN103348311A (en) Long drag gesture in user interface
US10831358B2 (en) Email client display transitions between portrait and landscape
CN103262010A (en) Desktop reveal by moving a logical display stack with gestures
CN103076967B (en) Change bifocal method and double screen communication equipment for responding gesture
US9733665B2 (en) Windows position control for phone applications
US9189018B2 (en) Windows position control for phone applications
CN103329061A (en) Method and system for viewing stacked screen displays using gestures
US20160044152A1 (en) Windows position control for phone applications
US20120225693A1 (en) Windows position control for phone applications
US20120225694A1 (en) Windows position control for phone applications
US20120218302A1 (en) Windows position control for phone applications
US20120220340A1 (en) Windows position control for phone applications
US20120220341A1 (en) Windows position control for phone applications
CN103282955A (en) Displaying the desktop upon device open
CN103282955B (en) Show desktop when the device is opened

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant